[Screenshots: Tegra tablet, Samsung Galaxy S3, PC version, and PS3 version.]

Team Project - RollingWave - PC / PS3 / ANDROID.


RollingWave - One of the modules on the MSc Game Engineering course at Newcastle University was to develop a game in a group, based on Katamari Damacy, on three platforms - PC, PS3, and Android - in 5 weeks. It was built on a cross-compiled graphics engine, game physics engine, and audio system, written from the ground up using C++, OpenGL, OpenAL, and CUDA. The main objective of the project was for all three platforms to share the same code base and cross-compile.

ANDROID

I was in charge of the Android version of the game. It was written entirely in C++ using the NDK toolkit and was targeted at NVIDIA Tegra devices using the alpha version of NVIDIA Nsight Tegra, a plugin for Microsoft Visual Studio. Although the main objective of cross-compilation and code sharing among the three platforms was achieved, there was a major roadblock: using the NDK and the NVIDIA Tegra plugin, textures could not be loaded and used in the program. At that point in time, the toolchain didn't support loading textures and storing them as a GLuint via any third-party library such as SOIL. Hence, a toned-down version of the game, with the main focus on cross-compatibility, was produced.

PC

A full-fledged 5-level game based on Katamari Damacy was produced using C++, OpenGL, and OpenAL. It also featured a CUDA-based particle system and support for the Microsoft Xbox controller.

PS3

A 4-player multiplayer version of the game was produced, again utilizing the same code base and maintaining code compatibility.

Android - Description:

The task was to create an Android application in pure C++ so that the same code could be used across the three platforms. As we were asked to develop for NVIDIA Tegra devices, I used the NVIDIA Tegra Visual Studio plugin (https://developer.nvidia.com/content/nsight-tegra-visual-studio-edition-111-available), which was fairly nascent at the time of the project.


The following sections describe the various aspects of developing the application: creating an OpenGL ES 2.0 context and using JNI/NDK to run native code.


Render Engine:

The main aspect of the rendering engine was the handling of Android lifecycle events and the interaction between Java and native code. There are two basic parts to the Java/native JNI interaction. Most of it is handled by the Android support API, NativeActivity, which is in turn wrapped away from the application by NVIDIA's version of the Google sample framework, nv_native_app_glue. It takes care of event handling and forwards all the lifecycle events down to the native code of the app's process.
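
As a rough illustration of what that glue layer provides, here is a minimal sketch of a native event loop built on Google's stock android_native_app_glue; NVIDIA's nv_native_app_glue wraps the same machinery, and the handler names and bodies here are placeholders rather than the project's actual code:

#include <android_native_app_glue.h>

// Lifecycle commands (window created/destroyed, focus changes, etc.) arrive here.
static void handleCmd(struct android_app* app, int32_t cmd)
{
	switch (cmd)
	{
	case APP_CMD_INIT_WINDOW:
		// Window is ready: create the EGL surface/context here.
		break;
	case APP_CMD_TERM_WINDOW:
		// Window is going away: tear down the EGL surface.
		break;
	}
}

// Input events are forwarded here; the real app routes them on to Engine::handleInput.
static int32_t handleInput(struct android_app* app, AInputEvent* event)
{
	return 0; // 0 = event not consumed
}

void android_main(struct android_app* app)
{
	app->onAppCmd = handleCmd;
	app->onInputEvent = handleInput;

	while (true)
	{
		int events;
		struct android_poll_source* source;

		// Drain all pending lifecycle/input events, then render a frame.
		while (ALooper_pollAll(0, NULL, &events, (void**)&source) >= 0)
		{
			if (source != NULL)
				source->process(app, source);
			if (app->destroyRequested)
				return;
		}
		// Per-frame update and draw would go here.
	}
}

The EGL setup itself was wrapped in a small utility class: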

NvEGLUtil* NvEGLUtil::create(ConfigChooser chooser)
{
	NvEGLUtil* thiz = new NvEGLUtil;

	// Step 1: get the default EGL display connection.
	thiz->m_display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
	if (thiz->m_display != EGL_NO_DISPLAY)
	{
		EGL_STATUS_LOG("eglGetDisplay");
	}
	else
	{
		EGL_ERROR_LOG("eglGetDisplay");
		delete thiz;
		return NULL;
	}

	// Step 2: initialize EGL on that display (version numbers not needed).
	if (eglInitialize(thiz->m_display, 0, 0))
	{
		EGL_STATUS_LOG("eglInitialize");
	}
	else
	{
		EGL_ERROR_LOG("eglInitialize");
		delete thiz;
		return NULL;
	}

	// Step 3: let the caller-supplied chooser pick an EGLConfig.
	if (chooser(thiz->m_display, thiz->m_config))
	{
		EGL_STATUS_LOG("Config chooser");
	}
	else
	{
		EGL_ERROR_LOG("Config chooser");
		delete thiz;
		return NULL;
	}

	thiz->m_status = NV_INITIALIZED;

	LOGD("****  EGL display initialized and config chosen; returning the pointer.");
	return thiz;
}
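
For context, the chooser argument is a callable that picks an EGLConfig for the display. A hypothetical GLES 2.0 chooser matching the call site above (the attribute list and function name are assumptions, not the project's actual code) might look like:

#include <EGL/egl.h>

static bool chooseGLES2Config(EGLDisplay display, EGLConfig& config)
{
	// Request a window-renderable RGB888 config with a GLES 2.0 context.
	const EGLint attribs[] = {
		EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
		EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
		EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
		EGL_DEPTH_SIZE, 16,
		EGL_NONE
	};
	EGLint count = 0;
	return eglChooseConfig(display, attribs, &config, 1, &count) == EGL_TRUE && count > 0;
}

// Usage:
NvEGLUtil* egl = NvEGLUtil::create(chooseGLES2Config);
if (!egl) { /* EGL setup failed */ }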

Libs and Linking:

Each library is its own module in the Android NDK build system, with its own static library module and a unique name. The application imports the various libraries via its Android.mk file, as in the snippet after this list:

  1. Add the library name to LOCAL_STATIC_LIBRARIES.
  2. Add the path to the libs/jni tree as an import library path.
  3. Import the library module itself.

    LOCAL_LDLIBS := -lstdc++ -lc -lm -llog -landroid -ldl -lGLESv2 -lEGL
    LOCAL_STATIC_LIBRARIES := nv_and_util nv_egl_util nv_bitfont nv_math nv_glesutil nv_hhdds nv_log nv_shader nv_file nv_thread SOIL2Android nclgl-Android
    ...
    $(call import-add-path, libs/jni)
    ...
    $(call import-module,nclgl-Android)
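
For the import to resolve, each imported module needs its own Android.mk under the import path. A minimal sketch for the nclgl-Android module (the path and file selection are assumptions for illustration) would look something like:

    # libs/jni/nclgl-Android/Android.mk
    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE     := nclgl-Android
    # Build every .cpp in the module directory (the actual file list may differ).
    LOCAL_SRC_FILES  := $(subst $(LOCAL_PATH)/,,$(wildcard $(LOCAL_PATH)/*.cpp))
    LOCAL_C_INCLUDES := $(LOCAL_PATH)
    include $(BUILD_STATIC_LIBRARY)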

Why no textures, and problems with mesh loading:

I had been using the SOIL library (http://www.lonesock.net/soil.html) to load textures and images and store them in a GLuint for OpenGL applications, but when used with the Tegra framework it failed to load or read the files. I also tried building SOIL as a separate static library on top of the Tegra framework, but it still didn't load any files properly.


The same issue persisted with mesh loading: I couldn't get the framework to read any of the files properly, and even when it did, something along the way wasn't working. With only 6 weeks for the project to cross-compile the whole code base, it was decided that the Android version would be made without any textures. In order to load a mesh (in this case, a ball), the OBJ file was imported into Blender and exported as a C header file (with the help of a script), and the result was buffered into VBOs, as shown below.


unsigned int vertex_count[] = { 240 };

struct vertex_struct {
	float x, y, z;    /* position */
	float nx, ny, nz; /* normal */
	float u, v;       /* texture coordinates */
};

struct vertex_struct Mvertices[] = {
	/* ICO: 240 vertices */
	{ 0.262929f, -0.525738f, 0.809005f, 0.471318f, -0.661688f, 0.583119f, 0.000000f, 0.000000f },
	{ 0.425382f, -0.850654f, 0.309004f, 0.471318f, -0.661688f, 0.583119f, 0.000000f, 0.000000f },
	...
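
A minimal sketch of how such a generated header can be buffered into a VBO on GLES 2.0 follows; the attribute locations and function name are assumptions for illustration, not the project's actual code:

#include <GLES2/gl2.h>
#include <cstddef> // offsetof

static GLuint uploadBallMesh()
{
	GLuint vbo = 0;
	glGenBuffers(1, &vbo);
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	glBufferData(GL_ARRAY_BUFFER, vertex_count[0] * sizeof(vertex_struct),
	             Mvertices, GL_STATIC_DRAW);

	// Interleaved layout: position, normal, UV. Locations 0/1/2 are assumed;
	// query them with glGetAttribLocation against the linked program in real code.
	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(vertex_struct),
	                      (void*)offsetof(vertex_struct, x));
	glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(vertex_struct),
	                      (void*)offsetof(vertex_struct, nx));
	glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, sizeof(vertex_struct),
	                      (void*)offsetof(vertex_struct, u));
	glEnableVertexAttribArray(0);
	glEnableVertexAttribArray(1);
	glEnableVertexAttribArray(2);

	return vbo; // draw later with glDrawArrays(GL_TRIANGLES, 0, vertex_count[0])
}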

Camera Control:

With the help of the "handleInput" function in the engine and the MotionEvent handling of Google's app_glue framework, the screen was divided into regions to provide simple camera and movement controls.

int Engine::handleInput(AInputEvent* event)
{

	int32_t eventType = AInputEvent_getType(event);

	if (eventType == AINPUT_EVENT_TYPE_MOTION)
	{	
		int32_t action = AMOTION_EVENT_ACTION_MASK & AMotionEvent_getAction((const AInputEvent*)event);
		int32_t actionUnmasked = AMotionEvent_getAction(event);

		float pX = AMotionEvent_getX(event, 0);
		float pY = AMotionEvent_getY(event, 0);
		/*Debug */
		//LOGD("X COORD : %f", pX);
		//LOGD("Y COORD : %f", pY);

		/* Camera toggle: a 200px-wide band at the top centre of the screen.
		   (The original comparisons were inverted and could never be true.) */
		if((pX > (mEgl.getWidth()/2 - 100)) && (pX < (mEgl.getWidth()/2 + 100)) && (pY < 100)){

			triangleApp->toggleCamera = !(triangleApp->toggleCamera);
			LOGD("TOUCH");
			//triangleApp->GetPhysicsSystem().GetCamera()->SetPosition(Vector3(2065, 2170, 1897));
			//triangleApp->GetPhysicsSystem().GetCamera()->SetYaw(1.05);
			//triangleApp->GetPhysicsSystem().GetCamera()->SetPitch(-26);


		}

		/*Split To Left and Right Screen*/
		/*First Block - Move camera up and down. */
		if (pY < 200) {
			if (pX > (mEgl.getWidth()/2)) {
				triangleApp->camera->SetPosition(triangleApp->camera->GetPosition() + Vector3 (0,10,0));
				triangleApp->viewMatrix = triangleApp->camera->BuildViewMatrix();		
				return 0;
			} else {
				triangleApp->camera->SetPosition(triangleApp->camera->GetPosition() - Vector3 (0,10,0));
				triangleApp->viewMatrix = triangleApp->camera->BuildViewMatrix();		
				return 0;
			}
		/*Panning 360 - Right Side of the screen.*/
		} else if (pX > (mEgl.getWidth()/2)) {
			if (action == AMOTION_EVENT_ACTION_DOWN) {
				LOGD("Engine : View Started");
				camYaw = pX;
				camPitch = pY;				
				return 0; 
			} else {
				float movedX = pX - camYaw;
				float movedY = pY - camPitch;
				movedX = movedX / 50;
				movedY = movedY / 50;
...
...

Shaders (GLSL ES):

The Tegra device supports OpenGL ES 2.0 and its shading language, GLSL ES. Essentially a subset of desktop GLSL, GLSL ES removes all the fixed-function language constructs, as well as constructs for GL features that are not part of the OpenGL ES 2.0 core, such as 1D and 3D textures.


Loading Shaders - Shaders are loaded using the standard OpenGL ES functions: glShaderSource, glCompileShader, and glLinkProgram.
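
A minimal sketch of that load/compile/link path, with an illustrative GLSL ES shader pair embedded as source strings (not the project's actual shaders), might look like:

#include <GLES2/gl2.h>
#include <android/log.h>

static const char* vertSrc =
	"attribute vec3 aPosition;\n"
	"uniform mat4 uMVP;\n"
	"void main() { gl_Position = uMVP * vec4(aPosition, 1.0); }\n";

static const char* fragSrc =
	"precision mediump float;\n" // GLSL ES requires a default float precision
	"void main() { gl_FragColor = vec4(1.0, 0.5, 0.2, 1.0); }\n";

static GLuint compileShader(GLenum type, const char* src)
{
	GLuint shader = glCreateShader(type);
	glShaderSource(shader, 1, &src, NULL);
	glCompileShader(shader);

	GLint ok = GL_FALSE;
	glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
	if (!ok)
	{
		char log[512];
		glGetShaderInfoLog(shader, sizeof(log), NULL, log);
		__android_log_print(ANDROID_LOG_ERROR, "Shader", "%s", log);
		glDeleteShader(shader);
		return 0;
	}
	return shader;
}

static GLuint linkShaderProgram()
{
	GLuint vs = compileShader(GL_VERTEX_SHADER, vertSrc);
	GLuint fs = compileShader(GL_FRAGMENT_SHADER, fragSrc);

	GLuint prog = glCreateProgram();
	glAttachShader(prog, vs);
	glAttachShader(prog, fs);
	glLinkProgram(prog);

	GLint ok = GL_FALSE;
	glGetProgramiv(prog, GL_LINK_STATUS, &ok);
	return ok ? prog : 0; // 0 on failure; check the program info log in real code
}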