Import the native-activity application project from the first lab archive.
Edit AndroidManifest.xml. Verify that your activity is declared.
<activity android:name="android.app.NativeActivity"
          android:configChanges="orientation|keyboardHidden"
          android:label="@string/app_name" >
    <meta-data android:name="android.app.lib_name"
               android:value="library_name" />
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>
Change library_name to match the name of your shared library, without the lib prefix and the .so suffix. Make sure the application tag has android:hasCode="false" and that minSdkVersion is at least 10.
Build and launch the application. It should start, print the log, and after a few seconds tell you the application is not responding. This happens because no callbacks are defined for the activity.
A native activity has a main function and requires the <android/native_activity.h> header.
void ANativeActivity_onCreate(ANativeActivity* activity, void* savedState, size_t savedStateSize)
However, we will use native app glue, as it already implements all the callbacks, making it easier to write an application.
If you want to create a native application you have to register all the callbacks (onStart, onStop, onDestroy, onCreateInputQueue, etc.) and handle the events properly. This tends to add a lot of code, much of it duplicated across applications. Since SDK 10 the NDK ships a library called native app glue, which does all of this and exposes the events through two callbacks instead. It also adds multithreading, separating input handling from life-cycle events.
The activity still needs an entry point:
void android_main(struct android_app* app) {
    app_dummy(); // Make sure glue isn't stripped.
    app->userData = NULL;
    app->onAppCmd = handle_activity_lifecycle_events;
    app->onInputEvent = handle_activity_input_events;
    while (1) {
        int ident, events;
        struct android_poll_source* source;
        // Block until an event arrives (-1 timeout), then dispatch it.
        if ((ident = ALooper_pollAll(-1, NULL, &events, (void**)&source)) >= 0) {
            if (source != NULL) {
                source->process(app, source);
            }
        }
    }
}
Notice that the app defines two callbacks, one for life-cycle events and the other for input events, and then goes into a loop waiting for events and dispatching them to the corresponding function. You can look over the app glue code in $NDK/sources/android/native_app_glue.
Add the following handler functions:
void handle_activity_lifecycle_events(struct android_app* app, int32_t cmd) {
    __android_log_print(ANDROID_LOG_INFO, "Native", "%d: dummy data %p",
                        cmd, ((int*)(app->userData)));
}

int32_t handle_activity_input_events(struct android_app* app, AInputEvent* event) {
    if (AInputEvent_getType(event) == AINPUT_EVENT_TYPE_MOTION) {
        __android_log_print(ANDROID_LOG_INFO, "Native", "%d %d: dummy data %p",
                            AInputEvent_getType(event),
                            AMotionEvent_getAction(event),
                            ((int*)(app->userData)));
        return 1;
    }
    return 0;
}
Check that the activity runs and displays callbacks in the logs.
Native activities allow you to manage the display buffer directly, although this can be problematic since there are no built-in ways of creating user interfaces, as there are with Java code. To initialize the window:
ANativeWindow* lWindow = app->window;
// Set the format of the window; 0 keeps the default width and height.
ANativeWindow_setBuffersGeometry(lWindow, 0, 0, WINDOW_FORMAT_RGBA_8888);
To get the buffer you will need to call the following functions:
ANativeWindow_Buffer lWindowBuffer;
ANativeWindow* lWindow = app->window;
// Acquire the window buffer.
if (ANativeWindow_lock(lWindow, &lWindowBuffer, NULL) < 0) {
    return;
}
// Clear the buffer.
memset(lWindowBuffer.bits, 0,
       lWindowBuffer.stride * lWindowBuffer.height * sizeof(uint32_t));
Afterwards, you can use the buffer to draw a square:
int sqh = 400, sqw = 400;
// Note: stride is the buffer row length in pixels and may exceed the
// visible width, so centering on stride is only approximate.
int wst = lWindowBuffer.stride/2 - sqw/2;
int wed = wst + sqw;
int hst = lWindowBuffer.height/2 - sqh/2;
int hed = hst + sqh;
for (int i = hst; i < hed; ++i) {
    for (int j = wst; j < wed; ++j) {
        char* pixel = (char*)(lWindowBuffer.bits)
                      + (i * lWindowBuffer.stride + j) * sizeof(uint32_t);
        pixel[0] = (char)40;   // R
        pixel[1] = (char)191;  // G
        pixel[2] = (char)140;  // B
        pixel[3] = (char)255;  // A
    }
}
Finally, you have to unlock the buffer for it to be displayed:
ANativeWindow_unlockAndPost(lWindow);
Group these pieces into a function. Call it on demand: in the activity life-cycle handler, check for APP_CMD_INIT_WINDOW to initialize and draw the window, and for APP_CMD_WINDOW_REDRAW_NEEDED to only redraw it.
Import the opengl-native project from the second lab archive.
Look over the code. As with an Android Java application, you first create an EGL context to draw into and then call the drawing functions. Unlike the Java version, there is no renderer class, so you can call the draw functions at any time; in this case they are called in a loop, after processing events. It is usually advisable to limit the frame rate to 30 or 60 frames per second, since otherwise the application may drain a lot of battery. The rest of the code is identical to the code used in the previous session.
To access a touch event you have to add code to the input handler function. Use AInputEvent_getType(event) to get the type of the event; what you want in this case are events of type AINPUT_EVENT_TYPE_MOTION. To get the type of motion event, use AMotionEvent_getAction(event), and look for move events, AMOTION_EVENT_ACTION_MOVE. Store the position of the event:
AMotionEvent_getX(event, 0);
AMotionEvent_getY(event, 0);
The last parameter is the pointer index. On multi-touch displays you can get more than one pointer, and for some gestures this is important.
Use the stored positions to move the camera. Look for ry and rz in the renderFrame() function.
To use the sensors you must first get an instance of the sensor manager, then ask for the type of sensor you want to use, and finally define a way to identify the events, potentially adding a callback. For example, for an accelerometer:
sensorManager = ASensorManager_getInstance();
accelerometerSensor = ASensorManager_getDefaultSensor(sensorManager,
                                                      ASENSOR_TYPE_ACCELEROMETER);
sensorEventQueue = ASensorManager_createEventQueue(sensorManager, looper,
                                                   LOOPER_ID_USER, NULL, NULL);
ASensorEventQueue_enableSensor(sensorEventQueue, accelerometerSensor);
// The event rate is given in microseconds between events.
ASensorEventQueue_setEventRate(sensorEventQueue, accelerometerSensor, 500);
The android_app structure contains a looper field, which you can use here. LOOPER_ID_USER is the first identifier available when using native app glue; identifiers smaller than LOOPER_ID_USER are used internally by the library.
To get an accelerometer event, after processing the event sources, check the identifier returned by ALooper_pollAll (or ALooper_pollOnce); for sensor events the source can be NULL, so rely on the identifier instead. It should match the identifier you set in ASensorManager_createEventQueue. If it matches, use:
ASensorEventQueue_getEvents(sensorEventQueue, event, num_events)
event should be a pointer to an ASensorEvent structure if num_events is 1, or to an ASensorEvent array if num_events is greater than 1. ASensorEventQueue_getEvents returns the number of events it actually read. Pass this function a reference to the event. The data read from the accelerometer will be in the acceleration vector inside the event (hint: acceleration). Use this data to rotate the camera in the rx direction.