The sample apps are both specifically designed and written to demonstrate correct Android Activity lifecycle behavior; this is their primary focus.
The Globe application is a graphics-intensive 3D application that demonstrates common idioms for high-performance application development.
Many more details on this sample are provided in the guide Sample: Tegra Android Native Globe Application, which is included in this Samples Pack.
native_basic is a much simpler application than Globe; it handles most of the same application lifecycle cases, but its UI flow consists of only a “running” and an “auto-paused” mode. It is the basis for most of the feature demos, and is also the application generated by the app_create tool when using the basic template.
The feature demos are small applications that each demonstrate a particular Android, game development, or Tegra feature in isolation. These applications may or may not be fully compliant with the Android application lifecycle (although the intention is that most behave well). There are several categories of these demos, each grouped under a topic heading.
The dynamic resolution sample shows how an application can use Framebuffer Objects (FBOs) or Android native window functions to create a 3D target rendering surface that is smaller than the native screen size. This allows an application to decouple its rendering resolution from rising screen densities while still rendering to the full screen.
The native subclass sample shows that while developers can create Android applications using only native code, they can still use Java-only Android features by subclassing NativeActivity and adding the code needed for the specific feature. The demo uses the Android menu button to trigger one of a pair of JNI calls from native code up to Java member methods in a subclass of NativeActivity. These Java methods show an Android UI “toast” notification or launch a browser window to a URL. The JNI in this case is extremely simple, with most of the work done in Java code in the subclassed activity. The subclass can also post work to the UI thread (since native_app_glue runs in a secondary thread) and use Android UI elements. This sample also forms the basis of the app_create script’s subclass template.
The simple JNI sample shows how to access basic Java classes and methods purely from native code. This includes querying class APIs, creating Java objects, invoking methods, and converting C strings to Java strings. The result is that pressing the Android menu button in this app causes JNI code to launch the Android web browser and load the NVIDIA developer site.
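The steps above follow the standard JNI call sequence. A hedged sketch (not the sample's actual code; the method name openUrl is a placeholder): the JNIEnv would come from AttachCurrentThread() on the native thread, and the activity object from the ANativeActivity clazz reference:

```c
#include <jni.h>

/* Sketch: call a hypothetical Java method void openUrl(String) on the
 * activity object from native code. Runs only inside an Android process. */
static void call_java_open_url(JNIEnv *env, jobject activity, const char *url)
{
    /* 1. Query the class of the activity object. */
    jclass cls = (*env)->GetObjectClass(env, activity);

    /* 2. Look up the Java method by name and signature. */
    jmethodID mid = (*env)->GetMethodID(env, cls, "openUrl",
                                        "(Ljava/lang/String;)V");
    if (mid == NULL) {
        (*env)->ExceptionClear(env);   /* method not found */
        return;
    }

    /* 3. Convert the C string to a Java string. */
    jstring jurl = (*env)->NewStringUTF(env, url);

    /* 4. Invoke the method on the activity instance. */
    (*env)->CallVoidMethod(env, activity, mid, jurl);

    /* 5. Release the local reference. */
    (*env)->DeleteLocalRef(env, jurl);
}
```

The method signature string ("(Ljava/lang/String;)V") must match the Java declaration exactly; a mismatch makes GetMethodID return NULL with a pending exception.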
The native accelerometer sample shows several important sensor-related behaviors, allowing the app to natively match the current device orientation while visually displaying the current accelerometer values.
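Matching the device orientation requires remapping the raw accelerometer axes into the current screen coordinate frame. A minimal sketch, assuming a hypothetical helper and the rotation value reported by the Java Display.getRotation() constants (ROTATION_0 through ROTATION_270); the sign convention here is an assumption, not taken from the sample:

```c
/* Hypothetical helper: remap raw accelerometer x/y readings into the
 * current screen's coordinate frame. 'rotation' counts 90-degree steps
 * counterclockwise from the device's natural orientation (0..3). */
static void remap_accel(int rotation, float ax, float ay,
                        float *sx, float *sy)
{
    switch (rotation & 3) {
    case 0: *sx =  ax; *sy =  ay; break;   /* natural orientation */
    case 1: *sx = -ay; *sy =  ax; break;   /* rotated 90 degrees  */
    case 2: *sx = -ax; *sy = -ay; break;   /* rotated 180 degrees */
    case 3: *sx =  ay; *sy = -ax; break;   /* rotated 270 degrees */
    }
}
```

Without this remap, tilting the device "right" moves content the wrong way whenever the activity is displayed in a non-natural orientation.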
|Note: Game controllers are supported only on Honeycomb 3.1 and newer. Running this sample on Gingerbread devices causes the application to display a text warning that the OS does not support game controllers. This code is specifically designed to show one way of detecting game controller support, allowing it to be optional.|
The native game controller sample shows how to discover (at the Java level) the available game controllers, buttons, and axes, and how to handle gamepad buttons and axes in native code. The NVIDIA support library nv_input is used to query a non-NDK Android function to add native support for analog joystick axes.
The application assumes that a USB game controller such as a PS3 SixAxis or Logitech WingMan has been attached to the devkit, tablet, or phone. The Tegra-supported game controllers are listed at http://www.tegrazone.com/support/game-controller-support. The sample shows the current values of most axes and buttons visually. This sample only works on the Honeycomb and Ice Cream Sandwich OSes.
Native multitouch demonstrates how to track multiple simultaneous touch points in native code. The code is able to track and display persistent touch indices that show how fingers are tracked over the course of a gesture. Every touch point includes a crosshair location indicator and a superscripted touch ID number.
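Persistent touch indices can be kept by mapping Android pointer IDs (stable for the life of a gesture, but reusable afterward) to fixed display slots. A minimal sketch, assuming a hypothetical tracker structure rather than the sample's actual code:

```c
#define MAX_TOUCHES 10

/* Hypothetical tracker: maps pointer IDs to persistent display slots,
 * so each finger keeps its index for as long as it stays down. */
typedef struct {
    int ids[MAX_TOUCHES];   /* pointer id held by each slot, -1 if free */
} touch_tracker;

static void tracker_init(touch_tracker *t)
{
    for (int i = 0; i < MAX_TOUCHES; i++)
        t->ids[i] = -1;
}

/* Pointer went down: claim the first free slot; returns slot or -1. */
static int tracker_down(touch_tracker *t, int pointer_id)
{
    for (int i = 0; i < MAX_TOUCHES; i++)
        if (t->ids[i] == -1) { t->ids[i] = pointer_id; return i; }
    return -1;
}

/* Find the slot currently holding a pointer id; -1 if unknown. */
static int tracker_find(const touch_tracker *t, int pointer_id)
{
    for (int i = 0; i < MAX_TOUCHES; i++)
        if (t->ids[i] == pointer_id) return i;
    return -1;
}

/* Pointer went up: release its slot for reuse. */
static void tracker_up(touch_tracker *t, int pointer_id)
{
    int slot = tracker_find(t, pointer_id);
    if (slot >= 0) t->ids[slot] = -1;
}
```

The slot index, not the raw pointer ID, is what gets superscripted next to each crosshair, so a finger's number does not change mid-gesture even as other fingers go down and up.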
NVIDIA® GameWorks™ Documentation Rev. 1.0.200605 ©2014-2020. NVIDIA Corporation. All Rights Reserved.