If your application supports only controllers for input but can be installed on touch devices (i.e., not a TV set-top box scenario), make sure the user knows this up front in the application’s listing. The notice should be bold and immediate. There should also be a reminder when the application loads if no controller is connected. Keep in mind that the user may not have their controller with them, so allow all the system keys to function (e.g., Home, Back).
The sections below discuss controller disconnects and how to handle them from an engineering perspective; the user experience should also be considered.
It may seem that disconnects are rare, since we generally think in terms of a tablet or phone and a controller, a scenario where the user is actively using the device. This thinking ignores an important category, the micro-console, where the user is more likely to leave the controller inactive for a time, allowing it to disconnect from the device. In the tablet, phone, and micro-console cases, the user experience should be the same.
Any time a disconnect is detected, the game should automatically pause and wait for user input as the application requires. Something as simple as, “Please reconnect your controller,” or, in a multiplayer case, “Player 1 please reconnect your controller,” followed by “Player 2” and so on, as needed. After reconnection, the player should remain in the paused state, be allowed to un-pause, and continue.
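The pause-on-disconnect flow above can be sketched as a small state machine. This is a minimal illustration, not any NVIDIA or Android API: `DisconnectHandler`, `onControllerDisconnected`, and the rest are hypothetical names your game loop would wire to the platform’s disconnect callbacks.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: pause immediately on disconnect, queue a per-player
// reconnect message, and stay paused after reconnection until the player
// chooses to un-pause.
public class DisconnectHandler {
    public enum GameState { RUNNING, PAUSED }

    private GameState state = GameState.RUNNING;
    private final List<String> pendingReconnects = new ArrayList<>();

    // Called when the platform reports a controller disconnect.
    public void onControllerDisconnected(int playerIndex) {
        state = GameState.PAUSED; // pause the moment the disconnect is seen
        pendingReconnects.add("Player " + playerIndex
                + " please reconnect your controller");
    }

    // Called when the controller returns; the game remains paused so the
    // player can un-pause and continue when ready.
    public void onControllerReconnected(int playerIndex) {
        pendingReconnects.removeIf(
                m -> m.startsWith("Player " + playerIndex + " "));
    }

    public GameState getState() { return state; }
    public List<String> getPendingMessages() { return pendingReconnects; }
}
```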
It is impossible to account for every controller out there in one simple specification. NVIDIA feels that the above specification, which also incorporates the Android specification, balances simplicity with broad support.
For the absolute broadest, future-proof support, you could add a Controller Setup Screen. This screen, usually placed in your Options area, would list all application actions, such as jump, select, next page, and shoot. The user could then select a listed action and assign a controller input to it. For instance, the user selects the shoot action; the game waits for input from the controller; the user then presses A, which assigns A to shoot.
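At its core, such a setup screen maintains a remapping table from key codes to action names. The sketch below is illustrative only; on Android the key codes would come from `KeyEvent` constants (for example, `KEYCODE_BUTTON_A` is 96), and the class and method names are made up for this example.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical remapping table behind a Controller Setup Screen.
public class ControllerBindings {
    private final Map<Integer, String> keyToAction = new HashMap<>();

    // Called after the user selects an action on the setup screen and the
    // game receives the next button press: that key code is bound to the
    // selected action.
    public void assign(int keyCode, String action) {
        keyToAction.put(keyCode, action);
    }

    // Looked up by the input handler each time a button event arrives.
    public String actionFor(int keyCode) {
        return keyToAction.getOrDefault(keyCode, "unbound");
    }
}
```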
With wide adoption of the Android specification, adding a setup screen should not be considered essential when using the Android APIs or Unity v4.3 or greater.
In general, specific controller tuning should not be used. If there is some controller to which you’d like to tune your application’s experience (for instance, one with six action buttons), this can be done by using the controller’s device name. To get the controller’s name, call Input.GetJoystickNames() in Unity, or InputDevice.getName() when using the Android APIs. The name can then be compared against known names, and the application can branch to the tuned experience. Most applications do not, and should not, tune to a specific controller.
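The branching described above amounts to a name check against a known list. In this sketch, the name string is assumed to have come from `InputDevice.getName()` (or `Input.GetJoystickNames()` in Unity); “SixButtonPad” is a made-up device name used only for illustration.

```java
// Hypothetical profile selection based on the controller's reported name.
public class ControllerProfile {
    public static int actionButtonCount(String deviceName) {
        if (deviceName != null && deviceName.contains("SixButtonPad")) {
            return 6; // tuned experience for this (made-up) six-button pad
        }
        return 4;     // generalized handling for everything else
    }
}
```

Note that the default branch is the one nearly every device should hit; the tuned branch exists only for the specific, tested exception.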
There is one instance where tuning to a specific controller is warranted. Take the most common addition to a controller, a mouse or mouse-like input: in our experience, the quality of that input can vary greatly from one device to another. If your application will use a mouse (or any additional input type), consider defaulting to a generalized mouse input handler, adding specifically-tuned handlers per input device as needed. Full support of those other inputs is beyond the scope of this document; unless your application is enhanced by, for instance, mouse input, we recommend ignoring those events.
As in the specification notes above, if a MotionEvent is received with values in either AXIS_X or AXIS_Y and the application indicates to Android that the event has not been handled (by returning false), an equivalent DPAD event will be generated.
Every controller is different: some are very high precision, some are low precision, and some even have filters that pull axis values toward the orthogonal. The list of differences is unending. Given all these differences, we recommend not relying on the automatic DPAD generation that Android offers; instead, use the axes directly if possible. This gives you the ability to tune the user experience on the left stick (LS).
One important aspect of tuning the LS for “DPAD-like” movement is deciding when the application registers a particular U, D, L, or R value. Consider a user in the Main Menu of your application, flicking the LS to move focus through the menu, and consider a tight area around the LS’s dead zone when the stick is in its neutral position.
Tests using a high-precision controller showed that “flicks” in any one direction, using that tight area as the deciding factor for UI focus movement, produced unreliable results for the user. In short, over that short a stick travel, the user tends to push slightly up during left-to-right movement before settling into a clean left or right. Similar results show up when moving the stick up and down. However, if you loosen the “deciding” area, results are more in line with the user’s expectations.
Testing with low-precision and filtered controllers did not show this issue. Since these controllers lack the fine granularity of a high-precision controller, especially in that tight a space, they did not exhibit the same problem.
This loosening of the “deciding” area should only be used when “DPAD-like” movement is needed on the analog sticks. Take care not to extend the area too far in the gameplay portions of your application, since doing so can feel sluggish. Use the minimal distance necessary.
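The deciding-area approach above can be sketched as follows. The direction is registered only once the stick travels past a deciding radius, and the dominant axis wins, which avoids the spurious up/down reads near the dead zone. The threshold values here are illustrative assumptions, not NVIDIA recommendations, and the class and method names are made up for this example.

```java
// Hypothetical "DPAD-like" decision on the left stick with a tunable
// deciding radius: loosened for menu navigation, tighter for gameplay.
public class StickToDpad {
    public enum Direction { NONE, UP, DOWN, LEFT, RIGHT }

    public static Direction decide(float x, float y, float decideRadius) {
        // No direction until the stick leaves the deciding area.
        if (x * x + y * y < decideRadius * decideRadius) {
            return Direction.NONE;
        }
        // The dominant axis wins, so a slight upward push during a
        // left-to-right flick does not register as UP.
        if (Math.abs(x) >= Math.abs(y)) {
            return x > 0 ? Direction.RIGHT : Direction.LEFT;
        }
        // Android reports AXIS_Y as negative when the stick is pushed up.
        return y < 0 ? Direction.UP : Direction.DOWN;
    }
}
```

A menu screen might call `decide(x, y, 0.5f)` with a loosened radius, while gameplay code would pass a smaller value such as `0.2f` so movement does not feel sluggish.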
This information is intended to round out your mental picture of input on Android. Since Google introduced “Project Butter,” the project to make the user experience “buttery” smooth, the Android frameworks hold input events until Vsync. This helps reduce stutter, but it can also introduce latency; in our testing, up to 16 ms as of this writing.
NVIDIA® GameWorks™ Documentation Rev. 1.0.200608 ©2014-2020. NVIDIA Corporation. All Rights Reserved.