Google is bringing Project Gameface, a project announced at Google I/O 2023, to Android. Project Gameface is an open-source, hands-free "gaming mouse" that lets users control a cursor with facial expressions and head movements. When Project Gameface launched, it was limited to desktop.
Now, Google is expanding the range of supported devices by bringing this technology to Android. Gameface on Android features a new virtual cursor that moves according to the user's head movement.
Project Gameface was originally made to help users with disabilities play games
Project Gameface was launched as a gaming mouse and is aimed at making computer games more accessible to people with disabilities. With this technology, users can control the cursor through mouth movements and perform click-and-drag actions by raising their eyebrows. Screenshots from Google show that users can set custom mouse bindings to different facial gestures.
Google says it used the Android Accessibility API to bring Gameface to life on Android. The Accessibility API supports various gesture and action events, such as GESTURE_SWIPE_UP_AND_RIGHT and GLOBAL_ACTION_BACK. Gameface currently supports a handful of these events, including GLOBAL_ACTION_HOME and GLOBAL_ACTION_BACK.
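To make the idea concrete, here is a minimal sketch (not Google's actual implementation) of how an app like Gameface might bind facial gestures to Android global actions. The gesture names and the FaceActionMap class are hypothetical; the integer values match the documented Android constants AccessibilityService.GLOBAL_ACTION_BACK (1) and GLOBAL_ACTION_HOME (2), hard-coded so the sketch compiles without the Android SDK.

```java
import java.util.HashMap;
import java.util.Map;

public class FaceActionMap {
    // Documented values of the Android Accessibility API constants,
    // inlined here so this sketch runs on the plain JVM.
    static final int GLOBAL_ACTION_BACK = 1;
    static final int GLOBAL_ACTION_HOME = 2;

    private final Map<String, Integer> bindings = new HashMap<>();

    public FaceActionMap() {
        // Example user-configurable bindings (hypothetical gesture names).
        bindings.put("RAISE_EYEBROWS", GLOBAL_ACTION_BACK);
        bindings.put("OPEN_MOUTH", GLOBAL_ACTION_HOME);
    }

    // In a real AccessibilityService this would call
    // performGlobalAction(actionId); here it just returns the action ID,
    // or -1 for an unbound gesture.
    public int resolve(String gesture) {
        return bindings.getOrDefault(gesture, -1);
    }

    public static void main(String[] args) {
        FaceActionMap map = new FaceActionMap();
        System.out.println(map.resolve("RAISE_EYEBROWS")); // 1 (back)
        System.out.println(map.resolve("OPEN_MOUTH"));     // 2 (home)
    }
}
```

On a device, the lookup result would be passed to performGlobalAction() inside the accessibility service's event loop; the map-based design mirrors the user-remappable bindings Google's screenshots show.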
The Android version offers similar functionality to its desktop counterpart
Since Gameface translates the user's facial expressions into cursor movements, it needs a clear camera feed. On Android, Gameface indicates when the camera is in use by showing an overlay of the camera feed. Google says this overlay remains visible at all times, even in the Android settings.
Just like the desktop version, Gameface on Android allows users to customize their experience by mapping different facial expressions to preferred inputs. Users can choose different facial expressions for common Android navigation tasks like selecting an item, going to the previous screen, or opening notifications.
To give users even more control, Project Gameface also supports a "drag function," which lets users perform more complex actions, such as swiping in different directions.
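As a hedged sketch of how such a drag or swipe could be synthesized: a swipe is a stroke from a start point to an end point, sampled into a series of timed touch points. On Android, the real mechanism is GestureDescription.StrokeDescription passed to AccessibilityService.dispatchGesture; this standalone version only computes the interpolated path, and the SwipePath class and its parameters are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

public class SwipePath {
    // Linearly interpolate `steps + 1` points from (x1, y1) to (x2, y2),
    // the kind of path a synthesized swipe stroke would follow.
    public static List<float[]> sample(float x1, float y1,
                                       float x2, float y2, int steps) {
        List<float[]> points = new ArrayList<>();
        for (int i = 0; i <= steps; i++) {
            float t = (float) i / steps;
            points.add(new float[] { x1 + t * (x2 - x1), y1 + t * (y2 - y1) });
        }
        return points;
    }

    public static void main(String[] args) {
        // A left-to-right swipe across a 1080-px-wide screen, sampled
        // into 5 points.
        for (float[] p : sample(100f, 800f, 980f, 800f, 4)) {
            System.out.printf("(%.0f, %.0f)%n", p[0], p[1]);
        }
    }
}
```

On a device, each sampled point would become part of the stroke's path, with the step count and total duration controlling how fast the swipe feels to the system.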