November 23, 2024
Google’s Project Gameface to Let Android Users Control Devices Hands-Free

Google has expanded Project Gameface, an open-source project aimed at making tech devices more accessible, to Android, where it can now be used to control the smartphone interface. The project was first introduced at Google I/O 2023 as a hands-free gaming mouse controlled through head movements and facial expressions, designed for people with physical disabilities who cannot use their hands or voice to control devices. Retaining the same core functionality, the Android version adds a virtual cursor that lets users control their device without touching it.

In an announcement on its developer-focused blog, Google said, “We’re open-sourcing more code for Project Gameface to help developers build Android applications to make every Android device more accessible. Through the device’s camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalised control.” The company also asked developers to use these tools to add accessibility features to their own apps.

Google collaborated with Incluzza, an Indian organisation that supports people with disabilities, to learn how the project's technology could be extended to other use cases, such as typing a message or looking for a job. The team used MediaPipe's Face Landmarks Detection API and Android's accessibility service to create a new virtual cursor for Android devices. The cursor follows the user's head movements, which are tracked using the front camera.
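For developers exploring the open-sourced code, the flow is roughly: the front camera feeds frames to MediaPipe's face landmarker, which returns head pose and blendshape scores that drive the cursor. Below is a minimal sketch of setting up the landmarker with the MediaPipe Tasks vision library for Android; the model file name and the onFaceResult callback are illustrative placeholders, not Gameface's actual code.

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Set up a face landmarker that runs on the live camera stream and also
// outputs blendshape scores -- the 52 facial-gesture values described below.
fun createLandmarker(
    context: Context,
    onFaceResult: (FaceLandmarkerResult) -> Unit
): FaceLandmarker {
    val options = FaceLandmarker.FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                .setModelAssetPath("face_landmarker.task") // bundled model asset (placeholder name)
                .build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)  // process camera frames as they arrive
        .setOutputFaceBlendshapes(true)           // enable the per-gesture scores
        .setResultListener { result: FaceLandmarkerResult, _: MPImage ->
            onFaceResult(result)                  // head pose + blendshapes for each frame
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}
```

Each camera frame is then submitted to the landmarker with detectAsync(image, timestampMs), and the listener's result carries the landmark positions an app could use to place the virtual cursor.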

The API recognises 52 facial gestures, including raising an eyebrow, opening the mouth, and moving the lips. These gestures can be mapped to a wide range of functions on an Android device. One interesting feature is dragging, which users can employ to swipe the home screen. To create a drag effect, users define a start and an end point: for example, opening the mouth to begin, moving the head to move the cursor, and closing the mouth again once the end point is reached.
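As a rough illustration of how such a drag could be wired up, the sketch below pairs one of MediaPipe's blendshape names ("jawOpen", the mouth-open score) with Android's standard AccessibilityService.dispatchGesture API. The threshold value and state handling are assumptions for illustration, not Gameface's actual logic.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path

// Illustrative threshold; Gameface lets users tune gesture sensitivity.
const val JAW_OPEN_THRESHOLD = 0.4f

class DragController(private val service: AccessibilityService) {
    private var dragging = false
    private var startX = 0f
    private var startY = 0f

    // Call once per frame with the "jawOpen" blendshape score and the
    // current virtual-cursor position.
    fun onFrame(jawOpenScore: Float, cursorX: Float, cursorY: Float) {
        if (!dragging && jawOpenScore > JAW_OPEN_THRESHOLD) {
            dragging = true               // mouth opened: mark the start point
            startX = cursorX
            startY = cursorY
        } else if (dragging && jawOpenScore < JAW_OPEN_THRESHOLD) {
            dragging = false              // mouth closed: end point reached
            dispatchDrag(cursorX, cursorY)
        }
    }

    // Replay the start-to-end movement as a swipe via the accessibility service.
    private fun dispatchDrag(endX: Float, endY: Float) {
        val path = Path().apply {
            moveTo(startX, startY)
            lineTo(endX, endY)
        }
        val stroke = GestureDescription.StrokeDescription(path, 0L, 300L) // 300 ms swipe
        val gesture = GestureDescription.Builder().addStroke(stroke).build()
        service.dispatchGesture(gesture, null, null)
    }
}
```

Routing the synthetic gesture through an accessibility service is what lets the swipe land on any app, since the service injects input at the system level rather than into a single app's views.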

Notably, while the technology is available on GitHub, it is now up to developers to build apps with it so that these accessibility features reach users. Apple also recently introduced a feature that uses eye tracking to control the iPhone.

