People with tetraplegia have partially or completely lost the ability to move their arms and legs. We want technologies that help these patients interact with their surroundings. Fortunately, many of them can still move their tongues on command. Researchers have therefore asked whether patients can use their tongues to control their environment. Prior approaches place sensors inside the mouth to track tongue motion. For example, Tongue Drive uses magnetic sensors inside the mouth and requires a tongue piercing. Work by Scott Saponas avoids the piercing but still requires infrared sensors in the mouth. We ask whether we can detect tongue movements without placing any sensors inside the mouth at all.

TongueSee is a novel non-intrusive tongue gesture system that requires no surgical procedures and no sensors inside the mouth. Instead, TongueSee uses EMG sensors at physiologically informed positions on the lower face and neck to detect the minute muscle activity produced when users move their tongues. We have built and tested a prototype that detects six tongue gestures (left, right, up, down, protrude, and rest) with accuracies above 90%. We have also designed a start gesture that significantly reduces the system's false positive rate.
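To make the pipeline above concrete, here is a minimal sketch of windowed EMG gesture classification. The feature set (per-channel RMS) and the nearest-centroid classifier are stand-in assumptions for illustration; TongueSee's actual features and classifier are not detailed in this summary.

```python
import math

# The six gestures the prototype distinguishes (from the text above).
GESTURES = ["left", "right", "up", "down", "protrude", "rest"]

def rms_features(window):
    """Per-channel root-mean-square of a window of raw EMG samples.

    `window` is a list of channels, each a list of samples. RMS is a
    common, simple EMG amplitude feature (an assumption, not
    necessarily what TongueSee uses).
    """
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def classify(features, centroids):
    """Return the gesture whose training centroid is closest in feature space.

    `centroids` maps gesture name -> feature vector learned from
    labeled examples (a hypothetical training step not shown here).
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda g: dist(features, centroids[g]))

# Toy two-channel centroids, made up purely for illustration.
centroids = {"rest": [0.1, 0.1], "left": [0.9, 0.2]}
window = [[0.8, -0.9, 1.0], [0.2, -0.1, 0.2]]
print(classify(rms_features(window), centroids))
```

In a real system the classifier would only run after the start gesture is recognized, which is how a design like the one described above can suppress false positives from everyday jaw and neck activity.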

Check out the website: [Project Website]

  • Tongue Machine Interface
    Qiao Zhang, Shyamnath Gollakota, Ben Taskar, and Raj P. N. Rao. Tongue Machine Interface. CHI, April 2014. [PDF]
