Would you shake your head at Adele, or turn your nose up at the Foo Fighters? Would two VIP seats at this weekend’s U2 gig at Twickenham, offered on your favourite mobile ticketing platform, get you nodding your head, or would a couple of cheap standing tickets for Leeds Festival do the trick?
Imagine if, rather than having to use your hands to swipe, pinch or tap, you could control your mobile phone simply through facial gestures.
The idea might sound far-fetched, but researchers at the Fraunhofer Institute for Computer Graphics Research IGD in Rostock, Germany, have been evaluating which control concepts could supplement the conventional ways of operating mobile devices, such as touch and voice control.
According to the Phys.org science news website, the researchers found that EarFieldSensing (EarFS), a proprietary development that recognises facial gestures via a special ear plug, shows potential and offers scope for further development. With hands sometimes full of shopping bags, and some voice recognition services hampered by language barriers or social surroundings, the research team believe that operation via head and face gestures, such as winking, smiling or nodding, could be a useful alternative.
According to the research team, users would need to wear a special ear plug that measures the muscular currents and the distortions of the ear canal that occur during facial movements.
The Phys.org report added: “The sensor detects even the smallest movements in the face through the way the shape of the ear canal changes and measures muscle currents that arise during the movement of the face or head.”