A teenage freshman at Stanford has a project that uses Google Glass to help people with autism read another person’s emotions. Autism is characterized by deficits in social interaction and in emotional and non-verbal communication.
According to a report in Medical Daily:
Imagine, if you will, not having the ability to read a person’s emotions — not being able to see upturned lips and immediately register, smile. To your emotive self, that possibility seems bleak at best. But it’s one of the principal challenges facing people with autism spectrum disorder (ASD). And it’s a challenge that one Stanford University student has decided to overcome with the help of Google’s latest technology, Google Glass.
A lot of people have much to gain from Google Glass, as the possibilities of the retro-futuristic eyewear seem almost limitless. Catalin Voss, an 18-year-old from Heidelberg, Germany, is testing those limits in a new project called Sension. Co-founded with Jonathan Yan during the pair’s freshman year at Stanford, Sension will allow people with ASD to lock in on a person’s face and let Google Glass determine that person’s emotion via the device’s camera.
Using eye-tracking technology to diagnose autism isn’t new. Doctors frequently test a baby’s gaze to see how quickly it responds and whether it can comfortably settle on another person’s eye region — a point of difficulty for many with ASD.
Catalin Voss is a pretty impressive guy in most regards.
Autism is a growing issue, and this is one application of Google Glass that we can thoroughly endorse. Watch for Voss in the future.