Emotion detection with Google Glass
Google Glass is a wearable computer shaped like a pair of glasses. Google has been developing the product for quite some time. No release date has been announced, but rumours suggest it might launch in the last quarter of 2014. Though Google Glass is not on the market yet, several applications (Glassware) have already been released. One of the latest, developed by the Fraunhofer Institute in Germany, is a real-time emotion detection application. It performs its analysis without using the cloud, keeping all data safely stored on the device. Besides analysing and detecting emotion, the application can also estimate someone's age and gender.
The technology behind the emotion detector is based on SHORE, which stands for Sophisticated High-speed Object Recognition Engine. SHORE started as an object-detection computer vision system and has since been implemented on several PCs and tablets. The video below shows how this system is used in Google Glass:
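SHORE's internals are proprietary, but the general idea of labelling a face with an emotion can be illustrated with a toy sketch. Assume a face-analysis step has already extracted a couple of normalised facial measurements per frame (the feature names, prototype values, and the nearest-centroid rule below are all hypothetical simplifications, not SHORE's actual method):

```python
import math

# Toy emotion classifier. We assume face analysis has already produced
# two normalised measurements per video frame:
#   mouth_curve: -1.0 (corners down) .. +1.0 (corners up)
#   eye_open:     0.0 (closed)       .. 1.0 (wide open)
# Each emotion gets a hand-picked prototype vector; a frame is labelled
# with the nearest prototype (nearest-centroid rule).
PROTOTYPES = {
    "happy":     (0.8, 0.6),
    "sad":       (-0.7, 0.3),
    "surprised": (0.2, 1.0),
    "angry":     (-0.5, 0.9),
}

def classify(mouth_curve, eye_open):
    """Return the emotion whose prototype is closest to the measurements."""
    def dist(label):
        proto = PROTOTYPES[label]
        return math.hypot(proto[0] - mouth_curve, proto[1] - eye_open)
    return min(PROTOTYPES, key=dist)

# Example: upturned mouth, moderately open eyes
print(classify(0.7, 0.5))  # -> happy
```

A real system like SHORE uses trained models over many more facial features, but the core step is the same: reduce a face to a feature vector, then map that vector to the best-matching emotion label.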
The possibilities for this technology seem endless. For example, it could help people with disorders like autism, who have trouble reading emotions and interpreting facial expressions. With this technology they would be able to tell what emotion the other person is projecting and adjust their responses accordingly. It could also be of major assistance to people with conditions like face blindness (prosopagnosia): emotion and face detection could help them know who they are speaking to. It could likewise support intercultural communication by bridging language barriers in expressing emotions. Once refined for different cultures around the world, it could become a supporting piece of technology for politicians or aid workers in foreign countries.
Even though this technology seems promising, there are a few risks associated with it. The most important is probably privacy. An app like this could expose data about whom people are talking to and how they are feeling. Though the Fraunhofer Institute chose not to implement these capabilities in this application, the public should be aware of such concerns. Another concern is the interpretation of emotions. Suppose, for example, that you are walking down the street, afraid of being mugged. If your Google Glass signals that an approaching person is angry or looking at you aggressively, it might make that person seem like a mugger. Interpreting emotions without any context (Why is this person angry? Is the aggression even directed towards you?) could be a dangerous development.
Emotion detection is an application that could improve several aspects of your life and assist people with disorders, but it should not be used mindlessly. How do you feel about developments like these? Would you, for example, use this technology when it becomes available to the public? Would it be pleasant or unpleasant to always know how the other person is feeling? Or to have the people around you always know how you feel?
You can also try it for yourself: http://face.sightcorp.com/demo_analysis_display/