What Happens When Artificial Intelligence Can Read Our Emotions in Virtual Reality

Being surrounded by machines that understand our emotions is one of those ‘what ifs’ that is a little creepy even to think about. But don’t be surprised: thanks to technological advances, we will reach that future sooner or later. The question is how.

How does a machine ‘sense’ our emotions?

At Apple’s September keynote, the iPhone X showed off its slick design to the world for the first time, and iPhone lovers couldn’t help but shout “hooray!” with enthusiasm. What unexpectedly caught people’s eyes, among other features, was Animoji: a dozen animated emoji that mirror users’ facial expressions and can be shared with others. Animoji is certainly fun, but what does it really mean for our communication in a digital world?

Nowadays, an overwhelming amount of human-to-human communication happens every second via different digital platforms, yet it is quite often devoid of the essence of human nature: emotion. To enrich machine-mediated communication, many tech giants are spending a great deal of time and effort on finding sensors that can empower digital machines to interpret our emotions. For smartphones, at least, since we take pictures and talk on the phone on a daily basis, it comes naturally to engineers to use the camera (facial recognition) and the microphone (virtual assistants such as Siri, Google Assistant, and Amazon Alexa) to ‘sense’ our emotions.
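To make the camera half of that idea a bit more concrete, here is a minimal sketch in Python. The face detection uses OpenCV’s stock Haar-cascade detector, which is a real, widely available API; the classify_emotion function is a hypothetical stand-in for whatever trained model a vendor would actually ship, included only for illustration.

```python
# Minimal sketch: camera-based emotion "sensing".
# Face detection uses OpenCV's bundled Haar cascade (real API);
# classify_emotion is a hypothetical placeholder, not a real library call.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_pixels):
    # Placeholder: a real system would run a trained classifier
    # (e.g. a CNN over the cropped face) and return a label.
    return "neutral"

def sense_emotions(frame):
    # Haar cascades work on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Crop each detected face and ask the (placeholder) model for a label.
    return [classify_emotion(gray[y:y + h, x:x + w]) for (x, y, w, h) in faces]

# Usage: grab one frame from the default webcam and print any labels found.
capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    print(sense_emotions(frame))
capture.release()
```

The microphone side would look analogous: record audio, extract features, and feed them to a model trained to map vocal cues to emotional states.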

What about in VR?

Facebook Social VR

Social virtual reality (VR) is an emerging digital platform that offers a virtual space where people, represented by avatars, can interact with one another. But how do we add an emotional texture to VR? That question brings us to the Massachusetts Institute of Technology (MIT) Media Lab.

