Virtual reality (VR) devices can immerse you in an experience quite different from your immediate surroundings. One challenge is finding ways to interact with others, especially within a computer-generated world. Researchers at the University of Southern California and Oculus (the VR goggle maker recently acquired by Facebook) are working on a solution. As reported by MIT Technology Review, they have developed a system that reads your facial expressions and displays them on a three-dimensional avatar within a computer-generated setting.
The system uses a 3D camera to track the mouth and lower face, while strain gauges track movements around the eyes and forehead. The two data streams are combined to create facial expressions that can be applied to a human avatar or other character.
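To make the "combine two sensor streams" idea concrete, here is a minimal sketch of how such a pipeline might merge lower-face readings from a depth camera with upper-face readings from strain gauges into one set of avatar expression weights. All of the names, thresholds, and data fields below are my own hypothetical choices for illustration; the article does not describe the researchers' actual software or API.

```python
# Hypothetical sketch: merge two facial-tracking sources into one avatar expression.
# Field names, scaling factors, and readings are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class ExpressionFrame:
    """Blendshape-style weights (0.0 to 1.0) that a renderer could map onto an avatar's face rig."""
    jaw_open: float = 0.0
    smile: float = 0.0
    brow_raise: float = 0.0
    eye_squint: float = 0.0


def from_depth_camera(landmarks: dict) -> ExpressionFrame:
    """Lower face: derive mouth and jaw weights from 3D camera landmarks (assumed millimeter gaps)."""
    return ExpressionFrame(
        jaw_open=min(landmarks.get("mouth_gap_mm", 0.0) / 25.0, 1.0),
        smile=min(landmarks.get("lip_corner_pull_mm", 0.0) / 10.0, 1.0),
    )


def from_strain_gauges(readings: dict) -> ExpressionFrame:
    """Upper face: derive brow and eye weights from strain gauges around the headset."""
    return ExpressionFrame(
        brow_raise=min(readings.get("forehead_strain", 0.0), 1.0),
        eye_squint=min(readings.get("orbital_strain", 0.0), 1.0),
    )


def combine(lower: ExpressionFrame, upper: ExpressionFrame) -> ExpressionFrame:
    """Merge the two sources: the camera owns the lower face, the gauges own the upper face."""
    return ExpressionFrame(
        jaw_open=lower.jaw_open,
        smile=lower.smile,
        brow_raise=upper.brow_raise,
        eye_squint=upper.eye_squint,
    )


if __name__ == "__main__":
    frame = combine(
        from_depth_camera({"mouth_gap_mm": 12.0, "lip_corner_pull_mm": 6.0}),
        from_strain_gauges({"forehead_strain": 0.4, "orbital_strain": 0.1}),
    )
    print(frame)  # one frame of expression weights, ready to drive an avatar
```

In a real system this merge would run every frame and feed the resulting weights to the rendering engine, but the split shown here (camera for the lower face, gauges for the upper face) mirrors the division of labor the researchers describe.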
In addition to gaming, the system could have uses in social settings, or even for collaborating with others at work. I find it particularly interesting as a possible tool for tele-medicine, especially for some forms of therapy. The patient and healthcare professional could “meet” in a virtual reality setting, and the facial expressions could encourage a more natural exchange in conversation. The immersive experience could help the healthcare professional keep the discussion focused on the intended topics, and the setting itself could be used to share information designed to help the patient understand their health condition and treatment.