Researchers continue to push skin sensor technology forward. We’ve written about skin sensor technology from many angles, including sensors drawn on the skin with ink, 3D printing sensors directly on the skin, energy-harvesting skin patches, haptic artificial skin that aids rehabilitation, and much more.

Rather than simply collecting data, a new project aims to help people communicate. MIT Media Lab researchers recently published a paper in Nature Biomedical Engineering describing skin sensor technology that helps people with amyotrophic lateral sclerosis (ALS) communicate after they have lost control of the muscles used for speech.

The MIT team designed a stretchable sensor that can detect facial expressions such as a smile, an open mouth, and pursed lips. Four aluminum nitride piezoelectric sensors embedded in a silicone film convert skin movement, or deformation, into measurable voltage changes. An algorithm running on a handheld device analyzes those changes and translates them into a library of words and phrases. In testing, the MIT scientists trained the sensor to distinguish facial movements and expressions with 87% accuracy in healthy subjects and 75% accuracy in two ALS patients. Because the skin sensor tracks strain on facial skin continuously, it has an advantage over other solutions, which measure the electrical activity of facial nerves using bulky equipment.
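The pipeline described above — a few voltage channels per expression, matched against trained patterns and mapped to phrases — can be illustrated with a minimal sketch. This is not the MIT team's algorithm; the template values, phrase mappings, and the nearest-template matching approach are all illustrative assumptions.

```python
import math

# Hypothetical calibration data: a mean 4-channel voltage signature (one value
# per piezoelectric sensor) recorded for each trained expression. These numbers
# are invented for illustration, not taken from the published paper.
TEMPLATES = {
    "smile":       (0.82, 0.35, 0.10, 0.55),
    "open mouth":  (0.15, 0.90, 0.70, 0.20),
    "pursed lips": (0.40, 0.12, 0.85, 0.65),
}

# Hypothetical phrase library mapping each expression to an utterance.
PHRASES = {
    "smile":       "Yes",
    "open mouth":  "I need help",
    "pursed lips": "No",
}

def classify(sample):
    """Return the expression whose voltage template is nearest (Euclidean
    distance) to the incoming 4-channel sensor reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda expr: dist(TEMPLATES[expr], sample))

# Simulated sensor frame: values close to the "smile" template.
reading = (0.80, 0.30, 0.12, 0.50)
expression = classify(reading)
print(expression, "->", PHRASES[expression])  # smile -> Yes
```

A real system would classify a continuous stream of strain readings rather than single frames, but the core idea — matching measured deformation signatures against trained expression templates, then looking up a phrase — is the same.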

Next steps include further device testing with additional ALS patients and determining whether the technology could also help track disease progression or treatment effectiveness. The MIT researchers have filed for a patent on the technology.