In my opinion, one of the largest challenges for interactive wearable devices of all sorts, but especially those for Health Tech applications, is the user interface. Touch screens are wonderful, but when you want to make a display as small as possible, they don't give the user much real estate to touch. This creates challenges for those with fat fingers (figuratively and literally) and poor fine motor control (such as the elderly). Speech recognition is certainly a viable alternative. Many people think gestures are the answer, but the problem with no-touch technology is that you have no tactile feedback to confirm that you’ve engaged with the interface.
At least, not until now. At CES 2015, I found a new company from England called Ultrahaptics, which has created an exceedingly clever technology. Using a matrix of ultrasound emitters, its system senses the location of your hand and projects tactile cues into the space where you can feel them as you gesture. The intensity of the feedback can be varied, so you can feel virtual shapes and textures in mid-air. (The process is invisible; the photo at the top is just a representation of what the experience can feel like.) You can feel it as you move objects or “touch” different regions to interact with an application running on a device.
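For the technically curious, the general principle behind this kind of mid-air haptics is acoustic phased-array focusing: each emitter in the grid is phase-shifted so that its wavefront arrives at a chosen point in the air at the same instant, concentrating ultrasonic pressure into a spot your skin can feel. The short Python sketch below illustrates that phase calculation only; it is not Ultrahaptics' actual design, and the grid size, emitter spacing, and frequency are my own assumptions.

```python
# Illustrative sketch of phased-array focusing for mid-air haptics.
# NOT Ultrahaptics' implementation -- the layout, spacing, and frequency
# below are assumed values chosen only to demonstrate the principle.

import math

SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
FREQUENCY = 40_000.0     # Hz, a common ultrasonic transducer frequency (assumed)
PITCH = 0.01             # m, assumed spacing between emitters in the grid


def emitter_positions(rows: int, cols: int, pitch: float = PITCH):
    """Return (x, y, z) positions of emitters laid out in a flat grid at z = 0."""
    return [(r * pitch, c * pitch, 0.0) for r in range(rows) for c in range(cols)]


def phase_offsets(emitters, focal_point):
    """Per-emitter phase advance (radians).

    Emitters farther from the focal point are advanced more, so every
    wavefront arrives at the focus in step, concentrating acoustic
    pressure into a localized spot the hand can feel.
    """
    wavelength = SPEED_OF_SOUND / FREQUENCY
    distances = [math.dist(e, focal_point) for e in emitters]
    d_min = min(distances)
    return [(2 * math.pi * (d - d_min) / wavelength) % (2 * math.pi)
            for d in distances]


if __name__ == "__main__":
    grid = emitter_positions(8, 8)                      # an 8x8 array of emitters
    phases = phase_offsets(grid, (0.035, 0.035, 0.15))  # focus ~15 cm above center
    print(f"{len(phases)} emitters, first few phases (rad): "
          + ", ".join(f"{p:.2f}" for p in phases[:4]))
```

Moving that focal point around, or modulating its strength over time, is what would let a system like this trace out shapes and textures in the air above the array.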
Having tried it in person, I can report that it’s a bit disconcerting at first because it’s so unlike anything else we normally experience. (Maybe putting your hand over a jet of blowing air comes closest to how it feels.) The system is still in its early stages of development; the company has an evaluation program for companies that might want to incorporate the technology. If it becomes sufficiently miniaturized and power efficient, I can see how you might have a sensor patch mounted on your wrist or some other spot. You could then issue commands to your wearable systems with gestures, receiving tactile feedback along with audible or visible confirmation. The possibilities are exciting.