Wearable technology comes in a variety of shapes, sizes, and levels of complexity. Perhaps the most complex are the powered prosthetic limbs that aim to replace the functions of human hands. We’ve written about other projects (such as the open-source HACKberry hand) in the past, but one of the key problems remains: how to control these devices.
Touch Bionics is a company that has developed mechanical prosthetic hands and fingers, but more than that, they have also created a powerful user interface. Their i-limb prosthetics rely on not one but four different interface technologies to let the user control the hand. Perhaps the most innovative feature is that the user can make gestures with the hand itself, which will trigger pre-programmed gripping actions or other responses. This means the user can interact with objects in a more natural manner. Another unusual way to trigger an action is the “grip chip.” This is a small token that you can put on or near an object; proximity sensors detect when the hand comes close to the chip, and the hand responds with a pre-defined movement. The hand can also respond to muscle-firing inputs (EMG) to trigger motions. Finally, the user can select a defined movement from a menu of choices on a smartphone to issue a command.
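To make the multi-modal idea concrete, here is a minimal sketch of how a controller might route the four input channels (gesture, grip chip, EMG, smartphone command) to pre-programmed grips. This is purely illustrative: every class, method, and grip name below is a hypothetical stand-in, not the actual i-limb API or firmware.

```python
from dataclasses import dataclass, field

@dataclass
class GripController:
    """Hypothetical dispatcher mapping input events to pre-programmed grips."""
    gesture_grips: dict = field(default_factory=dict)  # gesture id -> grip name
    chip_grips: dict = field(default_factory=dict)     # grip-chip id -> grip name
    current_grip: str = "open"

    def on_gesture(self, gesture_id: str) -> str:
        # A recognized hand gesture triggers its programmed grip.
        return self._apply(self.gesture_grips.get(gesture_id, self.current_grip))

    def on_chip_proximity(self, chip_id: str) -> str:
        # Coming within range of a grip chip selects the grip tied to that object.
        return self._apply(self.chip_grips.get(chip_id, self.current_grip))

    def on_emg(self, level: float, threshold: float = 0.6) -> str:
        # An EMG burst above threshold toggles between open and close.
        if level >= threshold:
            return self._apply("close" if self.current_grip == "open" else "open")
        return self.current_grip

    def on_app_command(self, grip: str) -> str:
        # The smartphone app issues an explicit grip command from its menu.
        return self._apply(grip)

    def _apply(self, grip: str) -> str:
        self.current_grip = grip
        return grip


# Example: one controller, four independent ways to reach a grip.
ctrl = GripController(
    gesture_grips={"double-tap": "pinch"},
    chip_grips={"mug-chip": "power"},
)
ctrl.on_gesture("double-tap")      # gesture selects "pinch"
ctrl.on_chip_proximity("mug-chip") # proximity selects "power"
ctrl.on_app_command("tripod")      # app overrides with "tripod"
```

The point of the sketch is the design choice the article describes: each modality is just another event source feeding the same grip table, so adding or removing an input channel doesn't change how grips are defined.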
This multi-modal user interface should give the wearer better control, with more convenience, across a wider range of daily activities. Rather than relying on a single solution, this approach gives the user more choices, which should make the device more useful.