Robotics in various forms has many applications in medicine and healthcare. Point of view (POV) is a helpful starting point for assessing potential uses of robotics. For example, when a physician at a workstation inserts a stent using robotic arms at a remote location, as with the CorPath GRX System, the application assumes the surgeon’s point of view. Wearable robots that assist with movement in rehabilitation or provide powered prosthetics operate from the patient’s POV.
Engineers and researchers at The University of Tokyo and Keio University are developing a remote robotic work system called Fusion that employs the point of view of a collaborator, perhaps better described as a “shared point of view.” Fusion is a wearable robotic system controlled by a remote operator. The wearable consists of a backpack with three extensions: a camera and two robotic arms. The operator uses an Oculus Rift virtual reality (VR) head-mounted display to see through the wearer’s camera, and handheld controllers direct the movements of the robotic arms.

The operator can use the robotic arms in three modes: directed, enforced, and induced. In directed mode, the operator uses the mechanical hands at the ends of the arms to accomplish tasks, in effect working alongside the wearer by adding two more hands. This mode can also be used to demonstrate tasks or to perform tasks beyond the wearer’s capabilities. In enforced mode, the robotic arms attach to wristbands and move the wearer’s arms; this mode could be helpful in physical therapy and rehabilitation, or to augment the wearer’s abilities. The induced mode also uses wristbands, but in this case to pull the wearer’s arms to induce walking.
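To make the three modes more concrete, the sketch below models how an operator-side control loop might dispatch the same handheld-controller input differently depending on the active mode. This is a minimal, hypothetical illustration: the mode names come from the article, but the data structures, function names, and command formats are assumptions, not the Fusion team’s actual interfaces.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    """The three operating modes described for Fusion (names from the article)."""
    DIRECTED = auto()   # robotic hands act alongside the wearer's own hands
    ENFORCED = auto()   # wristbands move the wearer's arms directly
    INDUCED = auto()    # wristbands pull the wearer's arms to induce walking


@dataclass
class OperatorInput:
    """Hypothetical snapshot of the remote operator's handheld-controller state."""
    left_target: tuple[float, float, float]   # desired left-hand position (meters)
    right_target: tuple[float, float, float]  # desired right-hand position (meters)
    grip_left: bool
    grip_right: bool


def step(mode: Mode, cmd: OperatorInput) -> dict:
    """Translate one frame of operator input into arm commands, per mode.

    Illustrative dispatch only; the real system's control interfaces and
    coordinate conventions are not described in the article.
    """
    if mode is Mode.DIRECTED:
        # Move the mechanical hands themselves; the wearer's arms stay free.
        return {"left_arm": cmd.left_target, "right_arm": cmd.right_target,
                "grippers": (cmd.grip_left, cmd.grip_right)}
    if mode is Mode.ENFORCED:
        # Arms attach to wristbands and guide the wearer's arms to the targets.
        return {"left_wrist": cmd.left_target, "right_wrist": cmd.right_target}
    # INDUCED: apply a pull on the wearer's arms to prompt walking.
    return {"pull_direction": cmd.left_target}


if __name__ == "__main__":
    frame = OperatorInput((0.3, 0.1, 0.9), (0.3, -0.1, 0.9), True, False)
    for m in Mode:
        print(m.name, "->", step(m, frame))
```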
The next steps for the Fusion development team include building a platform for remote collaborative work and skill learning. Without indulging in science fantasy, it may also be possible to design self-directed VR robotic assistance that helps people learn and practice new skills, and perhaps even perform all sorts of tasks they could never do unassisted.