HandMorph, a wearable exoskeleton for adults that simulates a child’s hand, is the latest project of University of Chicago computer science postdoctoral fellow Jun Nishida, who investigates human-computer interactions by creating novel devices that change human perception.
The device is a passive exoskeleton with no electronics, worn like a glove. In the palm of the glove is a smaller rubber hand in a skeleton of linkages that translate the user’s finger movements into movements of the rubber hand’s fingers. The user can then “pilot” the smaller hand as if it were their own.
HandMorph follows Nishida’s previous project, a wearable headset that allows the user to see from the height of a child. In an interview with The Maroon, Nishida said that the purpose of both projects is to investigate human perception. So far, his experiments have shown that changing a user’s visual perspective changes their perception of their own height and that changing a user’s grasp capacity changes their perception of size. By changing the sensory inputs of designers—such as product designers, architects, and user interface engineers—Nishida hopes to help them better understand the experiences of a child.
“The main contribution of my research is interaction design rather than new engineering techniques,” Nishida said. “Making an exoskeleton is kind of new engineering, but more important is how to use these techniques.”
In an experiment using HandMorph, participants were given a toy trumpet designed for children and challenged to improve its design. To aid in that process, each participant was given a HandMorph device and a fact sheet containing measurements of an average child’s hand. The study found that designers who primarily used HandMorph were more confident and produced designs with fewer flaws.
Creating HandMorph presented cross-disciplinary challenges. “This is a computer science project, but it’s about changing [the] sensory and physical functions of our bodies. We wanted to know what happens in our cognition, so I have to work often with psychologists,” Nishida said. He also had to analyze bodily kinematics—the physics of how the body moves—when creating the exoskeleton and employ motion capture to map finger motions. In creating HandMorph, Nishida collaborated with psychologists and neuroscientists at UChicago. The paper based on HandMorph won the Best Paper award at the Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology (UIST) in October 2020.
Nishida hoped to make HandMorph a device that allows people to directly experience the sensations of others. “It’s a very human-centric design approach. The wearers should feel that these controls are part of their body,” he said.