There's no end to the illusions that can be created by your smartphone or tablet. Swiping a finger across a screen can turn the pages of a book, strum a chord on a guitar or guide a harried explorer through treacherous jungle ruins. But no matter what appears to be happening visually and aurally, laments Katherine Kuchenbecker, "it always feels like a piece of glass."
Kuchenbecker, an assistant professor in UPenn's Mechanical Engineering and Applied Mechanics Department, is working to change that. Pick up a stylus in her lab and start scribbling in a drawing program — the texture feels like pencil on paper, not plastic on glass. Switch your virtual pencil to a virtual pen and the texture changes accordingly. Before long, she says, you won't need the stylus — start punching buttons on your phone and you'll feel their rounded contours, not the flat surface that's actually in front of you.
Part of Penn's General Robotics, Automation, Sensing and Perception (GRASP) Lab (most famous lately for those robocopters flying around a TED conference), Kuchenbecker directs a group focused on haptics, the science of touch and its relation to human-computer and human-machine interaction. Applications range from simulators for training surgeons to remote telerobotics systems for bomb defusal to corrective physical-therapy devices. Students in her haptics class have also designed more entertaining variants, like a vest that approximates the sensation of being shot or burned while playing a first-person shooter, or a wrist cuff that prods your arm when your golf swing goes astray.
"Humans are inherently multisensory creatures," Kuchenbecker explains. "You look with your eyes, you listen with your ears, you taste and smell and you touch. I think the sense of touch is underrated, and not understood as well as vision and hearing and the other senses, but it was the first sense to evolve. The very first organisms that evolved could tell if they were bumping into something and needed to move in a different direction long before they had eyes or ears."
Much of the inspiration for Kuchenbecker's interest in robotic teleoperation systems came from her father, a surgeon who learned his craft the old-fashioned way — by operating on real patients. The tools that she and her Haptics Group are working on would allow novice surgeons to practice under less high-pressure circumstances and skilled surgeons to carry out delicate operations less invasively.
"The problem with that is that you're operating with long instruments that are like chopsticks, basically," she says. "It's really hard to be dexterous, to make the same kind of graceful, complex manipulation movements that you would normally do during open surgery. Robotic-surgery technology connects your hands, via fancy joysticks, with the movements of long, thin robotic instruments so you can do complicated surgeries without needing huge incisions, but they don't give you any sense of touch. It's as though your hands are numb. My group has developed a new approach to letting you feel what the robotic tools are touching."
Kuchenbecker's group also works with autonomous robots, utilizing cutting-edge sensors to design robots that are more capable with their hands, able to manipulate objects with the agility and adaptability that humans take for granted. "Right now robots are used a lot in factories, in very controlled settings," she says. "But to operate in a home environment or anywhere they're interacting with humans and things change from day to day, the robot needs to be able to react and pick up an object even though it's never seen that object before, it doesn't know how heavy it is, doesn't know if it's slippery or sticky, hard or soft, rough or smooth. We're working on technology to help the robot estimate what the object will feel like."
The key is haptography, or haptic photography. "We record what a person feels as they touch the real object, either clicking a real button or dragging a tool across a real texture, and use a computer program we've designed to analyze it and distill it out into the essence of that touch experience," Kuchenbecker explains.
"I call it a haptograph. And when you come and touch the screen of the computer, we play it, but it's not the same experience every time. If I push harder or I push softer, or move faster or slower, we have to generate the right sensation. If we played the same vibration all the time, we would break the illusion."
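Kuchenbecker doesn't spell out the algorithm here, but the behavior she describes — replaying a recorded vibration signature while adapting it to how hard and how fast the user is touching — can be sketched roughly in code. The sketch below is purely illustrative: the function name, the linear force scaling, and the resampling-by-speed model are simplifying assumptions, not the Haptics Group's actual method.

```python
def synthesize_texture_vibration(base_waveform, force, speed,
                                 ref_force=1.0, ref_speed=1.0):
    """Adapt a recorded texture vibration to the user's current touch.

    A toy model of haptograph playback: pressing harder scales the
    vibration's amplitude, and dragging faster compresses the waveform
    in time, raising its frequency content. `base_waveform` is a list
    of samples recorded at a reference force and speed.
    """
    # Amplitude grows with how hard the user presses (assumed linear).
    scaled = [s * (force / ref_force) for s in base_waveform]

    # Moving faster shortens the playback window: resample the signal
    # with linear interpolation so its frequencies shift upward.
    ratio = speed / ref_speed
    n_out = max(1, int(len(scaled) / ratio))
    out = []
    for i in range(n_out):
        pos = i * (len(scaled) - 1) / max(1, n_out - 1)
        lo = int(pos)
        hi = min(lo + 1, len(scaled) - 1)
        frac = pos - lo
        out.append(scaled[lo] * (1 - frac) + scaled[hi] * frac)
    return out
```

In this simplified model, a light, slow stroke yields a quiet, low-pitched buzz, while a firm, fast stroke yields a louder, higher-pitched one — which is the point Kuchenbecker makes: replaying one fixed vibration regardless of the user's motion would break the illusion.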
Kuchenbecker seems equally thrilled by all of haptography's implications. "I delight in working on projects that might have the capability to improve people's daily life, whether that's through making a game more fun or helping them recover from an injury faster, helping them learn to do surgery better or to deliver better care to a patient in the operating room."