More Than a Feeling

Allison Okamura's engineering lab infuses robots with a sense of touch, for the sake of health.

By Michael Anft
Illustration by Mike Ciesielski


When he performs a life-saving operation, David Yuh puts his head inside a console, views a screen with a three-dimensional image of the heart, and moves a computerized "wrist" that sends commands to a robotic arm inside the patient. A scalpel at the end of that arm, under Yuh's command, will repair damage to the mitral valve and other areas of the heart.

Yuh, an associate professor of surgery and director of Johns Hopkins Medicine's minimally invasive cardiac surgery unit, can quickly rattle off the advantages of using the daVinci machine, a robot designed to assist surgeons who perform delicate procedures on the heart and prostate. Besides allowing him to move in small spaces, the robot eliminates the need to make a large incision and crack and cut through the sternum to work on the heart, Yuh says. Instead of doing such drastic damage to the body, the daVinci enables Yuh to thread a robotic arm through a small incision — about 6 to 8 centimeters. As a result, patients bleed less, recover much more quickly, and spend fewer days in a costly hospital bed than if they had been fully opened up. The daVinci also gives Yuh and other surgeons the ability to make minute movements, such as tying sutures or removing bits of damaged or abnormal tissue, that would be riskier if done by hand alone.

But there's one problem: Though the daVinci is outfitted with a camera arm so that surgeons can see what they are doing, they can't feel what's happening inside a patient's body. "It's like eating without your sense of smell," Yuh says. "You have to adapt to that."

Lacking the critical sense of touch when using the robot, surgeons tend to move more slowly, to worry more about breaking sutures or injuring healthy tissue — both of which can lengthen an operation and put heart patients at greater risk of bleeding, heart failure, and stroke. "The more time a surgeon spends being cautious, the more dangerous it is for the patient," Yuh says. "If a surgeon breaks a suture or inadvertently damages tissue, he has to take time to repair it. That can mean worse outcomes for patients."

According to a study concluded late last year and submitted to The Journal of Thoracic and Cardiovascular Surgery, doctors new to the daVinci would operate more safely if the machine were able to deliver sensory feedback — if it could replicate to some degree what surgeons would feel if they had their hands inside a patient. Doctors could move more confidently and comfortably, the study's authors say, and surgeons new to the daVinci could learn to use it more quickly.

This is where Allison M. Okamura comes in. An associate professor of mechanical engineering and director of Johns Hopkins' Haptics Exploration Laboratory, Okamura has made it her mission to figure out how to infuse robots with a human-like sensitivity to touch — and to help robot-assisted surgeons like Yuh practice safer medicine.

Okamura started the lab from scratch seven years ago, and it has since grown in both size and stature. The lab's researchers consult with surgeons and scientists from a variety of the university's medical and engineering departments on a multitude of projects: finding better ways to provide haptic information on video screens, replicating a sense of feel through sight; looking at how to refine the feedback that machines like the daVinci transmit to surgeons to make the robots less costly to use and to give surgeons precisely the feelings they need to operate; and creating "haptic scissors" that would send sensations to surgeons as they cut.

"They're really on the right track," Yuh says. "They understand that for a surgeon, the holy grail of haptics is to feel force as if he were using a regular, hand-held instrument. It's still years away, but it's on the horizon."

Haptics comes from the Greek haphe, meaning "of or relating to the sense of touch." Like our other senses, touch tends to be taken for granted. But for those who have lost their ability to feel, even simple tasks — like dialing a telephone, using a tool, or moving about — can become nightmarish.

When applied to machines, haptics usually involves feedback — augmenting robotic arms or hand-held devices with the ability to transmit sensation so humans can find more and better uses for them. For a futuristic example, think "the feelies" in Brave New World, Aldous Huxley's 1932 novel. In that sci-fi classic, characters who grabbed metal knobs on their chairs not only saw and heard movies, but felt them on their skin — right down to the fur of the bearskin rug on which a pair of actors made love.

Haptic feedback's real-world beginnings have a bit of a sci-fi twist as well: Humans who used robots to handle radioactive waste and materials in the 1950s were aided by haptic sensors that let them know if containers had been grasped properly. Robots that explored space and the bottom of oceans also delivered a rudimentary kind of haptic feedback.

It wasn't until the 1980s, however, that haptics research began to snowball. Since then, advances in computer technology have sped up the electronic systems needed to accurately and quickly transmit "feelings" from sensors to users.

"It's all motors and sensors now," says Okamura. "The mechanical side hasn't really improved, but with the development of the virtual world and the increase in computational speed, there's much, much more we can do."

In our brave new world, haptic feedback straddles the entertainment and medical realms. It's used to re-create touch in everything from video game joysticks to the BMW iDrive control knob, designed to help drivers adjust air conditioning, music, and the navigation system by feel. The control knob transmits haptic information to the hand so that, for example, as the driver moves from one temperature setting to another, she feels the knob kick back slightly with each adjustment. She can feel what she has done without taking her eyes off the road.
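As a rough sketch of how a detent effect like that might be computed, the Python below commands a torque that pulls a motorized knob back toward its nearest setting. The spacing, torque strength, and device names are illustrative assumptions, not details of the actual iDrive design.

```python
import math

DETENT_SPACING = math.radians(15)   # one setting per 15 degrees (assumed)
DETENT_STRENGTH = 0.05              # peak restoring torque, N*m (assumed)

def detent_torque(knob_angle: float) -> float:
    """Torque pushing the knob toward the nearest detent (setting).

    Between detents the torque resists motion, then pulls the knob
    into the next setting -- the slight kick the driver feels.
    """
    # Angle relative to the nearest detent, in [-spacing/2, spacing/2)
    offset = ((knob_angle + DETENT_SPACING / 2) % DETENT_SPACING) - DETENT_SPACING / 2
    # Sinusoidal profile: zero torque at each detent, maximal between them
    return -DETENT_STRENGTH * math.sin(math.pi * offset / (DETENT_SPACING / 2))

# Each control cycle, the knob's motor would be driven with this torque,
# e.g. motor.set_torque(detent_torque(encoder.read_angle()))  # hypothetical API
```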

With about 2,000 haptic feedback researchers in North America, the field is booming. At Hopkins, much of the work being done aims to help surgeons perform difficult procedures more efficiently and safely, or to invent an exoskeleton that would help people with nerve disorders feel again, or to create artificial limbs that replicate the functions of missing arms or legs.

Robotics, along with stem-cell research, is widely expected to drive much of the innovation in medicine in the coming decades. Mohsen Mahvash Mohammady, an assistant research professor at the Engineering Research Center for Computer-Integrated Surgical Systems and Technology (ERC CISST) at Johns Hopkins, and a fixture in the haptics lab, says that collaboration is the key to the lab's success. "Without a doctor's input, I would be able to develop a nicely controlled robot, but I wouldn't be able to incorporate what surgeons need," says Mohammady, who is working on developing haptic scissors, as well as finding the best ways to retrofit the daVinci with the most useful types of force feedback.

Surgeons have told Okamura that they need only a few of the seven degrees of freedom — the seven ways the daVinci's arm can move — to operate effectively. Eliminating the types of movement that doctors don't need would simplify the machine, she says, and do away with expensive parts, some of which cost thousands of dollars and must be replaced regularly.

In the back of the lab, where a daVinci surgical robot sits awaiting modifications, Lawton Verner does the hands-on work to make it more efficient. (Hopkins is the only academic research institution to feature a nearly complete system. There are about 400 daVinci machines worldwide.) The 26-year-old PhD candidate in mechanical engineering came to Hopkins from Nebraska, where he had worked with a surgeon to research the possibilities of creating a machine that could "read" what surgeons are doing by the movements they make.

[Photo: Carol Reiley maneuvers one robotic arm by moving another. Behind her, Verner wears a head-mounted display, a visual system used in some robot-assisted surgeries.]

These days, Verner looks to instill the daVinci with partial-force feedback — a system that takes the force surgeons feel in their hands from the robotic arm and limits it to what they actually need to feel to complete a procedure. "The more factors we figure out are important, the more we can take out, so there are fewer sensors and motor devices," says Verner. "It makes the machine less costly, as well as more sensitive to what the surgeon actually needs. As an engineer, part of my job is to not just make it a strong surgical tool, but to make it practical. Better tools lead to better outcomes. They reduce patient suffering and recovery time."
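A minimal sketch of the partial-force-feedback idea, under the illustrative assumption that surgeons need to feel only the lateral components of the tip force; the axis mask and names below are hypothetical, not the lab's actual design.

```python
import numpy as np

# Suppose experiments showed surgeons need only the lateral components
# (x, y) to avoid breaking sutures; the axial component (z) is dropped.
NEEDED_AXES = np.array([1.0, 1.0, 0.0])  # mask: keep x and y, discard z (assumed)

def partial_force(tip_force: np.ndarray) -> np.ndarray:
    """Reduce the sensed tip force to the components worth feeding back.

    Fewer fed-back components mean fewer sensors and motors on the
    console -- a cheaper machine that still conveys what matters.
    """
    return tip_force * NEEDED_AXES

# Example: a 3 N pull along the suture (x) plus 1 N of axial push (z)
print(partial_force(np.array([3.0, 0.0, 1.0])))   # -> [3. 0. 0.]
```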

The possibility of participating in such breakthroughs, along with some encouragement from Okamura, is what landed Verner at the haptics lab. Matching robots with haptics will become a major focus of his engineering career, he hopes.

"It's a really good way to affect medicine right now," Verner says.

During a lull in research one day over winter break, Okamura explains her work to a local radio station team on hand to tape a feature story. But the lack of an audible dimension makes it tricky: The robots Okamura and her students work with might save lives, but they don't bleep or blorp. They're not battlebots that pound and screech, or pet-like robots (list price: $129.95) that wheeze as they vacuum the linoleum.

A sound technician unsuccessfully tries to coax some decibels out of the robotic arm of the daVinci. The arm picks at a Petri dish full of plastic replicas of human tissue that look like polyps — its contents resemble something you might find growing at the bottom of a sophomore's refrigerator. As the technician fecklessly pokes around with his audio boom, Okamura stands in front of a board on which phrases like "redundancy algorithm" and "gravity compensation" stand out in blue, and translates into plain English some of the basics of haptic feedback.

"Robots are a physical way to use information," Okamura says. "And the daVinci is a large robot. The larger a robot is, the harder it is to get haptic feedback."

The public's fascination with robots has no doubt driven much of the media attention given to the lab; the formula looks something like this: "space age" + "cutting edge" = news. Okamura, who explains things clearly and in layman's terms, is getting used to being a media darling — a strange place for an investigator/engineer to be.

Russ Taylor, Engr '70, helped hire Okamura, now 34, to work with the center's other engineers on robots and information technology that could be used in minimally invasive surgery. Taylor is a professor of computer science at the Whiting School of Engineering and director of its ERC CISST, a consortium of universities that do high-level medical robotics work. The $30 million center, which is based at Hopkins, features scientists who develop "machines working with humans to do things neither could do alone," says Taylor, who is known as "the godfather of medical robotics" to his peers.

Intuitive Surgical Inc., a Silicon Valley company that developed the daVinci surgical system, chose Hopkins to develop haptics for it — in part because Taylor had patented some of the daVinci technology. The company also chose Hopkins over several innovative West Coast universities because of its medical reputation. "While it's true that there are many excellent schools nearby, we are attempting to provide a world-class solution to our customers' needs," says William C. Nowlin, senior director of research and software systems development at Intuitive Surgical. "We certainly view their work as high-caliber."

Because of Hopkins' pre-eminent role in medicine, it has become a natural landing place for engineers like Okamura and Taylor, and for research that marries engineering with care for patients. "I came here because of the center — the whole idea of surgeons and radiologists mixing with engineers excited me," Okamura says. "The quality of the people here and the interaction with the medical school allows us to do things that are unparalleled elsewhere." Then a 27-year-old wunderkind, she chose Hopkins over several other universities after receiving her PhD from Stanford.

The daughter of PhD chemists in Riverside, California, Okamura pursued physics as a high school student. But soon she began to look elsewhere for inspiration. "I liked the idea of physics, but after high school, I thought that if I got into physics I would be dealing with things that are too large — like the cosmos — or too small," she says. "I like working with things that are my scale. I like working with my hands."

During her undergraduate years at the University of California-Berkeley, Okamura began to design and build machines that performed tasks. By her first year of graduate school, she was doing research regularly. "The idea that you could get a robot to affect its environment in an intelligent way was fascinating to me," she says. At the time, engineers who wanted their machines to act on their own dominated robotics research. Okamura's doctoral thesis was built around creating a robot that would explore an object unknown to it, then determine what it was by its shape, weight, and texture. "It didn't do so well," she says.

Okamura chose her side in the ongoing debate in robotics — autonomous machines vs. ones designed to interact with humans — after working for Immersion Corporation, a San José company that develops haptic devices. "The frustration of doing autonomous things made me think I should go in a new direction," she says. "I was intrigued by the sense of touch."

Shortly after arriving at Hopkins, Okamura wanted to work on medical robotics but, she says, "I had no idea how to get started." Mentors from the ERC taught her how best to write grant proposals and integrate work on robotics with researchers from other disciplines. She apprenticed for two years under a host of Hopkins engineers. After a pair of cardiac surgeons — Yuh, and Vincent Gott, a since-retired professor of surgery — approached her in 2001 with their concerns about breaking fine sutures while performing delicate surgery, Okamura and crew were off and running.

The haptics lab gradually has built up the grants it receives from the National Institutes of Health, the National Science Foundation, and DARPA (a Pentagon-based research agency) to about $500,000 a year — enough to stay on the leading edge of medical haptics research.

"In order to succeed in a research field, you need a critical mass. Allison being here helped make that happen in haptics," says Taylor, citing the support the lab has received for its work.

"The idea that you could get a robot to affect its environment in an intelligent way was fascinating to me," says Okamura. As a result, student researchers like Carol Reiley, a 24-year-old PhD candidate in computer science from Vancouver, Washington, can apply her knowledge of information technology and engineering to the field. Reiley deals in "augmented reality," what might be called "virtual haptics" — visual representations of the landscape inside a patient that a surgeon has to work in. Her approach to giving doctors haptic feedback is one sense removed from the robot-centered one. Surgeons who use the system Reiley is developing won't receive force feedback through their hands. Instead, they'll get haptic information delivered to their eyes during surgery via flashing lights on a computer screen.

Like many other haptics lab students, Reiley works with the daVinci system, trying to eliminate its drawbacks. The sensors that relay the feedback from the daVinci's arm to the surgeon have to be sterilized, a process that causes them to lose their sensitivity after 10 or so uses. They must then be replaced — at a cost of as much as $6,000 each.

In Reiley's augmented reality system, surgeons can see how much pressure their surgical device is exerting on tissue — they can tell by watching the color of a quarter-sized dot on the monitor. Working with thresholds set by Yuh, Reiley applied the traffic-light standard to her system: Green means the device is applying little or no pressure; yellow, moderate pressure; and red, a surgeon is in danger of constricting a vessel.
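A minimal sketch of that display logic might look like the Python below; the article doesn't give the actual thresholds Yuh set, so the force values here are placeholders.

```python
LOW_FORCE = 0.5    # newtons -- below this, show green (assumed value)
HIGH_FORCE = 1.5   # newtons -- above this, show red (assumed value)

def pressure_color(force_newtons: float) -> str:
    """Map the force measured at the instrument tip to a dot color."""
    if force_newtons < LOW_FORCE:
        return "green"    # little or no pressure
    elif force_newtons < HIGH_FORCE:
        return "yellow"   # moderate pressure
    else:
        return "red"      # danger of constricting a vessel

# Each video frame, the on-screen dot would be repainted, e.g.
# dot.set_color(pressure_color(strain_gauge.read()))  # hypothetical API
```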

"Visual representation is still a form of haptics, but it's less direct," explains Reiley. "We thought about using auditory cues instead, but with all the noises you have in a typical operating room, it wasn't as attractive." Her system uses gauges that cost pennies instead of the pricey sensors. If she could come up with a way to sterilize the gauges, daVinci-aided operations would be much less expensive. Ideally, Reiley adds, she'd like her system to work with more machines than just the daVinci. Surgeons who have tried her system have broken fewer sutures. "It's clear that they're gauging the force correctly from the visual cues," Reiley says.

Okamura and crew are darlings not only of the media, but of the Hopkins development staff as well. Tours for alumni and prospective students are regular features at the haptics lab, where, despite its nondescript look and sound, the wow factor from playing around with touch-sensitive robots makes it a natural spot for the curious.

"The fact that the lab is hands-on and deals in robotics in general makes it attractive," says Rob Spiller, associate dean of development and alumni relations at the Whiting School. "The lab is an intersection of different disciplines working together to benefit society. We're proud of it."

Spiller says that his department gives at least 10 alumni tours of the haptics lab each year. What's more, Okamura has taken part in several national speaking tours, along with other Hopkins robotics professors. She leaves her audience happy, Spiller says. "They're impressed with the work the lab does, that it deals with life-and-death issues. But they're also impressed with Allison and her wonderful ability to make things understood without eliminating all the sophistication that comes with doing cutting-edge work," he adds.

That work will continue to grow, Okamura says, as we better understand how humans process haptic information. She has been exploring those questions as she collaborates with other Hopkins scientists, including Amy Bastian, associate professor of neurology, and Steven Hsiao, associate professor of neuroscience at the Zanvyl Krieger Mind/Brain Institute. Among the areas they are investigating: how humans experience texture, vibration, and shape; how we perceive softness; and whether it is possible to develop human models based on haptics. "It's important for me to know how humans use haptics to perform tasks, so I can apply that in my work," says Okamura, who adds that this research is still in its early stages.

She'd also like to figure out a way that robots could create a model of a patient's tissue, so haptic feedback could then be adapted to its density. And there's the ultimate hope of drawing up a haptic model of a person, so surgeons would know the nature of the patient's tissue, blood oxygenation level, and several other factors. "It would be great to have a haptic human," Okamura enthuses.

Okamura is also working with Nitish Thakor, a professor of biomedical engineering at Hopkins, to develop an artificial arm that would use sensors to replicate a sense of touch. Another approach involves creating vibrations in a foot to simulate the sensations made when a prosthetic finger runs over a texture. Thakor and Okamura are also looking at ways to infuse prosthetic devices with proprioception — the wearer's ability to know what position limbs and joints are in without looking at them.
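One way such sensory substitution might work, as a hedged sketch with assumed names and scaling rather than Thakor and Okamura's actual design: a signal from a fingertip sensor on the prosthesis drives a vibration motor worn on the foot.

```python
def vibration_drive(texture_signal: float, max_signal: float = 1.0) -> float:
    """Convert a fingertip texture reading to a motor amplitude in [0, 1].

    Rougher textures produce larger fluctuations in the sensor signal,
    which here translate directly into stronger vibration at the foot.
    """
    return min(abs(texture_signal) / max_signal, 1.0)

# In a real-time loop, something like:
# foot_motor.set_amplitude(vibration_drive(finger_sensor.read()))  # hypothetical API
```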

[Photo: Verner, Okamura, Reiley, and Mohammady outside the engineering lab.]

"Our lab has done research and deemed that proprioception is necessary" for people to use prosthetics successfully," says Okamura. "If you can judge where the arm and hand are using an intuitive method, then it will help them. Also, it will make them want to use the prosthesis."

Sometimes, lab investigators will veer slightly away from straight-on haptics research if they see a medical need they can address. One of the more intriguing cases involves creating a "steerable needle" that a surgeon could thread through a patient without damaging organ tissue or nicking blood vessels. By making the needle asymmetrical and steering it by its beveled tip, a surgeon could conceivably make it curve more accurately through the body — potentially a considerable aid in performing biopsies or injecting substances into tumors to slow or stop their growth.
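Steering by the bevel works because the asymmetric tip makes the needle trace a gentle arc as it is pushed forward; spinning the shaft flips the bevel and, with it, the direction of the arc. A toy two-dimensional sketch of that kinematics, with an assumed curvature value, might look like this:

```python
import math

KAPPA = 0.02   # curvature imposed by the bevel, 1/mm (assumed value)

def advance(x: float, y: float, heading: float,
            step_mm: float, bevel_up: bool):
    """Push the needle forward one step; the bevel makes the tip arc."""
    curvature = KAPPA if bevel_up else -KAPPA
    heading += curvature * step_mm            # tip turns as it advances
    x += step_mm * math.cos(heading)
    y += step_mm * math.sin(heading)
    return x, y, heading

# Insert 50 mm curving one way, flip the bevel, insert 50 mm more:
x, y, h = 0.0, 0.0, 0.0
for _ in range(50):
    x, y, h = advance(x, y, h, 1.0, bevel_up=True)
for _ in range(50):
    x, y, h = advance(x, y, h, 1.0, bevel_up=False)
print(f"tip at ({x:.1f}, {y:.1f}) mm")
```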

One of the admirers of Okamura and her crew's work on the steerable needle is Taylor. "It's an extremely promising technique, but they need to develop a theory as to how you model tissue density so a surgeon would know how to control the needle," he says. "Overall, it's a wonderful example of how to do important research into how you combine modeling with medical applications. There are only a few places in the world that do this kind of work."

As she churns out her latest round of grant proposals to investigate some of these new areas, Okamura looks forward to moving the lab to a larger, state-of-the-art facility. This fall, the $36 million Computational Science and Engineering Building will open on the new Decker Quadrangle. "It'll give a huge shot of adrenaline to our whole robotics group," Okamura says of the lab space-to-be.

"It's just too bad they rejected my idea for a fire pole from my office down to the lab."

Baltimore-based freelancer Mike Anft wrote about political science professor Matt Crenson in the November issue of Johns Hopkins Magazine.

