Johns Hopkins Magazine - September 1996 Issue

Sensory Surrogates

By Melissa Hendricks
Diving for a Golden Fleece

In the undersea mountain range known as the Mid-Atlantic Ridge, nature rivals the best creations of Jules Verne. Hot springs belch black smoke, and volcanoes shoot lava into the cold and dark water. Giant clams and tube worms the size of garden hoses blanket the seafloor. Primitive bacteria survive the lack of sunlight by getting energy from sulfur.

Into this otherworldly scene ventures another strange creature. It looks like a whale and flies like a helicopter. It is a robot named Jason.

Jason has the equivalent of the eyes, ears, and hands of a skilled and intrepid human explorer--and it never gets the bends. The robot's sonar "ears" navigate the ocean floor. Its camera "eyes" see stark black hills of lava, undulating sea grass, and temples of sulfur. Its arm scoops up chunks of the sea floor and captures a rasher of tube worms. In addition, more than a dozen sensors monitor the robot's latitude, longitude, pitch, roll, yaw, pressure, and acceleration. All the information gathered by Jason is fed back to a computerized control system--a.k.a. Jason's brain, of which Louis Whitcomb is the chief architect.

An assistant professor of mechanical engineering at Hopkins's Whiting School of Engineering, Whitcomb is also designing a new robotic arm for industrial use.

"Our goal is to build devices that extend human capabilities," says Whitcomb. "Basically, I build robots that go into places you or I don't want to be." In addition to conducting scientific research, underwater robots inspect and maintain piers, dams, and tunnels, which make them ideal for use by oil companies.

Jason, which is overseen by the Woods Hole Oceanographic Institution (WHOI), can dive as deep as 6,000 meters and stay submerged for several weeks at a time, says Whitcomb. In contrast, the best human divers can descend at most to about 100 meters; any farther and their blood becomes saturated with noxious gases. Funded by the National Science Foundation and the Office of Naval Research, Jason has been on dozens of scientific missions, including one to explore the sunken Lusitania (off the Irish coast) in 1993.

This summer, Jason was shipped via commercial liner from Woods Hole to Barbados, where it was placed aboard the science research vessel, the Knorr. The mother ship sailed to a site several hundred miles west of the Azores, where, 2,000 meters below, a 20-kilometer-long section of the Mid-Atlantic Ridge called Lucky Strike lay awaiting Jason's inspection.

The Ridge is like the Grand Central Station of the seafloor. Running the length of the Atlantic, it is the seam where tectonic plates pull apart--the "zipper" of the Atlantic, as Whitcomb likes to say. It bustles with marine life and geological activity.

Lucky Strike and other deep sea hydrothermal systems are as enticing as candy to geologists and biologists. "How do they form?" says Whitcomb. "As soon as they form, they're teeming with life. How?" It was Jason's task this summer to collect fluids, samples of seafloor, and organisms to help scientists answer these questions and to further work begun on previous missions. Compared to what scientists know about terrestrial life and geology, they have barely scraped the surface of this undersea world. In the past 20 years, scientists working with Jason and other vehicles have identified dozens of new species, such as the tubeworm Oasisia alvinae (named after the Alvin) and the shrimp Rimicaris exoculata, living on or near the vents.

Roughly the size of a Volkswagen Beetle and the shape of a whale, Jason is driven by seven thrusters similar to those found in trolling motors used in freshwater fishing. It is tethered to a sea-surface vehicle, which in turn is tethered to the mother ship, by a nine-kilometer-long cable. The cable supplies Jason's power and transmits the data the robot collects back to a mission control station on the Knorr. Inside mission control, a pilot steers Jason by manipulating a joystick, and computers and video consoles display data and real-time scenes of the seafloor as they appear through Jason's cameras. (Part of Jason's brain is technically in mission control, and part is in the robot itself, notes Whitcomb.)

Jason usually has numerous assignments on its agenda, says Whitcomb, who helped navigate Jason on a previous expedition. The robot cruises along the bottom, punching out samples of the sea floor with a tool known as a "cookie-cutter" and sucking up marine organisms with its "vacuum cleaner." It deposits these finds in a basket. When a basket is filled, a member of Jason's team triggers an acoustic release mechanism, and the basket rises to the surface like a hot air balloon.

In order to navigate the robot, team members use sonar to figure out where the robot is in relation to the ocean floor. The team anchors transponders, each emitting a different frequency, to the seafloor and maps their geographic positions. As Jason flies over the network, it emits sound waves at a frequency of 9 kilohertz. When a transponder "hears" the signal, it responds by emitting sound waves of its unique frequency. To compute the distance to a particular transponder, Jason's brain halves the measured round-trip time to get the one-way travel time, then multiplies it by the speed of sound in water--roughly 1,500 meters per second.
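
In rough outline, the range calculation looks something like the sketch below--purely an illustration, with a nominal sound speed and invented names rather than anything from Jason's actual navigation software.

    # Illustrative sketch of the acoustic range calculation. The sound speed is
    # a nominal figure; the transponder's fixed turnaround delay and the bending
    # of sound rays in the water column are ignored here.

    SOUND_SPEED_M_PER_S = 1500.0  # rough speed of sound in seawater

    def slant_range(round_trip_s: float) -> float:
        """Distance in meters from the vehicle to one transponder.

        The vehicle pings at 9 kilohertz, the transponder answers on its own
        frequency, and the elapsed time is a round trip: halve it for the
        one-way travel time, then multiply by the speed of sound.
        """
        one_way_s = round_trip_s / 2.0
        return one_way_s * SOUND_SPEED_M_PER_S

    # A reply heard 2.4 seconds after the ping puts that transponder
    # roughly 1,800 meters away.
    print(slant_range(2.4))  # 1800.0

With ranges to three or more of the surveyed transponders, the navigation computer can fix Jason's position over the seafloor.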

Despite Jason's abilities, says Whitcomb, in some ways the robot remains limited. "We would like to be able to say, 'That's an interesting rock, hold position on this rock,'" he says. If Jason's cameras could "lock onto" designated objects, marine biologists, for example, would be able to observe organisms in their natural environment over extended periods of time. But getting a robot to do that in an ocean is no simple task. "In the ocean, a stopped vehicle still moves. Just holding a position requires active input."

As a first step, Whitcomb, along with engineer Dana Yoerger and biologists Scott Gallagher and Cabell Davis of WHOI, has built a prototype of an improved computer vision system for Jason. The system is basically a computer program connected to Jason's cameras. When the robot is instructed to "hold station," the vision system keeps the camera's crosshairs locked onto a designated object. The program receives a new image of the target 30 times per second. If the object appears to be moving away from the intersection of the crosshairs, the system transmits a command to Jason's thrusters to shift direction, bringing the target back into alignment.
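
In skeleton form, such a station-keeping loop is a simple feedback controller: measure how far the target has drifted from the crosshairs, then command a correction proportional to that error. The sketch below is only an illustration of the idea; the gain and the names are assumptions, not the Woods Hole prototype's code.

    # Bare-bones illustration of "hold station": keep a tracked target on the
    # crosshairs by nudging the thrusters whenever it drifts. The gain and the
    # helper name are invented for this example.

    GAIN = 0.002  # thrust command per pixel of error (assumed value)

    def thruster_command(target_px, center_px):
        """Return (surge, sway) corrections that push the target back toward
        the image center. Called once per frame, 30 times a second."""
        error_x = target_px[0] - center_px[0]
        error_y = target_px[1] - center_px[1]
        # Proportional control: the farther the target has drifted off the
        # crosshairs, the harder the correction.
        return (-GAIN * error_x, -GAIN * error_y)

    # The tracker reports where the rock appears in the latest frame; the
    # controller turns that pixel error into a thruster correction.
    print(thruster_command((340, 250), (320, 240)))  # about (-0.04, -0.02)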

Whitcomb tested the prototype last summer off the dock in Woods Hole. Jason was asked to "hold station" on Ping Pong ball-size spheres that were anchored to the sea floor in a few meters of water. The system worked well--for a prototype, says Whitcomb. Next come tests on rocks, mud, and plankton. Says Whitcomb, "This is a great sandbox to play in."

The surgeon's "third hand"

"Zeroing the robot. Step aside," intones a monotonic, vaguely foreign voice. A black metal structure roughly the size and shape of an arm bent at the elbow and wrist rises from a vertical shaft. In the area where a hand would be, the device holds a surgical endoscope, a thin, foot-long lighted telescope used to view internal organs during the minimally invasive surgical procedure known as laparoscopy.

The various "joints" flex, extend, and rotate, one at a time, in a choreographed routine that is technically perfect but has nary an ounce of style. "It's going through its initialization to make sure it knows where all its joints are," says the robot's mastermind, Russell Taylor, professor of computer science at Hopkins' Whiting School of Engineering.

"Zeroing procedure completed," says the robotic voice. The robot is now ready to assist at laparoscopic surgery.

Taylor says the robot, whose name is LARS, "was built to act as a third hand for surgeons. It is not the surgeon. It is a smart surgical tool that extends the surgeon's sensory motor capabilities."

Designed by scientists at Hopkins and IBM, LARS can be commanded through a joystick or computer "mouse" to go left and right, up and down, and to zoom in on a field. So far, Hopkins surgeon Mark Talamini and Louis Kavoussi, chief of urology at the Hopkins Bayview Medical Center, have tested LARS only in the laboratory (though they have used another surgical robot, AESOP, on patients).

In conventional laparoscopy, surgeons insert long, thin cylinders into a belly-button-size incision in the patient. They then insert an endoscopic camera and surgical instruments into the cylinders, and operate while watching the video image transmitted by the camera. The procedure leaves only tiny scars, and its popularity has exploded over the past few years. The majority of gallbladder removals are now done through laparoscopy.

But laparoscopy is also a different sensory experience for the surgeon. Because surgeons are not placing their hands directly into a patient's body, they must rely less on their own tactile sensations and more on visual feedback from the camera. Robotics can give the surgeon even more feedback, and also make the technique more versatile, says Taylor. For example, in conventional laparoscopy, the surgeon generally relies on an assistant to manipulate the camera.

With LARS, the surgeon can control both the camera and the surgical tools. To change the camera angle, the surgeon simply points and clicks with a mouse mounted on the surgical instrument, or with a joystick. In addition, says Kavoussi, LARS and surgical robots in general "are more accurate than people. They are more precise. They are untiring." Such improvements could one day translate into speedier recoveries for patients, says Kavoussi, who has studied the robot's accuracy. In one test, LARS dramatically outperformed surgeons at locating a test point.
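
In spirit, the point-and-click control amounts to converting a pixel offset in the endoscope image into a small reorientation of the scope. The sketch below is purely illustrative; the image size, field of view, and names are assumptions, not details of LARS.

    # Illustrative sketch of point-and-click camera steering: a mouse click in
    # the endoscope image becomes a small pan/tilt adjustment that recenters
    # the clicked point. Image size and field of view are assumed values.

    IMAGE_WIDTH_PX = 640
    IMAGE_HEIGHT_PX = 480
    FIELD_OF_VIEW_DEG = 70.0  # assumed horizontal field of view of the scope

    def pan_tilt_for_click(click_x, click_y):
        """Angles in degrees to pan and tilt the scope so the clicked point
        moves to the center of the image."""
        deg_per_px = FIELD_OF_VIEW_DEG / IMAGE_WIDTH_PX
        pan = (click_x - IMAGE_WIDTH_PX / 2) * deg_per_px
        tilt = (click_y - IMAGE_HEIGHT_PX / 2) * deg_per_px
        return pan, tilt

    # Clicking 64 pixels to the right of center asks for a 7-degree pan.
    print(pan_tilt_for_click(384, 240))  # (7.0, 0.0)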

Due to its precision, says Kavoussi, LARS or a future generation of the system will be an excellent tool for draining abscesses or conducting chest or kidney biopsies--procedures that require very precise placement of a needle.

In the future--"only after additional development and rigorous testing," stresses Taylor--LARS may also be used to shrink cancerous tumors by injecting radioactive seeds directly into them. A doctor would first review a set of CT scans of the patient's liver, use a computer to decide where to implant the radioactive seeds, and then program LARS to home in on those spots, Taylor explains. Doctors currently perform this procedure, says Taylor, but with less than bull's-eye accuracy. "We believe a robot will be able to deliver a pattern of these radiation seeds much, much more consistently than a human can do it."

Melissa Hendricks is the magazine's senior science writer.

