Nobody put a gun to Beppino Giovanella's head and said, "Take this or else." It was only his desire to test his own theory that made the biologist swallow a gelatin capsule containing 100 milligrams of an experimental cancer drug, then wash it down with orange juice. That and his belief that he should be the first person to try it.
Giovanella had already demonstrated that the drug, a camptothecin derivative, had no untoward side effects in animals. It was time to test it in humans. But the jump from animals to people can bring surprises, says Giovanella, who has collaborated with Hopkins investigators on several research projects and is director of laboratories for the Stehlin Foundation in Houston. "As a biologist, you become acutely aware that drugs at times act very differently from one species to another," he says. "That is why I always test new drugs on myself first. It wouldn't be very nice to risk another person before I risk myself."
Like a modern-day Dr. Jekyll, Giovanella chose himself as a guinea pig. But instead of the hirsute Mr. Hyde, he ended up with a serious case of alopecia--he became as bald as an egg. His white cell count also plunged to half its normal level, and his blood platelets dropped dramatically. Giovanella is light-hearted about the experience. "For a time I was going around like Kojak," he says, in an accent as rich as his native country's Parmesan cheese.
By ingesting camptothecin, Giovanella was carrying out a scientific tradition that dates back hundreds of years: scientists as the ultimate navel-gazers, selecting themselves as their own experimental subjects. Their reasons include ethics, economics, curiosity, impulsiveness, and even a guarantee of subject compliance. As Giovanella says, "I have me with myself all the time."
Fortunately for Giovanella, his self-experiment left no permanent injury. His white cell and platelet counts quickly recovered. His hair is now wavy and even thicker than before. Colleagues joked that he'd discovered a new anti-balding tonic. Best of all, Giovanella's experiment yielded valuable scientific information: that doses proportional to those that work in dogs are too much for people. The fact that he lost his hair, he concluded, indicated that camptothecin remains in the body much longer in humans than it does in dogs. The 100-milligram dose was too large. A 4-milligram dose would be prescribed today.
Partly as a result of his self-experiment, today the drug is in phase II clinical trials. Just as important, says Giovanella, he can now speak from personal experience to patients who are candidates for the drug. "I feel very confident in saying, 'Take it. It's not too much toxic.'"
Today, an investigator like Giovanella, who has tested several drugs on himself without seeking the formal approval of his institution, is a rare bird and a bird who risks a reprimand. (The Stehlin Foundation has no official policy on self-experimentation, says Giovanella.) The nature, as well as the degree, of self-experimentation is becoming more carefully scrutinized and more regulated. At scientific institutions that receive federal funding, such as Hopkins, investigators wishing to test drugs or devices on themselves must first jump through the hoops of an approval process.
In the early 1980s, the Johns Hopkins Medical Institutions became one of the first (if not the first) research institution to include a policy on self-experimentation in its guidelines on human subject research. The policy states that Hopkins investigators who want to volunteer for their own experiments should be treated no differently than other research volunteers. They must sign an informed consent form and go through the other approval steps required by the university's Joint Committee on Clinical Investigation (JCCI), which oversees research at the East Baltimore campus. The Bayview Campus's institutional review board (IRB) has the same policy. The policy's stated purpose is to protect the investigator from "taking unwarranted risks in the excitement of generating new knowledge," and to safeguard the process of clinical investigation "in an era when the process has articulate critics."
What drove the JCCI to enact the policy was a request by Henry Wagner, then director of nuclear medicine and radiation health, to experiment on himself. And what drove Wagner to seek approval for the experiment was a good healthy dose of scientific competition.
"It was like the Roger Bannister model of scientific investigation," says Robert Dannals, director of the PET Center and a member of the team that collaborated with Wagner. Every researcher in the country with access to a positron emission tomography (PET) scanner was racing to become the first to use it to visualize receptors in the brain. Such a technique would reap a plethora of insights into how neurons communicate, how drugs alter brain function, and even the mechanisms of diseases like schizophrenia and epilepsy.
While PET scanning was not new--scientists had been using it for a decade to visualize blood flow in the brain--no one had yet used it to visualize receptors for neurotransmitters like dopamine and adrenaline.
So after gaining JCCI approval, Wagner received an injection of a radioactively labeled drug called N-Methylspiperone, which, in theory, bound to dopamine receptors. Then he sat very still while a doughnut-shaped device around his head received the radioactive signal emitted from his brain.
The first results were a success. The PET scans revealed that the radioactive tracer lit up two regions of the brain, the caudate and the putamen, where in vitro studies had indicated dopamine receptors lay. Further, the experiment opened the door to a new branch of neuroscience. In thousands of PET studies at Hopkins since then, investigators have learned, for example, that the levels of dopamine receptors are elevated in schizophrenics.
Since Hopkins's policy on self-experimentation was established, the JCCI and the Bayview IRB have received only a couple of applications involving self-experimentation. One was from a team of pharmacologists who sought approval to test the effects of caffeine abstinence.
Then there is the question of what goes on behind closed doors. Do scientists conduct experiments on themselves without seeking official approval? "I suspect it happens on occasion," says Gary Briefel, an associate professor of medicine who is chairman of the Bayview IRB.
Science history is rich with stories of self-experiments, dating back at least as far as Sir Isaac Newton, who is believed to have sampled concoctions from his own alchemical experiments. (His hair shows high concentrations of arsenic, mercury, gold, and lead.) At Roosevelt Hospital in New York in the 1880s, surgeon William S. Halsted became addicted to cocaine while testing its use as an anesthetic. He remained addicted, first to cocaine and later to morphine, throughout an illustrious career at Hopkins Hospital. Pierre Curie strapped a radium pellet on his arm to test the theory that radium would burn skin. The first cardiac catheterization was conducted in 1929 by Werner Forssmann, by himself, on himself.
Hopkins's Philip Russell, a professor of international health at the School of Public Health and a retired director of the Walter Reed Army Institute of Research, helped develop vaccines for dengue fever, Japanese encephalitis, and other diseases during the 1960s through '80s. He used to try out his vaccines first as a matter of course. "If the individual isn't willing to take it, you don't have much faith in the underlying safety," says Russell.
More recently, in the 1980s, French researcher Daniel Zagury demonstrated the safety of an experimental AIDS vaccine by injecting it into his own arm. Australian microbiologist Barry Marshall gulped a teeming broth of Helicobacter pylori to prove that the bacterium causes gastric ulcer. And perhaps the most inexplicable case was the researcher who burned to know whether he could infect his own ear with cats' ear mites. When he had success in one ear, he tried the other.
One of the best-known self-experiments of the first half of this century involved a dashing young Hopkins physician by the name of Jesse William Lazear (Class of 1889).
The year was 1900, and American troops were stationed in Cuba. The Spanish-American War had ended, but in some regions an enemy greater than the Spanish still threatened to vanquish American troops: yellow fever. The deadly scourge, which damages the liver, caused its victims to suffer infernal fevers, become jaundiced, and vomit black bile. No cure was available, and "yellow jack" had claimed thousands of lives throughout North and South America.
While scientists now know that mosquitoes transmit the virus that causes yellow fever, in the 19th century no microscope could detect viruses. A Cuban physician named Carlos Finlay had proposed that mosquitoes were the vector, but many scientists dismissed his idea. The popular notion was that the illness was contagious, caught from victims' bedclothes, blood, or vomit.
Spurred to protect American troops as much as to solve a medical riddle, U.S. Army Surgeon General George Sternberg recommended that a team of scientists headed by Army surgeon Walter Reed pin down yellow fever's source. In addition to Reed and Lazear, the commission included physicians Aristides Agramonte and James Carroll. (Carroll and Reed had met while studying bacteriology under William Henry Welch at Hopkins Hospital.) They gathered in June 1900 at a military camp eight miles outside Havana.
The commission first hunted for a bacterial source of yellow fever and, failing to find one, decided to pursue the theory that mosquitoes were the vector. They pledged to use themselves as test subjects.
Lazear, Carroll, and an enlisted volunteer named James Dean allowed mosquitoes that had fed upon hospitalized yellow fever patients to be placed on their arms. Agramonte did not participate because he was thought to be immune to yellow fever. As for Reed, while he has been credited with solving the yellow fever puzzle, he in fact left Cuba shortly before he was scheduled to participate in the experiment. According to some accounts, he was ordered back to the U.S. by Surgeon General Sternberg. But no record of that order has been found. Some historians believe that Reed could have postponed his trip and stayed for the experiment.
Carroll contracted yellow fever and recovered, as did Dean. Lazear, however, died of the disease on September 28, 1900. Ironically, he apparently became ill not as a result of the group experiment, but from a mosquito bite he received afterward.
The commission's self-experiments led to more extensive tests involving larger numbers of volunteers. Within a few years, the scientific community accepted the fact that mosquitoes were indeed the vector and implemented mosquito control measures that have drastically reduced the incidence of yellow fever. In the words of A. McGehee Harvey, Hopkins Distinguished Service Professor emeritus of medicine, the commission's work was "one of the greatest achievements of medical science."
Today, in a busy corridor at Johns Hopkins Hospital, a brass plaque embossed with gold lettering commemorates Jesse Lazear. The inscription includes the following: "With more than the courage and devotion of the soldier, he risked and lost his life to show how a fearful pestilence is communicated and how its ravages may be prevented."
For decades following the experiments, the story lived on, says Susan Lederer '77, an associate professor in the Department of Humanities at Pennsylvania State University College of Medicine in Hershey. In her book Subjected to Science (The Johns Hopkins University Press, 1995), she has written about the furor of popularization that followed. For example, in his best-seller Microbe Hunters (1926), Paul De Kruif portrayed Lazear et al. as brave beyond description. A Broadway play starring Jimmy Stewart and a Hollywood film featuring Robert Montgomery soon followed.
This kind of hero worship was common during the first half of the century, says Lederer, who points to a variety of press accounts that highlight the fearlessness of various self-experimenters. In 1936, for instance, Newsweek extolled Cambridge physiologist Sir Joseph Barcroft for his "deliberate flirtations with death." Barcroft apparently wanted to see how the central nervous system responds to cold. So, wrote Newsweek, "for 30 minutes to an hour he would lie naked on a couch in the sub-freezing temperature and record subjective reactions."
Lederer contends that medical scientists used stories about such "heroic" acts to defend themselves against criticism from antivivisectionist groups--a defense they needed. The Antivivisection Society and its many allies flourished in the first part of the century, charging medical scientists with wanton exploitation, carelessness, and abuse of human and animal subjects. Antivivisectionist and novelist Elizabeth Stuart Phelps Ward railed that "our medical men are becoming a race of carvers," Lederer reports in Subjected to Science. "Dog or man, cat or baby, it does not matter so much--the fashion is to slice."
Some antivivisectionist arguments were based in fact. Into the 1950s (and in some cases longer), physicians had experimented on prisoners, orphans, soldiers, and even newborn infants without the full consent of the subjects, or by deceiving subjects about the purpose of the intervention. Some people had even been inoculated with infectious agents such as cholera or syphilis.
Some antivivisectionists, however, were ideologically opposed to all experimentation, and to the extent that lawmakers believed them, they threatened to curtail medical research in general.
The self-sacrifices of researchers like Curie, the yellow fever scientists, and others, then, became the perfect public relations weapon to lob at groups like the Tailwaggers Association of America, a society of dog lovers with an antivivisection stance, which was presided over by Bette Davis. California medical schools even convinced Metro-Goldwyn-Mayer to produce two short films aimed at ensuring the continued sale of pound animals to medical institutions. The shorts featured actors who had portrayed doctors in feature-length films; one also included a portrait of Jesse Lazear.
In the 1950s, scientists organized the Walter Reed Society for scientists who had served as subjects in experiments, says Lederer. The application to the now-defunct organization asked candidates to describe their experiences "in simple, non-technical language... Be as colorful, dramatic and specific as possible." The wording was no coincidence, says Lederer. "The Society wanted ammunition against the animal activists."
"The ability to point to an investigator's willingness 'to offer himself as a sacrifice' enabled [scientists] to take the moral high ground and to earn public support," Lederer writes. She says, "If people hear, 'Well, I used myself as a test subject first,' they'll be more likely to say, 'Okay, use me or use my dog.'"
In 1952, a 5-year-old boy named Darrell Salk (MD '74) received an injection of a new experimental polio vaccine. His brothers, Peter (MD '69) and Jonathan, received the vaccine as well, from their father, Jonas Salk, who had developed it.
"I was a kid getting an injection because it was going to be good for me because my parents said it was going to be good for me," says Darrell Salk, who is now a biotech consultant in Seattle. The specter of polio loomed large. Polio wards were filled with children in rocking beds and iron lungs.
While use of an unlicensed vaccine would be considered human experimentation today, Salk continues, it is not fair to judge those days by today's ethical standards. "At the time, experiments were designed using prisoners and children in institutions," he says. "Informed consent had not been developed as a requirement."
Many aspects of medical research were different, he notes. "The large-scale field trial in 1954 for the inactivated polio vaccine would be very difficult to do now--the speed at which it was accomplished, the ability to make decisions and changes, and to get the vaccine manufactured in a short period of time. Self-experimentation occurs in a very different context now than it did in the 1950s."
It is true that the ethical aspects of experimentation are inspected more closely today. Investigators are bound by more rules regarding their research, and are at greater risk for product liability suits. So if they try experiments on themselves without going through proper channels, they jeopardize not only their health but their careers. They also risk losing their health insurance or workers' compensation eligibility, should something go wrong. What seemed simple, straightforward, and heroic in the past might today be viewed as unethical, rash--even foolhardy.
Science itself is more complicated, notes Bayview's Gary Briefel. Whereas a century ago scientists were trying to understand basic human physiology, today they are asking more complicated questions, he says, and the studies required to answer those questions generally involve more than one subject. Experiments involve rigorous controls and carefully selected volunteers who meet certain physical and mental requirements. Many self-experiments would not meet today's criteria, if only because they are too limited. For example, although a scientist might demonstrate that a new drug does him no harm, he may possess an unusual tolerance to the drug.
Even if the risks are minimal, today's scientists still face a labyrinthine approval process. For example, Bayview psychopharmacologist Roland Griffiths risked only a headache and fatigue when he, along with six of his colleagues, ventured to stop drinking coffee as part of a study on caffeine dependence. The experiments meant abstaining from coffee, colas, and chocolate, and on certain days ingesting capsules containing caffeine doses or placebo. Yet, although the risks looked minute--who would think that eliminating a morning cup of coffee could become an issue?--the questions raised by the IRB were complex and involved.
The board wanted to know whether the self-experimenters would be able to be objective, and it was concerned that people might be coerced into participating. For example, a senior investigator might lean on a junior faculty member to participate, hinting that volunteering would lead to a promotion. The IRB also discussed whether authorship on a research paper resulting from the study would be scientifically appropriate or might be construed as over-compensating a participant.
"After many memos and head-scratchings and meetings, we got approval," says Griffiths.
The studies revealed that even a fraction of the amount of caffeine in a cup of coffee alters mood, and that abruptly quitting even one daily cup of coffee can produce withdrawal symptoms. Incidentally, adds Griffiths, "these turned out to be several of the most interesting, fun, and personally illuminating studies I've ever conducted."
So a U.S. president can't be a philanderer anymore, and neither can a scientist be a spontaneous self-experimenter. But it's not only scientists and review boards who have changed that. The public and the press overall are more critical and skeptical.
Today, reputable magazines and newspapers would not deign to report scientific adventures such as Barcroft's refrigerator capers, unless they first had the assurance that his experiments had been published in a peer-reviewed journal or reported at a scientific conference. If they do report self-experiments, they couch the story in skepticism. Witness the Biosphere project, in which a group of men and women are living and farming in an enclosed self-sustaining bubble. Many news stories on Biosphere have questioned the project's scientific merits and insinuated that its goals are closer to science fiction.
With the increased scrutiny, how much self-experimentation is going on anymore? According to the official record, very little. But several researchers who have tried it suggest differently. "I think you would be very surprised by how many researchers are doing this," says Giovanella.
Self-experimentation will probably continue as long as science continues to attract people who are inquisitive. As one biologist says, "These are fascinating things, and people get fascinated themselves."
Melissa Hendricks is the magazine's senior science writer.