Reforms aim to make medical research safer for its subjects
BALTIMORE » When University of Maryland School of Medicine researchers were looking for volunteers to test a vaccine for Ebola, which was killing thousands of people in West Africa, Andrea Buchwald raised her hand in Baltimore.
“Scientific curiosity,” the 29-year-old graduate research assistant in Maryland’s department of epidemiology said, explaining why — along with her trust in the system governing treatment of human subjects — she was willing to be experimented on.
“Consent for clinical trials is a very stringent process,” Buchwald added. “You’re expected to do your best to ensure your participants are fully informed and doing this of their own volition. Things have changed a lot since the 1940s.”
It was during that decade that hundreds of Guatemalans were infected with sexually transmitted diseases in the name of research — a horrific reminder of practices brought back to light last month when the Johns Hopkins University was sued for $1 billion by research subjects and their families for its role in approving federal funds for the study. Hopkins officials said the university didn’t develop or oversee the study and was not responsible.
The days when researchers used impoverished populations, prisoners, prostitutes, orphans and others as human guinea pigs are largely in the past, most would agree. In the most infamous case — the Tuskegee study that ran from the 1930s to the 1970s — black Alabama men with syphilis were left untreated so researchers could trace the terrible progression of the disease.
But even today, concerns arise periodically about the use of human subjects in clinical trials, especially as research institutions and pharmaceutical companies increasingly go abroad to test new drugs and vaccines in countries where oversight can be more lax.
In India, for example, a rash of reports several years ago of people dying during clinical trials, or being enrolled without proper consent, led to an uproar and a government crackdown on what had become a booming industry for the fast-developing country. While some say Indian officials overreacted, the events reflect the continuing unease with human experimentation.
“I think there is lingering fear and suspicion of research in many quarters,” said Dr. Daniel Kuritzkes, a Harvard virologist who had three research studies in India interrupted by the government’s scramble to enact new regulations.
“That’s unfortunate because for the most part, there has been worldwide adaptation of laws governing how human subjects are protected in research,” said Kuritzkes, who chairs the AIDS Clinical Trials Group, a National Institutes of Health program that conducts research in conjunction with institutions around the world. “Things are done very differently than in the past.”
It is hard to imagine studies such as those in Tuskegee or Guatemala happening today, and indeed, the outrage over them led to ever-stricter safeguards to protect human subjects.
The issue has particular resonance in Baltimore, where for all its renown, Johns Hopkins still contends with the suspicion that it has exploited African-Americans who live near its east-side campus.
As fantastical as it may sound, there are still Baltimoreans who grew up hearing that Hopkins doctors might snatch them off the street and experiment on them, a local legend that persists and gains new traction with cases like that of Henrietta Lacks. The Baltimore County woman died of cancer in 1951 at Hopkins, and — as was common practice at the time — tissue samples were taken without her consent and used to create a cell line that was important to medical advances such as the polio vaccine.
The lead paint studies of the 1990s, conducted by a Hopkins partner, Kennedy Krieger Institute, also have sparked controversy — and lawsuits from parents who say their children were harmed.
“I tell researchers, ‘Every time you enroll an East Baltimore participant in your study and you treat them well, you take another chip at that old rumor,’” said Liz Martinez, a nurse who serves as Hopkins’ research participant advocate. “They’ll talk about the experience and how they were treated and what they were able to contribute to medical knowledge, and how they were not grabbed off the street.”
That the position of research participant advocate even exists today is testament to the change in how experiments are conducted. These days, there is much greater scrutiny of research proposals involving human subjects.
Researchers who receive NIH funding, for example, must get the approval of their organization’s Institutional Review Board, which determines whether a proposed study protects human subjects, properly weighs the risks and benefits to them and can document that they provided informed consent.
At Hopkins Medicine alone, there are six such boards that meet on its Baltimore campuses weekly for three hours apiece to handle the volume of research. The boards approve about 1,800 new protocols a year and oversee about 6,100 trials, according to Hopkins.
But the boards operate in private — Hopkins officials would not allow reporters to observe a meeting for this article — so it can be difficult to assess their work.
“There are tremendous amounts of variability,” said Laura Stark, a professor at Vanderbilt University who wrote the 2011 book “Behind Closed Doors: IRBs and the Making of Ethical Research.”
“The ethical thing to do in one system may not be considered ethical in another system.”
The University of Minnesota, for example, announced this month that it is overhauling its review process amid criticism of its psychiatric research. Ethicists have pointed to incidents such as the 2004 suicide of a man with schizophrenia who was enrolled in a drug trial, and have questioned whether someone so ill could even give informed consent.
The Institutional Review Board system has its origins in a medical past when doctors had much freer rein. In the 1940s and 1950s, Stark writes, Mennonites, Quakers and other religious objectors to war were put in service to their country as research subjects for the NIH.
Some of them were marooned on what is now Roosevelt Island in New York so scientists could study the minimum amount of food and water shipwreck victims might need, Stark writes. Additionally, NIH researchers would drive from their campus to nearby Jessup or to Lorton, Va., to avail themselves of prisoners for studies, she writes.
Once the NIH started funding more research off its own campus, officials realized they would need a way to make sure those institutions followed certain standards — and to limit their own liability should something go wrong.
“Basically, the review system got tied to money: If you wanted the money, you had to have a review process,” Stark said. “Medical research is big money: Who’s paying when there’s a lawsuit?”
The research community has long acknowledged the need to protect human subjects. The first formal standard was the Nuremberg Code of 1947, which established the idea of consent in response to German physicians who experimented on prisoners during World War II. Laws and regulations governing researchers who receive federal dollars followed.
Then, in 1974, the U.S. passed the National Research Act to codify protections for research subjects. That led to the landmark Belmont Report, which spelled out principles of ethical treatment.
Dr. Christopher V. Plowe, a malaria researcher, said the rules were a “strong and appropriate reaction to Tuskegee, among others.”
A lot of current researchers began their careers after the rules were put in place and know no other way, said Plowe, the new director of the University of Maryland School of Medicine’s Institute for Global Health. Many researchers, like him, have even voluntarily strengthened the consent process overseas to address lingering distrust and ensure a study’s integrity.
Plowe said, for example, there are places where malaria, Ebola and cholera vaccines and treatments are tested, but illiteracy makes it difficult to obtain consent. He said researchers increasingly start by trying to gain the trust of village elders or others with influence, a process called “community permission to enter.”
In most cases, he said, researchers shouldn’t use disadvantaged populations overseas for studies that don’t benefit them. And there are other situations, such as with refugees, that require more scrutiny because potential subjects may feel coerced.
“We want to eliminate malaria in Myanmar, but there has to be community buy-in and political will,” said Plowe, who is working with local health professionals, doctors, the government and even military officials. “We can sit down and talk about health issues. Malaria, everyone agrees, is something we’d like to get rid of.”
Falguni Sen, who directs the Global Healthcare Innovation Management Center at Fordham University, says problems with medical trials conducted abroad can result from a cultural gap. A concept like informed consent may have no real parallel in areas of the developing world.
“What is it in us — we’re all doing things to improve human lives, we want to do good — what allows us to get into a Guatemala situation?” he said, referring to the STD experiments that led to the suit against Hopkins. “What it is, is we don’t fully understand our cultural differences.
“In the Third World, there is no universal health care, there is no culture of safety. … In the U.S. you can say a trial is voluntary, you have options, you can say no. But in the Third World, even if you read them their ‘Miranda rights,’ you apprise them of the risks, they have no options. They’re vulnerable because they have no other options for medical care.”
Drug companies and research institutions flocked to India in recent years, attracted by its large population, a wealth of well-trained and English-speaking doctors and the lower costs of conducting trials. But with that growth came concerns of exploitation and lack of oversight, which the government has been trying to address in recent years.
Kuritzkes, the Harvard researcher, and his colleagues had to make alterations to trials that were in progress when the Indian government mandated changes. One trial, of a tuberculosis drug, had to be completed in other countries without Indian participants, while the other two now appear to be back on track.
Research abroad remains necessary, he said, not just because it’s beneficial to researchers or drug companies.
“We’re trying to address problems in the countries we’re doing research in — what regimens are going to be available and effective in these parts of the world,” Kuritzkes said.
Sen believes there has been some positive change. “The public has become more aware that informed consent means something, that they’re taking a risk by enrolling in a trial, that they are protected and do not have to be intimidated.”
For all the improvement in protections for human subjects, there are those who say that laws and regulations have failed to keep up with changes in medicine and research.
Seema K. Shah, head of the NIH unit on international research ethics, said the last revision of regulations stemming from the research law came in 1991, before researchers used social media or could map the human genome, both of which raise privacy questions that remain unaddressed.
Japan, by contrast, revises its regulations every five years, said Shah, who is also on the faculty in the NIH Clinical Center Department of Bioethics.
“The goal in creating laws and ethical norms is to prevent the scandals of the past from happening in the future,” Shah said. “A lot of studies in the past prompted concern from the public and led to changes. But since there is no big crisis now, maybe some things aren’t being put into law.”
Issues raised by medical trials often make their way into the legal system, leading to questions of how to fairly compensate the injured. It’s unclear how many people are injured in studies in the United States because the federal government collects little data, and what it does collect is kept private.
But a study led by NIH investigators found that few U.S. research institutions offered unconditional compensation to those injured, and those policies didn’t change much between 2000 and 2012.
In another paper published in the British Medical Journal in 2013, an NIH researcher concluded that reliance on the legal system left research subjects unprotected. It also left U.S.-sponsored multinational research at risk of delays because other countries have more robust requirements for compensation and insurance. The paper called for a no-fault compensation system in the U.S., like one created to compensate people harmed by approved vaccines.
When tragedy strikes in the midst of a trial, the repercussions can extend beyond the issue of victim compensation and lead to institutionwide changes.
Hopkins was shaken when Ellen Roche, 24, died in 2001 after inhaling an experimental chemical during a study of how healthy people’s lungs defend against asthma attacks.
The federal government briefly suspended virtually all of the medical institution’s experiments involving human subjects, an astonishing blow to Hopkins, which perennially garners the most research dollars in the nation. While the trials eventually resumed, federal officials faulted the researcher for failing to get proper approval of the experimental drug or to disclose its risks to volunteers.
Martinez, the Hopkins research participant advocate who has worked at the institution for about 30 years, remembers that “painful” time.
“She was one of us,” Martinez said of Roche, who had worked as a lab tech at Hopkins’ asthma center.
The crisis led to changes that make Martinez feel research subjects are much better protected. Hopkins increased the number of review boards and created the research participant advocate position that Martinez has held for eight years.
Much of her day is spent consulting with researchers, to ensure their proposals meet patient safety and consent standards. She also fields calls from participants who may have concerns.
But even she acknowledges the limits of protecting every participant in every study.
“Participating in research is never perfectly safe,” she said. “It wouldn’t be research if there wasn’t risk. You can’t take it away. But you can improve it.”