
Lab manipulations of COVID virus fall under murky government rules

NIH / AP

This 2020 electron microscope image made available by the National Institute of Allergy and Infectious Diseases shows a Novel Coronavirus SARS-CoV-2 particle isolated from a patient, in a laboratory in Fort Detrick, Md.

Scientists at Boston University came under fire this week for an experiment in which they tinkered with the COVID-19 virus. Breathless headlines claimed they had created a deadly new strain, and the National Institutes of Health rebuked the university for not seeking the government’s permission.

As it turned out, the experiments, performed on mice, were not what the inflammatory media coverage suggested. The manipulated virus strain was actually less lethal than the original.

But the uproar highlighted shortcomings in how the U.S. government regulates research on pathogens that pose a risk, however small, of setting off a pandemic. It revealed loopholes that allow experiments to go unnoticed, a lack of transparency about how the risk of experiments is judged and a seemingly haphazard pattern in the federal government’s oversight policy, known as the P3CO framework.

Even as the government publicly reprimanded Boston University, it raised no red flags publicly about several other experiments it funded in which researchers manipulated coronaviruses in similar ways. One of them was carried out by the government’s own scientists.

The Boston episode “certainly tells us the P3CO framework needs to be overhauled pretty dramatically,” said Angela Rasmussen, a virologist at the Vaccine and Infectious Disease Organization at the University of Saskatchewan in Canada. “The whole process is kind of a black box that makes it really difficult for researchers.”

The NIH said that every study it considers for funding is vetted for safety concerns by agency experts, who decide whether to escalate it to a higher-level dangerous pathogen committee.

Some experiments, though, either because they are conceived later on or because they do not rely directly on federal funds, end up falling outside the scope of that process, leading to confusion, biosafety experts said. And the rules could be overhauled soon. After months of meetings, a committee of government advisers is expected to deliver updated recommendations for such research by December or January, the agency said.

Evolving Rules

The government’s policy for such experiments, the Potential Pandemic Pathogen Care and Oversight (P3CO) framework, was established five years ago in response to a set of contentious experiments in which researchers set out to transform an influenza virus that infected birds into one that could infect mammals.

Under the policy, the NIH and other agencies are supposed to flag grant applications for experiments that could set off a new pandemic. Risky research may be denied funding or allowed to proceed only with extra safety measures.

Critics of P3CO have complained that this evaluation happens largely in secret and ignores projects that aren’t funded by the U.S. government. In January 2020, the government’s advisory panel, the National Science Advisory Board for Biosecurity, held a public meeting to discuss reforms. But subsequent meetings were canceled, ironically enough, because of COVID’s arrival.

In the months that followed, Republican politicians attacked the NIH for supporting past research on coronaviruses at the Wuhan Institute of Virology, suggesting that a lab leak there might have been responsible for the pandemic. (In July, Rasmussen and other scientists published studies pointing instead to a market in Wuhan as the origin.)

Under this growing scrutiny, the NIH’s advisory board met in February, worked on new recommendations over the summer and released a draft last month. It proposed expanding the scope of pathogens that can prompt a review beyond those that have a high fatality rate. Unlike smallpox or Ebola, COVID has a low fatality rate but is so contagious that it still wreaked global devastation.

In its ongoing discussions, the board has also considered the risk posed by computer software, such as programs that could figure out how to make a pathogen spread faster.

Researchers had mixed reactions to the new guidelines.

“The first draft makes some important advances and leaves a lot of things unaddressed,” said Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health who has been pushing for tighter rules since the bird flu experiments more than a decade ago.

In comments submitted to the advisory board last month, Lipsitch and his colleagues said that proposed experiments must be justified by real, practical benefits rather than unsupported claims.

Other scientists, while welcoming clearer guidance, worried about onerous regulations that would bog down commonplace and innocuous experiments.

“Tell us what paperwork we need to fill out so we can do our jobs, which is to help the public respond to these types of things when they come at us,” said Robert F. Garry, Jr., a virologist at Tulane University.

Boston Experiments

The ambiguity of the government’s policy was laid bare this week when news broke about the experiments at Boston University.

Mohsan Saeed, a virologist at the university, and his colleagues posted a report online aiming to understand the differences between omicron and other variants. The researchers made a new virus that was identical to the original version but carried an omicron spike. They then put the modified virus into a strain of mice that is very sensitive to COVID and widely used to study the disease.

Previous research had found that the original strain of COVID killed all of the mice. The new study found that the modified virus was less deadly, killing 80%.

Last Sunday, a story ran in The Daily Mail with a headline claiming that “scientists have created a new deadly COVID strain with an 80% kill rate.” The following day, an NIH official, Emily Erbelding, told the news site Stat that Boston University should have discussed the experiments with the agency ahead of time.

But, some researchers pointed out, the federal guidance is vague on what disclosures are required after a research proposal is approved. Science often takes unexpected turns, and officials do not generally apply the guidance to experiments that are conceived after funding has been granted.

“The government should be providing the guidance to help people figure this out,” said Gregory Koblentz, a biodefense specialist at George Mason University.

In a statement to The New York Times, Boston University said that the experiments were approved by its own safety committee as well as the Boston Public Health Commission.

The university also said its scientists were not obligated to notify the NIH because, although they had received government funding for related research, they used university funds to pay for the experiments in question. The agency said it is reviewing the matter.

The highly publicized dispute over technical laboratory protocols sent mixed messages to the scientific community and the public, said Syra Madad, an infectious disease epidemiologist at NYC Health + Hospitals.

“It seems like an epic communication failure,” said Madad, who is also on the National Science Advisory Board for Biosecurity. “This is why we’re revisiting the policy — to make sure that it’s clear, it’s transparent, it makes sense and it is operationally feasible.”

Madad and other experts agreed that the proposal for the Boston University experiments should have gone through a more rigorous evaluation. “In my opinion, that certainly looks like it meets the criteria for P3CO review,” she said.

But even if the study had gone through that process, some scientists said, it would have likely been given the green light.

Jesse Bloom, a virologist at the Fred Hutchinson Cancer Research Center, noted that the coronavirus is already rampant among humans and has evolved far beyond the variants used in the experiment. The hybrid lab virus would be unlikely to cause a serious threat if it escaped.

“I understand why it worries people because you are making a virus for which you can’t totally predict the properties,” Bloom said. “But this does not seem to me to be a particularly high risk.”

Similar Studies

The NIH’s stern public statements about Boston University’s research raised questions about the way it and other health agencies had assessed such experiments in the past. Last month, scientists with the Food and Drug Administration published a study in which they, like the Boston team, injected mice with coronaviruses engineered to carry an omicron spike.

The FDA is required to follow the P3CO rules. But the agency said in a statement that the hybrid virus created as part of its study did not amount to “a new version of the virus.” The study did not fall under the dangerous pathogen guidelines, the statement said, because “we set out to understand how the virus works, not identify new ways to make it more potent.”

Some independent experts said the agency’s rationale did not explain why the study passed muster: An experiment cannot bypass the approval process simply because the researchers did not intend to make a more dangerous virus.

“If it’s research that could be anticipated to possibly result in the enhancement of a potential pandemic pathogen — a more transmissible and/or virulent strain than exists in nature — it needs to be reviewed. Period,” Dr. Tom Inglesby, the director of the Johns Hopkins Center for Health Security at the Bloomberg School of Public Health, said in an email.

The FDA researchers are not the only American scientists to tinker with coronaviruses in this manner. At the University of Texas Medical Branch in Galveston, scientists have relied partly on federal funding for studies on whether vaccines generate protection against coronaviruses altered to carry omicron spikes.

Those techniques can save scientists months of waiting for samples of omicron viruses from human patients, allowing them to study the dangers of new variants and anticipate the need for booster shots. Outside experts said the Texas experiments were even less risky than the Boston study because they generally infected cells, not live animals, with the viruses.

While proposals from the Texas team would have been reviewed by the NIH, they were not escalated to the dangerous pathogen committee. The agency did not say why. (The NIH has said that since 2017, only three studies it proposed to fund have been reviewed by that committee.)

“There is really no one in charge of scanning the medical literature, and it can be random events that bring these particular experiments to public attention,” Inglesby said. “And it shouldn’t be that way.”

Others raised a different problem: Research that isn’t funded by the government does not have to follow the government’s rules.

“I think that ultimately we would all agree that publishing a policy that would be broadly applicable would be ideal,” said Karmella Haynes, a biomedical engineer at Emory University and a member of the National Science Advisory Board for Biosecurity. “Now how to actually enforce that, I think, is beyond our charge.”

One possibility might be to come up with a policy modeled on the Federal Select Agent Program, which requires anyone seeking to work with certain dangerous substances, such as anthrax, to register with the government.

“Any recommendation that does not include codifying the requirements in regulations with the force of law will not add up to anything,” said Richard Ebright, a molecular biologist at Rutgers University.

Federal officials, he added, may be under pressure to strengthen oversight next year if Republican proponents of a crackdown win power in the midterm elections in November.

On the other hand, a politically fractious debate could put better regulations even further out of reach, some said.

“I worry about inhibiting our ability to understand these viruses that have killed millions of people,” said Gigi Gronvall, a biosafety specialist at the Johns Hopkins Bloomberg School of Public Health.


This article originally appeared in The New York Times.


© 2022 The New York Times Company
