On July 8, the Centers for Disease Control and Prevention announced that vials containing smallpox had been discovered in a cold storage room at a National Institutes of Health facility in Maryland that was not authorized to hold the virus. The disease is so infectious and dangerous that an international agreement mandates that only two labs in the world may possess it.
Nobody could say what the smallpox was doing in Maryland or how it had evaded detection for so many years. Just a month earlier, the CDC had revealed another alarming breach of protocol, announcing that it might have accidentally exposed more than 75 employees to anthrax.
Fortunately, it seems that no one was inadvertently infected during either of those events. But both those potential exposures reveal one important thing: Even at the safest, most secure labs in the country, something can go wrong.
We’re lucky, some researchers say, that these slip-ups didn’t happen with a flu virus — like the one that killed roughly 50 million people in 1918.
That strain of influenza, and others like it, are being actively studied at multiple labs in the U.S. and around the world. A mistake at any one of those labs, some researchers say, would be a real disaster.
If a researcher had caught anthrax, that would have been tragic — but it almost certainly wouldn’t have spread further, as anthrax is generally not considered contagious. Smallpox is more contagious, but with vaccination and antiviral drugs, even a smallpox release could be contained much more easily than an accidental influenza release, according to Michael Osterholm, the director of the University of Minnesota’s Center for Infectious Disease Research and Policy.
Flu viruses that are both deadly and highly transmissible are not only being studied but are also being created in labs with increasing frequency.
The 1918 H1N1 influenza virus was responsible for the deadliest pandemic in modern human history. That same virus was re-created in a lab in 2005 in order to understand what made the pandemic so deadly.
At the time, critics argued that re-creating the virus and making its genetic sequence available online would make it more likely that other labs would copy the virus or engineer similar influenza pathogens, increasing the number of sites where an accidental release could occur.
No such accident has occurred, but the critics’ prediction has proved correct: More researchers have indeed begun to engineer similar deadly influenza viruses, according to Marc Lipsitch, a professor of epidemiology and director of the Center for Communicable Disease Dynamics at the Harvard School of Public Health.
“Most of the labs that are doing this are among the very safest labs in the world,” Lipsitch told Business Insider. “But so is the biosafety level three lab at the CDC where they were doing the anthrax work. Safety isn’t all about machines or ventilation, it’s also about human judgment.” Because of that, he argued, even the strictest containment protocols are vulnerable to human error.
And as more institutions in more countries begin similar research, there’s no guarantee that the same high containment standards will be followed.
The Value of Dangerous Research
Lipsitch’s argument is not that this research won’t reveal important science — all experiments can provide new insight, both in terms of what researchers are looking for and in what they discover that they weren’t looking for.
Instead, he says the rationale used to create these influenza pathogens doesn’t justify the risk incurred by creating them.
In the past, we’ve been able to create vaccines that protect against existing influenza viruses without first creating the viruses themselves, so there’s no indication that developing a new virus is necessary to improve vaccine research.
As for genetic sequencing, some of these viruses have had their genomes sequenced already, and Lipsitch argues we are no better prepared to stop a pandemic as a result. In the case of active viruses currently circulating in the world, like H7N9, genetic sequencing hasn’t led to action to stop their spread.
Some of the researchers creating these viruses say that doing so and having access to their genomes will help us understand how they become so transmissible and deadly. But even in cases where we can pinpoint some of the genes involved in transmission, we can’t necessarily predict what that information means for viruses in the wild. We’re still learning how to interpret the thousands of characters that make up the genetic sequence for any virus.
Of particular concern to critics of flu research are “gain of function” studies, where researchers take deadly new influenza viruses that people don’t have immunity to and make them even more contagious.
Studies of this nature are currently happening in the Netherlands and in Wisconsin. The scientists creating these viruses say that this research is essential for understanding how viruses adapt to become transmissible in mammals.
A Worrisome History
Incidents similar to the anthrax one — and worse — have happened before.
In 2004, a Maryland lab sent live anthrax to a California children’s hospital. In a New York Times op-ed on the risks of some current flu studies, Lipsitch points out that “between 2003 and 2009, there were 395 ‘potential release events’ and 66 ‘potential loss events’ in American labs involving select agents, a category that includes many of the most lethal bacteria and viruses.”
The possibility that such mistakes could lead to widespread infection is not just hypothetical. In fact, the best current explanation for a 1977 H1N1 outbreak in China and Russia is that the virus escaped from a lab.
As recent news events demonstrate, sometimes human error can defeat the most stringent safety protocols. Even though it’s extremely unlikely, an accident could happen with a lab-grown flu virus, too.
If it does, we’ll have to ask ourselves if it was worth it.