One of two now-notorious bird flu studies was published on Wednesday, four months after a panel of U.S. federal advisers asked researchers and scientific journals to hold off, just in case the information in the papers was dangerous.
In the paper, flu expert Yoshihiro Kawaoka of the University of Wisconsin-Madison describes how he genetically engineered H5N1 bird flu—the virus that’s been decimating poultry in Asia, Egypt, and elsewhere—using pieces of the virus that caused the 2009 pandemic of H1N1 swine flu in people.
The paper itself is interesting to scientists and flu junkies. Published in Nature, it shows that four genetic changes can make the usually hard-to-catch H5N1 virus spread more easily among ferrets—animals that acquire flu in much the same way that people do.
“After wanting to read it for so long, it was like eating again after fasting,” Vincent Racaniello, a virologist at Columbia University, is quoted by Nature’s news section as saying. “And it does not disappoint.”
But the paper has historical significance because it’s the first big test of scientific censorship by the federal government. A second paper, by flu expert Ron Fouchier of Erasmus Medical Center in Rotterdam, the Netherlands, is being reviewed for publication in the rival journal Science.
The National Science Advisory Board for Biosecurity, which advises the Health and Human Services Department, said in December that publishing the papers could threaten national security and public health. It said the decision was too big for the scientific community to make on its own, comparing it to the 1940s Manhattan Project. The scientists, both of whom got funding from the National Institutes of Health, were not bound by the NSABB’s request, but reluctantly agreed. “It’s the first time they have seen results where they determined there should be restrictions,” Dr. Amy Patterson, executive director of the NSABB, told National Journal.
The decision divided the scientific community, with some researchers arguing that any censorship was dangerous and a waste of time, and others saying the viruses are too dangerous to risk tipping off terrorists or rogue governments about ways to unleash a deadly mutant.
After plenty of back-and-forth, including an emergency meeting at the World Health Organization in February and many editorials, the committee said there was no need for the government to intervene this time.
Since it started spreading in 2003, H5N1 bird flu has killed 359 of the roughly 600 people it is known to have infected—a mortality rate of about 60 percent. That compares with a 2.5 percent fatality rate for the 1918 flu, which killed tens of millions of people, or 30 percent for smallpox before it was eradicated in 1979. Luckily, H5N1 doesn’t infect people easily, though it spreads rapidly through flocks of chickens. All flu viruses mutate, and most flu experts fear it is only a matter of time before H5N1 either evolves or swaps genes with another flu virus to make a form that can easily infect people.
Kawaoka and Fouchier have been taking different approaches to see what new mutants might be possible—and whether genetic changes making a virus easier to catch might make it less deadly.
The Kawaoka paper is published in Nature with an unusual risk-analysis questionnaire that the journal points out comes from an agency outside the U.S. government. “Are there potential risks to public health from application or utilization of this information? If so, please describe,” it asks.
“There is no doubt that this information could be used by an exceptionally competent laboratory to provide the foundation for a program to develop a pandemic strain of this virus. There is no evidence that this reassortant virus would be fully pathogenic in humans,” is the answer. “A highly competent laboratory could exploit this information immediately, although as pointed out above, this paper does not provide sufficient information to produce fully competent dangerous pathogen.”
For its part, Nature’s editorial board says it will not edit future papers, or limit their distribution, based on fears that bad guys might use the information.
“First, it was worth deliberating at length on the possibility of redacting the key findings of the paper instead of simply rejecting it. (Rejection has long been an option if Nature is advised by security experts that the risks of publication exceed the benefits),” reads a commentary. “There was also the option that the full paper might be distributed by some third party, to selected recipients only. Having now considered these matters in depth, the editors of this journal have decided that we will not consider either alternative for papers in Nature in the foreseeable future. A paper that omits key results or methods disables subsequent research and peer review,” it says.
“Furthermore, after much internal and external deliberation, we cannot imagine any mechanism or criterion by which to sensibly judge who should or should not be allowed to see the work. Nor do we believe that any restricted information distributed to university laboratories would stay confidential for long.”
Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, which paid for the research, says the policies on reviewing so-called dual-use studies—those that can be used for either good or evil—are not really limiting research. “I don’t think it is going to have an impact at all,” Fauci said in a recent interview.
“Out of 300 to 400 grants, we have just a handful of grants, really, that would rise to the level of needing additional scrutiny due to dual-use research. That doesn’t mean you can’t do the research—you just need to make sure you are meeting a checklist.”