A case of the flu

By the time you read this, the papers may be out and the moratorium may be over, though I hope not. (I'm not speaking of a moratorium on debates among Republican presidential candidates - there isn't one, although God knows we need it.) I'm referring to the 60-day moratorium, announced on 20 January of this year, on certain types of experiments on the genome of the influenza virus. Thirty-nine prominent flu researchers agreed to this voluntary suspension of research following disclosure of the results of new studies on a potentially deadly strain of influenza - studies that a number of scientists, laypeople, and public officials feared could be used to create a powerful bioweapon. That moratorium included a temporary embargo on the publication of certain experimental details of those studies, but on Friday 17 February, the World Health Organization (WHO), which sponsored the work, announced that the full details would be published within a few months.

To understand how this came about, it's useful to review the modern history of US government regulation of scientific research and its publication. It begins with a meeting of molecular biologists at the Asilomar Conference Center in Pacific Grove, California, in 1975. This rustic setting has served as the site for many scientific symposia over the years, but the meeting in 1975 was convened by (as Robert Sinsheimer, one of the organizers, put it) "a bunch of academics - focused, idealistic, and often naïve - trying to do good, struggling to reconcile our conflicts, our apprehensions, our scientific ambitions, our careers, our sometimes murky sense of obligation and emerge with a practical resolution." As I wrote 10 years ago, in a Genome Biology column entitled "An Asilomar Moment", the resolution they were trying to reach was how to move forward safely with the then newly developed recombinant DNA technology.

The earliest reports of techniques that allowed foreign genes to be expressed in bacteria had raised a chorus of alarms, from both professional Luddites and concerned citizens. Some molecular biologists themselves were also worried that they might accidentally produce dangerous microbes. Many more were worried that the growing fear would cause the government to prohibit recombinant DNA experiments altogether. And so a number of them met at Asilomar to figure out what to do, pledging to refrain voluntarily from such experiments until a consensus was reached on how to do them safely. The meeting was attended by both scientists and members of the press - a clever move, because it guaranteed that the scientists would be able to make their case directly to the public. Over three-and-a-half days, the group of about 150, which included most of the leaders in the emerging field, debated the risks, known and unknown, of cloning and manipulating foreign genes and expressing them in bacteria. The meeting ended with a series of resolutions that set forth guidelines for the safe conduct of recombinant DNA experiments. The resolutions were given force by making compliance a condition of federal funding for any such research. Sinsheimer later said that this result was "a middle ground...too restrictive for some, insufficiently restrictive for others...but Asilomar surely helped in many ways to launch the complex world of biotechnology we know today." The Asilomar resolutions headed off any draconian - and possibly misguided - regulation by the government, and reassured the public that the biologists not only would police themselves, but also would make public safety a key concern in future research.

And that was pretty much the story until the fall of 2001, when the whole world changed. After the terrorist attacks on 11 September, a series of deaths from anthrax-laced letters led to a wave of concern about bioterrorism, and Congress created the National Science Advisory Board for Biosecurity (NSABB). Among other tasks, the Board was charged with overseeing issues arising from the publication of biological research that could conceivably be misused by terrorists. Although the overarching philosophy is that information should be freely available unless there is a high probability of danger from its dissemination, the Board can recommend to the US government, typically through the National Institutes of Health, a partial or total embargo on any publication. The government agency then asks the relevant journal(s) to comply with that request. Note that there is no explicit requirement that the journals do so. The NSABB hotly debated publication of the sequence of the 1918 strain of the flu several years ago, but suggested no redaction. As far as I know, the first time it has recommended redaction was this past December, when, in a highly publicized statement, the NSABB called on the journals Nature and Science to censor the publication of two papers dealing with, in effect, the first steps toward what could be termed the weaponization of the H5N1 strain of influenza virus.

Influenza strains are named according to the particular variants of two proteins on the viral surface: the hemagglutinin (H) and the neuraminidase (N). The 1918 strain that killed at least 20 million people worldwide was H1N1, as is the (thankfully) milder version circulating this winter. Most seasonal flu strains have low mortality rates and are dangerous primarily to the elderly and infirm. H5N1, also known as avian flu or bird flu, is a different beast altogether. It is not easily transmissible from birds to people, and apparently not transmissible person-to-person, but when it does infect a human being, it often kills: almost two-thirds of the 570 recorded human cases requiring hospitalization have been fatal (for comparison, the case-fatality rate of the most virulent strain of smallpox is around 30%). There is considerable controversy over the true case-fatality rate for H5N1, because no one is certain how many symptomless infections there are, but there seems little doubt that, relative to other flu strains at least, H5N1 is unusually pathogenic in humans.

The two papers in question - one from a group headed by Ron Fouchier of the Erasmus Medical Center in Rotterdam, the other from Yoshihiro Kawaoka's lab at the University of Wisconsin - described how a small number of amino acid substitutions in H5N1 proteins could produce a strain that was transmissible between people through the air (the workers actually showed airborne transmission between susceptible ferrets housed in adjacent cages, but when it comes to influenza, ferrets are pretty good surrogates for humans). Complete experimental details were given in the submitted manuscripts, including the specific substitutions. The NSABB argued that such details were too useful for would-be terrorists and should be disseminated only among carefully vetted flu researchers; the broad conclusions were deemed safe for publication.

After some debate, the studies' authors agreed to the censoring of the information, and joined with a number of their colleagues in calling for the abovementioned 60-day moratorium on such research. The stated purpose of the moratorium was to give scientific organizations and governments time to formulate policies regarding these and similar experiments. Then, on 16-17 February, a group of 22 experts from around the globe met at WHO headquarters in Geneva (I guess the accommodation there is nicer than at Asilomar) to discuss the matter, and in the end most of the attendees agreed that the hypothetical risk of the data being used by terrorists as part of a program to weaponize flu was outweighed by the need to understand how highly virulent strains of flu might emerge in the wild, and to share information that could be used to recognize when such a strain was beginning to develop. (Interestingly, the US delegation was not part of this consensus: they agreed with the NSABB recommendation, and wanted the work published in redacted form.)

Many scientists are concerned that any censoring of scientific publications constitutes the slipperiest of slopes. Censorship is the first power that totalitarian regimes seek to acquire, and the scissors, in the memorable phrase of Leslie Charteris, can easily grow large enough to snip off heads. Other researchers assert that it's impossible to restrict the flow of information in this internet-dominated age, so why even try? And then there is the argument made by the WHO group - also made by the papers' authors themselves - that it is important to publish the results so that the appearance of such a lethal strain can be identified early and effective vaccines and drugs against it can be developed in advance. On the other hand, there is the simple argument that the risks of such research vastly outweigh the benefits, and that it is foolish, to say the least, to make a would-be terrorist's job so much easier.

My own view is that even these arguments miss the point. Asilomar should have taught us that limited restrictions we as a community devise ourselves are far better than an assertion of freedom that is likely to be met with stiff governmental regulation or the erosion of public trust in science. Ignoring the public's concern, even when it might be overblown, also smacks of arrogance and ivory-tower blindness - attitudes certain to lead to calls for severe government controls. We may not enjoy policing ourselves, but we're far better off when we do.

How realistic are those concerns? To be honest, I'm not certain that the work in question is going to be that interesting to terrorists either - at least, not to the better-known organizations. They want to control the world, not obliterate it, and they tend to prefer technologically simple, chemical-based weapons that can be targeted to specific populations and institutions. One does need to worry about doomsday cults whose objective is to bring about the end of the world. They are not numerous, but they can be lethal, as in the case of Aum Shinrikyo, the Japanese group responsible for the release of the nerve gas sarin in the Tokyo subway in 1995. Before that chemical attack, the cult is known to have dispersed aerosols of anthrax and botulinum toxin throughout Tokyo on at least eight occasions. Unlike the sarin attack, the biological attacks failed to produce any illness, but the reasons for this failure are, troublingly, still unclear.

Furthermore, in considering the effectiveness of restricting access to information, it's also important to remember that big crime, including terrorism, is often an inside job. Recall that the anthrax letters were sent by a trusted bioweapons expert who worked for the US government. He would almost certainly have been given access to the details of the H5N1 experiments had he asked for them.

Consider, too, that the US has pretty much cornered the market on vaccine against the H5N1 strain, owing to fears by the Bush administration that an outbreak might be imminent in 2005. So if you're a terrorist bent on destroying the US, would it make much sense to use a biological weapon that your primary target is already prepared for?

Moreover, from the perspective of a sometime structural biologist, the whole discussion seems rather naïve about protein structure and function. It is well established that many different amino acid combinations can produce the same protein fold with similar physicochemical properties. The likelihood that one set of substitutions obtained in the laboratory represents the only way - or even the most probable way - for H5N1 to become transmissible between humans by air when the virus evolves in the wild is, I would think, extremely small. The probability that watching for those specific mutations would alert us to the outbreak of such a strain is therefore also small.

As you see, one goes back and forth about an issue this complicated. Yet, in the end, I keep coming back to two things. The first is that, when it comes to public discourse, perception is at least as important as reality, and we ignore it at our peril. Public perception of science has shifted in recent decades from one of the scientist as savior to one in which we are seen by many as self-absorbed glory-hounds, more interested in the pursuit of discovery and the rewards it brings than in the consequences our work may have for the well-being of society. What Frankenstein didn't do to our reputation, the atomic bomb and years of antiscience propaganda from certain politicians and religious leaders have done - and if you doubt me, watch any science fiction movie or monster/disaster film.

And those movies, while ridiculous, do have a sort of point to make: in the eyes of many scientists, there is something almost erotic about dangerous work, not to mention the near certainty that such work will be published in the sexiest science journals.

So I think the moratorium was a good thing, overall, and I would have recommended that it be continued even longer - and that the papers be published without the sensitive information - until we as a community find ways to reassure the public not only that we can police ourselves but that we should. It's the best strategy for protecting our right to inquire, our freedom to explore, and our ability to communicate with each other the results of the work that we do. If that means we sometimes have to err on the side of caution, I don't think the price is, at present anyway, too high to pay. As geneticist Janet Westpheling so eloquently put it, the only rules that work are the ones scientists honestly believe in as necessary and are willing to enforce themselves.

The second thing is that we are human beings and it's silly, and possibly wrong, to pretend that we can, or even should, conduct our professional lives in a moral vacuum. We should always question the purpose and ultimate result of what we do, before others ask those questions for us.

Considering all these points, I must say that the argument that the possible harm to the public perception of science from work like this probably outweighs any hypothetical benefit from freely disseminating the information strikes me as a pretty good one, and one I have yet to see refuted. But it also seems to me, in light of the arguments I have made here, that this debate should not be about whether or not to publish (which is probably moot, given that the information has already been disseminated so widely and no terrorist is likely to want it). The issue we as a community should be debating is whether this particular experiment should have been performed at all.

My final take on the flu controversy? The H5N1 researchers might have been within their rights to carry out the work they did, and might even have been within their rights to publish it. But I wouldn't have done either of these things.

(Note: I have benefitted from a number of insightful comments and critiques on the draft of this column from well-informed and highly placed friends. Those facts and ideas that are valid and valuable are due in large measure to their input, for which I am very grateful. They are in no way responsible for anything that might be incorrect or silly - those are on me.)

Correspondence to Gregory A Petsko.

Petsko, G.A. A case of the flu. Genome Biol 13, 146 (2012). https://doi.org/10.1186/gb-2012-13-2-146