Where is the borderline between freedom of inquiry and national security? Between the freedom to pursue – and publish – scientific results and the fear that these published results might be used to horrific effect? Or put another way, are there some lines of scientific inquiry that simply should not be undertaken at all because they are too dangerous, or that – once undertaken – should be withheld from publication?
The decision to develop nuclear weapons is a case in point – the need to develop these weapons seemed urgent, given indications that Germany was working towards the same end. Given the choice between a Nazi bomb and an Allied weapon, the United States, Canada, and Britain chose to pursue these weapons, with the results we all know. But after the war there was a choice – some Manhattan Project scientists felt that the safest way to use this knowledge was to make it freely available to everyone, while others (and the government) saw these weapons as a possible threat to our nation’s survival and chose to lock the information away.
In hindsight it seems likely that nuclear weapons did not affect who won the Second World War – the German nuclear weapons program never made much progress, and Japan would almost certainly have fallen with or without their use. The weapons almost certainly shortened the war, but they did not change the ultimate outcome. Given this, it is reasonable to ask whether our Manhattan Project scientists might have better served humanity by simply refusing to do the work requested, or by working but deliberately not solving the problems they faced.
Tempting as it is to speculate that, absent the success of the Manhattan Project, the world would not have known the threat of nuclear explosions, such speculation is likely wrong – by the mid-1940s all of the basic science and the fundamental concepts were well-known. That being the case – and especially given the tensions of the nascent Cold War – it is almost certain that both the United States and the Soviet Union would have developed nuclear weapons within a few years of the war’s end. In 1992, I was present at an interview that Edward Teller gave to a Columbus, Ohio television station (sorry, can’t remember which one) in which he justified his leading the effort to develop thermonuclear weapons by the fact that the Soviets were working on them. Teller opined that such weapons were bound to be built and that, this being the case (and given our opponent), it was far better for them to be developed first by the United States.
This decision may or may not have contributed to national security – we know that the Soviet Union embedded spies in the Manhattan Project during the development of both nuclear and thermonuclear weapons, and that Soviet bomb developments lagged the American weapons by only a few years. And the information on how to build nuclear weapons seeped out over the following decades – at present the big-picture details are hardly a secret, although the fine points remain closely held by the nuclear-capable nations.
For decades the world lived with the possibility of a civilization-ending nuclear war, and it is hard to know if sharing nuclear weapons technology freely would have changed this for the better or for the worse. But this is a moot point – we live in the world that was made over a half-century ago. On the other hand, in different fields, this question arises from time to time, with a possible impact that could be every bit as consequential as the decision to pursue nuclear weapons.
Consider, for example, recent research by Dutch virologist Ron Fouchier reported in the journal Science. Fouchier and his team tinkered with an avian flu virus – H5N1 – in a way that has the potential to make this already-deadly virus far more infectious. An article in the November 23, 2011 edition of ScienceInsider quotes Paul Keim, a microbial geneticist who chairs the National Science Advisory Board for Biosecurity, as saying that he “can’t think of another pathogenic organism that is as scary as this one…I don’t think anthrax is scary at all compared to this.”
So the question is what to do about research such as this – a question with tremendous ethical implications no matter how you look at it:
- Is some research just too dangerous to undertake – should we (can we) purposely stifle some lines of inquiry?
- Should scientists refrain from undertaking research that can have such potentially devastating consequences?
- Should governments restrict this line of work so that it only takes place under the control of government scientists (and do we trust the governments to be the sole repositories of the fruits of such work)?
- If this work is done, should the information be made freely available to everyone with a subscription to the scientific literature or should it remain locked up by governmental secrecy rules?
There is a good reason that the very First Amendment to the Constitution protects the freedom of speech – this freedom was considered essential to the proper functioning of any free society. In case after case the Supreme Court has upheld the freedom of speech, even when it includes burning the flag, publishing pornography, and contributing money to political campaigns. One can envision freedom of inquiry – including scientific inquiry – as another form of freedom of speech, and it seems reasonable to assume that this freedom of inquiry should be extended as far as possible. Obviously, the American Constitution cannot be applied to a Dutch citizen, but it does apply to the publishers of Science (an American journal), and it is worth discussing as a philosophical point in any event.
What is interesting is that, even in one of the most vital parts of our Constitution, the Supreme Court has carved out some exceptions. The most famous of these is that falsely shouting “Fire” in a crowded theater is not considered constitutionally protected free speech, just as some forms of hateful speech, or speech that incites others to violence, are similarly off limits – in other words, we are not free to speak if that speech serves only to put others at risk.
So let’s think more about not only Fouchier’s research but also about similar work that we have seen in the past and are likely to see increasingly in the coming years. Modifying a lethal virus to make it even more dangerous – is this the equivalent of shouting “Fire,” speech that serves no useful purpose other than to place people at risk, or is it something that should be protected?
Of course there is more to it than this – as with the science behind nuclear weapons, all of the basic science and techniques for modifying viruses are “out there,” and the number of people who know how to use this science grows every year. It is silly to think that stifling Fouchier’s work will put a global halt to this research. Chances are that, even if Fouchier (and others around the world working on related research) were to stop their work, it would simply be taken up by somebody else – possibly working more secretly. On the other hand, it is also possible that withholding publication of his results might delay this other work by several years, perhaps giving us the time to develop an effective vaccine against a bug that could kill many millions if it breaks out as an easily transmissible epidemic.
There is no easy answer to this question – for every argument in any direction there is a counter-argument. For example, we can argue that this work is too dangerous to be undertaken at all because the potential cost is too high; but we can counter-argue that this work will be undertaken by somebody. We can argue that publication should be suppressed to avoid giving a possible weapon to our enemies (the bad guys, of course!); but we can counter-argue that, given the near-inevitability of such research taking place, at least this way we can keep track of those working on it. Some of these issues are discussed in an interesting paper by Ronald Atlas, titled “Responsible Conduct by Life Scientists in an Age of Terrorism,” published in the March 3, 2006 issue of Science and Engineering Ethics, and in a companion paper (“The Dual-Use Dilemma for the Life Sciences: Perspectives, Conundrums, and Global Solutions,” published in the September 25, 2006 issue of Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science). In these papers Atlas suggests that such information should be freely available to the global scientific community, but urges a “culture of responsible conduct” on the part of life scientists to minimize the chances that such work might be “hijacked for hostile misuse” by terrorists or irresponsible and hostile nations.
Again – there is no easy answer. But given that this sort of topic seems to arise fairly regularly (the publication of the 1918 influenza genome and work done by Australian scientists Samantha Robbins and several colleagues to help viruses evade the immune system, to name just two), perhaps we should try to come up with the difficult answer.
Dr Y is a certified health physicist, trained in nuclear power plant design and operations, with experience in nuclear power, environmental science, and planning for radiological and nuclear emergencies. He has 30 years of experience in the areas of nuclear and radiation safety.