
Introduction: Deterrence and Disarmament

Published online by Cambridge University Press:  01 January 2020

David Copp*
Affiliation:
University of Illinois at Chicago, Chicago, IL 60680, U.S.A.

Type
Introduction
Copyright
Copyright © The Authors 1986


References

1 These estimates are taken from Freeman, Harold, ‘The Effects of Nuclear War,’ in Sterba, James P., ed., The Ethics of War and Nuclear Deterrence (Belmont, CA: Wadsworth Publishing Company 1985), 68-79, at 71, 79. Also see The Effects of Nuclear War, by the United States Office of Technology Assessment (Washington, D.C.: U.S. Government Printing Office 1979).

2 Bracken, Paul, The Command and Control of Nuclear Forces (New Haven, CT: Yale University Press 1983), 48-73, especially 53-65. The quotations are from pp. 53, 58, 65. It is obvious that estimates of the risk of failure of complex systems are highly fallible. Two examples: first, two months before the April 1986 accident at the Chernobyl nuclear power plant, Vitali Sklyarov, the Ukrainian Minister of Power and Electrification, was quoted as saying that the chance of a meltdown in a Soviet nuclear power reactor is one in 10,000 years. The Chernobyl accident was described as a meltdown in Time, May 12, 1986, 35. Mr Sklyarov was quoted in the February 1986 U.S. edition of Soviet Life, according to Maclean's magazine, May 12, 1986, 26. Second, studies of the chances of a fatal accident to the United States's space shuttle, an accident that would result in loss of the shuttle or the crew, ‘resulted in widely contradictory results, ranging from one engineering estimate of one chance of disaster in every 100 flights to NASA's own estimate of one chance in every 100,000 flights’ (New York Times, June 4, 1986, 11). In light of the 1986 Chernobyl and shuttle disasters, it seems unlikely that the optimistic claims about safety were correct.

3 The basic strategic situation is not changed by introducing ‘flexible response’ or ‘war-fighting’ strategies that are intended to avoid the risk that any conflict with the opposing superpower will immediately escalate to all-out nuclear war. Bracken argues that these strategies are unrealistic because of command and control problems, and because of the serious risk of escalation in times of major crisis. See Command and Control, especially chapter IV. In any event, the threat of escalation would remain the major deterrent to the use of nuclear weapons, and the major incentive to end any war between the two superpowers as soon as possible.

4 Sagan, Carl, ‘Nuclear War and Climate Catastrophe: Some Policy Implications,’ Foreign Affairs 62 (1983), 257-92. Also see ‘Less Drastic Theory Emerges on Freezing After a Nuclear War,’ by James Gleick, New York Times, June 22, 1986, 1.

5 A simple consequentialist perspective would seem to imply that nuclear retaliation cannot be justified. But a threat loses its point if it is not credible, and it is not credible if it does not appear sufficiently likely that it will be prosecuted. Therefore, a consequentialist justification of deterrence would have to imply a justification for a system that makes it credible that there will be retaliation for a nuclear attack, despite the consequentialist objection to retaliation. See below, notes 11 and 12, and the related discussion in the text.

6 See Hobbes, Thomas, Leviathan, ed. Macpherson, C.B. (Harmondsworth, England: Penguin 1968), chapter 13, 188.

7 Hobbes, Leviathan, chapter 13, 183-4 and 186.

8 Rawls, John, A Theory of Justice (Cambridge, MA: Belknap Press of Harvard University Press 1971).

9 Gerald Dworkin imagines various types of deterrence with various intentions. See his ‘Nuclear Intentions,’ Ethics 95 (1985), 445-60.

10 A similar thesis is quite plausible in the theory of punishment.

11 Unlike a purely retaliatory second strike, a counter-force attack might have the advantage, for a nation under attack, of reducing the damage that would be caused by any subsequent attacks on it. A consequentialist would have to weigh the damage that is certain to be caused by a counter-force attack against this possible benefit.

12 Derek Parfit asks, ‘Could it be right to cause oneself to act wrongly?’ See Reasons and Persons (Oxford: Clarendon Press 1984), 37-40.

13 Russell Hardin argues that a nation interested only in defense might continue to develop new systems out of fear that the other side will overtake it technologically and achieve a first strike capability.

14 Related questions arise in the theory of punishment. See Hurka, Thomas, ‘Rights and Capital Punishment,’ Dialogue 21 (1982); and Quinn, Warren, ‘The Right to Threaten and the Right to Punish,’ Philosophy and Public Affairs 14 (1985), 327-73. Quinn argues for the two-part thesis that ‘the right to establish a genuine threat is prior to the right to punish.’ That is, first, ‘the right to set up the threat can be established without first raising the question of the right to punish and, second, … the right to the threat implies the right to punish’ (360). If this were true in general, then if a system of deterrence could be justified, say on grounds of the right of self-defense, the right to nuclear retaliation would follow. Quinn tries to avoid this conclusion (366, n. 46), but his strategy for doing so commits him to giving up the second part of his thesis.

15 I would like to thank Gerald Dworkin, Russell Hardin, and Kai Nielsen for helpful comments and criticisms.