Tag Archives: scope insensitivity

Alastair Norcross

There may be a temptation to regard one life as trivial when compared with seven million. What difference will a choice of life or death for Smith make when compared with the millions who will surely die whatever you choose? Or perhaps we could say that it is not so much that one more life is trivial compared with several million, but rather that morality should not have anything to say about such a difference. Bernard Williams could be taken to be describing such a view when he talks of a moral agent for whom ‘there are certain situations so monstrous that the idea that the processes of moral rationality could yield an answer in them is insane: they are situations which so transcend in enormity the human business of moral deliberation that from a moral point of view it cannot matter any more what happens’. Williams contrasts such a view with consequentialism, which ‘will have something to say even on the difference between massacring seven million, and massacring seven million and one’. One can certainly sympathize with the agent who is so horrified at the scale of a massacre that she finds it difficult to deliberate rationally in the circumstances. This does not, however, support the view that from a moral point of view it cannot matter any more what happens. If there really is no moral difference between massacring seven million and massacring seven million and one, the allied soldier arriving at Auschwitz can have no moral reason for preventing the murder of one last Jew before the Nazi surrender. The Nazi himself can have no moral reason for refraining from one last murder. While Williams’s moral agent is berating the universe for transcending the bounds of rationality, the consequentialist is saving a life. It is not hard to guess which of these agents I would rather have on my side.

Alastair Norcross, ‘Consequentialism and the Future’, Analysis, vol. 50, no. 4 (October, 1990), p. 255

Eliezer Yudkowsky

The Spanish flu of 1918 killed 25–50 million people. World War II killed 60 million people; 10⁷ is the order of the largest catastrophes in humanity’s written history. Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking—enter into a ‘separate magisterium’. People who would never dream of hurting a child hear of an existential risk, and say, ‘Well, maybe the human species doesn’t really deserve to survive.’

Eliezer Yudkowsky, ‘Cognitive Biases Potentially Affecting Judgement of Global Risks’, in Nick Bostrom and Milan M. Ćirković (eds.), Global Catastrophic Risks, Oxford, 2008, p. 114