Chapter Abstract:
A common attitude holds that all technology is good and that any bad effects are simply the result of misuse. A corollary of this view is the belief that the development of new technology is always a net gain, that the world is always better off with access to a new tool. In this article, sociologist Charles Perrow makes a strong argument that both of these claims are false in many cases. Perrow's classic work is his 1984 book Normal Accidents, in which he provides evidence that some sociotechnological systems can never be rendered 100 percent safe. Because of the complex ways in which people, organizations, systems, and artifacts are intertwined, there can never be complete certainty about how a system will behave and why. Perrow asserts that the technologies we choose to develop and use should be treated cautiously. Hubris may convince those who design technological systems that they have eliminated, or at least contained, all risk, but, according to Perrow, the possibility always exists that complexity will prove the builders wrong. Perrow urges society to reconsider the adoption of enormous systems that will result in enormous catastrophes not if, but when, they fail. His claim in Normal Accidents that we must design our technological and social structures with the risk of accidents in mind was heeded by some but not all. Decades later, we see evidence of the truth of his argument playing out in large systems failures. The failure of the Fukushima Daiichi Nuclear Power Station in Japan in 2011, due to a series of events that "were never supposed to happen," prompted Perrow to revisit the warnings he gave in Normal Accidents.
Page(s): 303-310
Copyright Year: 2021
Electronic ISBN: 9780262366274