Normal Accidents of Expertise

Minerva 48 (3):239-258 (2010)

Abstract

Charles Perrow used the term "normal accidents" to characterize a type of catastrophic failure that resulted when complex, tightly coupled production systems encountered a certain kind of anomalous event: one in which system failures interacted with one another in ways that could not be anticipated and could not be easily understood or corrected. Systems for the production of expert knowledge are increasingly becoming tightly coupled. Unlike classical science, which operated with a long time horizon, many current forms of expert knowledge are directed at immediate solutions to complex problems, and are therefore prone to breakdowns of the kind Perrow discussed. The example of the Homestake mine experiment shows that even in modern physics, complex systems can produce knowledge failures that last for decades. The concept of knowledge risk is introduced and used to characterize the risk of failure in such systems of knowledge production.

Similar books and articles

Systems thinking for knowledge. Steven A. Cavaleri - 2005 - World Futures 61 (5):378-396.
Knowledge is normal belief. B. Ball - 2013 - Analysis 73 (1):69-76.
Model for knowledge and legal expert systems. Anja Oskamp - 1992 - Artificial Intelligence and Law 1 (4):245-274.
Knowledge-based systems. Klaus Mainzer - 1990 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 21 (1):47-74.
Mechanisms in Dynamically Complex Systems. Meinard Kuhlmann - 2011 - In Phyllis McKay Illari, Federica Russo & Jon Williamson (eds.), Causality in the Sciences. Oxford University Press.