Abstract
In this paper, I describe the main lines of modern error theory, a systemic theory which regards errors not as the results of someone’s negligence, but as parts of a complex system. Bearing in mind that errors must be considered as such by an observer or attributor, I set out Wittgenstein’s conception of the attributor responsible for discerning whether a strange event constitutes an error or an anomaly. Subsequently, I illustrate this conception of the attributor by describing some traits of the role played by many attributors in business organizations, specifically in the production line of a car manufacturer’s plant, when carrying out the root-cause analysis of problems that arise in the plant. Finally, I reveal a paradox: if the attributor denies the possibility that anomalies may happen without being able to explain their source within the system that he or she takes as a reference, then the very idea of a system no longer makes sense.
Notes
Wittgenstein (PI 23) provided several examples of language-games: “Giving orders, and obeying them”, “Describing the appearance of an object, or giving its measurements”, “Constructing an object from a description (a drawing)”, “Reporting an event”, etc.
Errors can take many different forms. To give only a few examples, they can be due to oversights, omissions, inaccurate data or information, and slips of the tongue. But however strange and surprising errors may seem, our language-games always indicate how they can be discovered.
Wittgenstein (LE) uses the term ‘miracle’ in such a way that it seems to be a synonym for ‘anomaly’. His example of a miracle is that of someone who not only grows a lion’s head, but also starts roaring. A scientific investigation, according to Wittgenstein, would regard this unheard-of event not as a miracle, but as a fact that science has not yet explained.
In order to develop a more fluent discourse, the masculine will henceforth be used to refer to the attributor, on the clear understanding that all such gender references equally apply to both men and women.
It is worth noting that Ohno (1988) does not distinguish between errors and anomalies, but merely refers to problems that must be solved. Even if the staff wanted to learn from errors, priority was given to production: Ohno emphasizes that the main benefit of this procedure was that the lines stopped less and less frequently.
Henceforth, I will use the term ‘system’ to refer to the whole set of procedures followed daily in the enterprise, especially in the production line and the quality department. Most importantly, such procedures already entail which events or possibilities are expected to occur – including errors – and which are not. The events that are not expected to happen constitute the infinite potential anomalies for which there is no room inside the system. As was to be expected, this system is based on our world-picture, which is far broader because it includes aspects as basic as the meanings of our words and the most elementary physical laws, among many others.
In this paper, I have referred to error and anomaly management within the car industry. Nevertheless, there are other fields, such as chemical engineering, in which enigmas or apparent anomalies can be offset by changing the blend of materials and processes. Thus, there is an industry-specific aspect to my argument. I am grateful to an anonymous reviewer for making this interesting comment.
It is desirable that even young children already show a strong resistance to taking events as anomalies: if this were not the case, their certainties – and, by extension, their very world-picture – would be easily and frequently called into question, e.g. when watching cartoons in which unheard-of events happen time and again (Ariso 2015, 2017).
As mentioned above, Wittgenstein (OC 469) noted that something which seems to lack meaning may stop striking us as meaningless once we find an explanation that helps us understand what has happened.
In this vein, Quine (1951) pointed out that our knowledge can be regarded as a field of force made up of statements whose truth values must be adjusted whenever there is a conflict with experience at its periphery. Hence, one of Quine’s main assumptions is that this field must maintain its internal coherence in the face of anomalies that take place beyond its edges, so that each new adjustment necessarily entails excluding some possibilities from the system.
References
Ariso, J.M. 2013. Wittgenstein and the possibility of inexplicably losing certainties. Philosophical Papers 42 (2): 133–150.
Ariso, J.M. 2015. Learning to believe: Challenges in Children’s Acquisition of a World-Picture in Wittgenstein’s On Certainty. Studies in Philosophy and Education 34 (3): 311–325.
Ariso, J.M. 2016. Can certainties be acquired at will? Implications for Children’s assimilation of a world-picture. Journal of Philosophy of Education 50 (4): 573–586.
Ariso, J.M. 2017. Negative certainty. Educational Philosophy and Theory 49 (1): 7–16.
Cannon, M.D., and A.C. Edmondson. 2005. Failing to learn and learning to fail (intelligently): How great organizations put failure to work to improve and innovate. Long Range Planning: International Journal of Strategic Management 38 (3): 299–319.
Chia, R. 2002. Essai: Time, duration and simultaneity: Rethinking process and change in organizational analysis. Organization Studies 23 (6): 863–868.
Clarke, D.L., J. Gouveia, S.R. Thomson, and D.J.J. Muckart. 2008. Applying modern error theory to the problem of missed injuries in trauma. World Journal of Surgery 32: 1176–1182.
Clegg, S., M. Kornberger, and C. Rhodes. 2005. Learning/becoming/organizing. Organization 12 (2): 147–167.
Dekker, S. 2006. The field guide to understanding human error. Hampshire & Burlington: Ashgate.
Descartes, R. 1986. Meditations on first philosophy: With selections from the objections and replies. Cambridge: Cambridge University Press.
Harteis, C., and K. Buschmeyer. 2012. Learning from mistakes – Still a challenge for research and business practice. In Learning from errors at school and work, ed. E. Wuttke and J. Seifried, 31–47. Opladen, Berlin & Farmington Hills: Budrich Verlag.
Harteis, C., J. Bauer, and H. Gruber. 2008. The culture of learning from mistakes: How employees handle mistakes in everyday work. International Journal of Educational Research 47 (4): 223–231.
Heinze, A., S. Ufer, S. Rach, and K. Reiss. 2012. The student perspective on dealing with errors in mathematics class. In Learning from errors at school and work, ed. E. Wuttke and J. Seifried, 65–79. Opladen, Berlin & Farmington Hills: Budrich Verlag.
Kuhn, T.S. 1970. The structure of scientific revolutions. Chicago: The University of Chicago Press.
Ohno, T. 1988. Toyota production system: Beyond large-scale production. Portland: Productivity Press.
Oser, F., and M. Spychiger. 2005. Lernen ist schmerzhaft. Zur Theorie des Negativen Wissens und zur Praxis der Fehlerkultur. Weinheim and Basel: Beltz Verlag.
Perona, Á.J. 2010. Reflections on language games, madness and commensurability. Wittgenstein-Studien 1: 243–260.
Quine, W.V.O. 1951. Two dogmas of empiricism. The Philosophical Review 60: 20–43.
Rasmussen, J. 1987. The definition of human error and a taxonomy for technical system design. In New technology and human error, ed. J. Rasmussen, K. Duncan, and J. Leplat, 23–30. Chichester: Wiley.
Reason, J. 1990. Human error. Cambridge: Cambridge University Press.
Strauch, B. 2004. Investigating human error: Incidents, accidents, and complex systems. Surrey & Burlington: Ashgate.
Taylor, D.H. 1987. The role of human action in man-machine system errors. In New technology and human error, ed. J. Rasmussen, K. Duncan, and J. Leplat, 287–292. Chichester: Wiley.
The New York Times. 2015. $900 Million Penalty for G.M.’s Deadly Defect Leaves Many Cold. http://www.nytimes.com/2015/09/18/business/gm-to-pay-us-900-million-over-ignition-switch-flaw.html?_r=0. Accessed 18 Sept 2015.
van Dyck, C., M. Frese, M. Baer, and S. Sonnentag. 2005. Organizational error management culture and its impact on performance: A two-study replication. Journal of Applied Psychology 90 (6): 1228–1240.
Weimer, H. 1925. Psychologie der Fehler. Leipzig: Klinkhardt.
Weimer, H. 1930. Fehlerbekämpfung. In Handbuch der Pädagogik, Band III: Allgemeine Didaktik und Erziehungslehre, ed. H. Nohl and L. Pallat, 119–128. Langensalza: Beltz.
Wittgenstein, L. 1977. Remarks on Colour. Oxford: Blackwell. (abbreviated as ‘RC’ throughout).
Wittgenstein, L. 1988. Zettel. Oxford: Blackwell. (abbreviated as ‘Z’ throughout).
Wittgenstein, L. 1997. On Certainty. Oxford: Blackwell. (abbreviated as ‘OC’ throughout).
Wittgenstein, L. 2001. Philosophical Investigations. Oxford: Blackwell. (abbreviated as ‘PI’ throughout).
Wittgenstein, L. 2014. Lecture on Ethics. Malden and Oxford: Wiley Blackwell. (abbreviated as ‘LE’ throughout).
Cite this article
Ariso, J.M. Should Business Organizations be Blind to Anomalies? On the Role of the Attributor in the Blurred Confines of Modern Error Theory. Philosophy of Management 17, 219–228 (2018). https://doi.org/10.1007/s40926-017-0075-9