
Humanist and Nonhumanist Aspects of Technologies as Problem Solving Physical Instruments

  • Original Paper
  • Published in: Philosophy & Technology

Abstract

A form of metaphysical humanism in the philosophy of technology can be defined as the claim that, besides technologies’ physical aspects, purely human attributes are sufficient to conceptualize technologies. Metaphysical nonhumanism, on the other hand, is the claim that the meanings of the operative words in any acceptable conception of technologies refer to states of affairs or events which are in one way or another shaped by technologies. In this paper, I focus on the conception of technologies as problem-solving physical instruments in order to study the debate between the humanist and nonhumanist ways of understanding technologies. I argue that this conception commits us to a hybrid understanding of technologies, one which is partly humanist and partly nonhumanist.


Notes

  1. See, for example, Humanist Manifestos I, II, and III, published by the American Humanist Association.

  2. This dimension of the debate can also be studied in relation to the interaction between technologies and human intentions, actions, goals, and values, since these can be understood as constituents of human nature. Here, humanists would argue for the existence of purely human intentions, actions, goals, and values, while nonhumanists would argue that, at least in any technologically mediated context, intentions, actions, goals, and values do not exist in ‘nontechnological,’ purely human forms, and that technologies shape these and similar characteristics which are often assumed to be purely human.

  3. Some philosophers prefer to use the term ‘artifact’ instead of ‘technology.’ In most cases, I find the two terms referring to the same sort of objects. Perhaps the only difference between the two is that ‘artifact’ is derived from the Latin words arte and factum, and thus literally means something made with skill, whereas the word ‘technology’ does not imply that the object must be made. Natural, unmodified objects can for this reason qualify as technologies but not as artifacts. Since my current discussion also includes objects like a piece of rock used as a hammer, a piece of rock used as a paperweight, a small leaf used to whistle with, a big leaf used as an umbrella, etc., I prefer the term ‘technology.’ This, however, does not mean that any natural object used in problem-solving activities would be a technology. For example, going to a sunny area to get warm does not turn the sun into a technology. This is because of the importance of the concept of usage in this conception of technology. Unlike the above examples, what the sun does (providing heat) is not the result of human usage. I shall discuss this point later in this piece.

  4. Here, ‘energy loss’ does not mean that energy disappears; it means that not all of the input energy is transferred into the desired output. A machine with zero energy loss is therefore one which is 100% efficient.
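The notion of efficiency implicit in this footnote can be stated as a simple ratio (a standard formulation of mechanical efficiency, not notation used by the author):

```latex
\eta \;=\; \frac{E_{\text{useful output}}}{E_{\text{input}}}, \qquad 0 \le \eta \le 1,
```

where the energy loss is $E_{\text{input}} - E_{\text{useful output}}$; a machine with zero loss has $\eta = 1$, i.e. it is 100% efficient.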

  5. This can raise the question: if the robot cannot be the user of the dustpan and the brush, then who is the user of these technologies? One way to answer this question would be to deny that the dustpan and the brush are technologies, but this answer is counterintuitive. A more plausible answer is that in such a case, the user is the person who owns the robot and gets it to clean their room. It is similar to the case of a car which burns petrol. Does the car literally use the petrol, or is it the driver who uses it? On this account of usage, which assumes a level of mental capacity on the part of the user, it is the driver who is (indirectly) using the petrol.

  6. For the difference between epistemic and metaphysical subjectivity, see Searle (1995).

  7. Later on, I will discuss the possibility of the influence of technologies on our identification of a problem in more detail.

  8. One might claim that these human abilities are algorithmic in nature and proceed in computational steps. One might even compare human upbringing to writing programs for computers. On this view, nothing purely humanist would remain in the conception of technologies as problem-solving physical instruments. I have two points to make in reply. First, people who make these remarks often take progress in the field of neuroscience to back up their claims. All I can say here is that neuroscientific discoveries are still far from showing that the human mind has an algorithmic structure, and it is incumbent upon those who make these claims to provide scientific evidence for them. Secondly, another group of people who make these remarks are those who regard artificial intelligence as comparable to human intelligence. What can be said in reply is that no meaningfully conscious computer has yet been developed. Although I do not think that computers can never have the same form of intelligence as humans, since the current generation of technologies does not have that intelligence, my humanist points about these technologies still hold.

References

  • Clark, A. (1999). An embodied cognitive science? Trends in Cognitive Sciences, 3, 345–351.

  • Clark, A. (2008). Supersizing the mind. Oxford: Oxford University Press.

  • Clark, A., & Chalmers, D. (1998). The extended mind. Analysis, 58, 7–19.

  • Floridi, L., & Sanders, J. W. (2004). On the morality of artificial agents. Minds and Machines, 14(3), 349–379.

  • Franssen, M. (2006). The normativity of artifacts. Studies in History and Philosophy of Science, 37, 42–57.

  • Himma, K. (2009). Artificial agency, consciousness, and the criteria for moral agency: what properties must an artificial agent have to be a moral agent? Ethics and Information Technology, 11, 19–29.

  • Holvast, J. (2009). History of privacy. IFIP Advances in Information and Communication Technology, 298, 13–42.

  • Houkes, W., & Vermaas, P. (2004). Actions versus functions: a plea for an alternative metaphysics of artifacts. The Monist, 87(1).

  • Johnson, D., & Powers, T. (2008). Computers as surrogate agents. In J. van den Hoven & J. Weckert (Eds.), Information technology and moral philosophy (pp. 251–269). Cambridge: Cambridge University Press.

  • Jones, O., & Cloke, P. (2008). Non-human agencies: trees in place and time. In C. Knappett & L. Malafouris (Eds.), Material agency: towards a non-anthropocentric approach (pp. 79–96). Springer.

  • Keulartz, J., et al. (2004). Ethics in technological culture: a programmatic proposal for a pragmatist approach. Science, Technology and Human Values, 29(1), 3–29.

  • Kirsh, D., & Maglio, P. (1994). On distinguishing epistemic from pragmatic action. Cognitive Science, 18, 513–549.

  • Latour, B. (pen name: Jim Johnson) (1988). Mixing humans and nonhumans together: the sociology of a door-closer. Social Problems, 35(3), 298–310.

  • Latour, B. (2002). Morality and technology: the end of the means (C. Venn, Trans.). Theory, Culture & Society, 19(5–6), 247–260.

  • Law, J., & Mol, A. (2008). The actor-enacted: Cumbrian sheep in 2001. In C. Knappett & L. Malafouris (Eds.), Material agency: towards a non-anthropocentric approach (pp. 57–77). Springer.

  • Malafouris, L. (2008). At the Potter’s wheel: an argument for material agency. In C. Knappett & L. Malafouris (Eds.), Material agency: towards a non-anthropocentric approach (pp. 19–36). Springer.

  • Norman, D. A. (1991). Cognitive artifacts. In J. M. Carroll (Ed.), Designing interaction: psychology at the human-computer interface (pp. 17–38). Cambridge, UK: Cambridge University Press.

  • Searle, J. (1995). The construction of social reality. The Free Press.

  • Searle, J. (2007). Social ontology and the philosophy of society. In E. Margolis & S. Laurence (Eds.), Creations of the mind (pp. 3–17). Oxford: Oxford University Press.

  • Sutton, J. (2002). Porous memory and the cognitive life of things. In: D. Tofts, A. Jonson, A. Cavallero (Eds.), Prefiguring cyberculture: an intellectual history (pp. 130–141). Power Publications and MIT Press.

  • Sutton, J. (2008). Material agency, skills and history: distributed cognition and the archeology of memory. In: C. Knappett, L. Malafouris (Eds.), Material agency: towards a non-anthropocentric approach (pp. 37–55). Springer.

  • Thomasson, A. (2007). Artifacts and human concepts. In E. Margolis & S. Laurence (Eds.), Creations of the mind (pp. 52–73). Oxford: Oxford University Press.

  • Verbeek, P. P. (2011). Moralizing technology. Chicago: The University of Chicago Press.

Corresponding author

Correspondence to Sadjad Soltanzadeh.

Cite this article

Soltanzadeh, S. Humanist and Nonhumanist Aspects of Technologies as Problem Solving Physical Instruments. Philos. Technol. 28, 139–156 (2015). https://doi.org/10.1007/s13347-013-0145-4
