Conditions for Belief and Knowledge

  • James Russell


In the previous chapter we were led to the paradoxical conclusion that the computational theory of mind (CTM) may result in a position that is at once dualistic and behaviourist. Whether it does so or not depends on how dogmatically its proponents interpret the strong AI program (see Chapter 3). If, as Pylyshyn counsels, they abandon their faith in the computational ‘metaphor’ of the mind as a kind of salvation-bringing new paradigm,1 interpret their claims literally (for example, that thought is formal symbol manipulation), and then ensure that programs are constrained by empirical validation of their choice of functional architecture and algorithms (that is, that they stay close to the evidence), such consequences may be avoided. But, as I said, it is not at all clear what form these constraints should take.



Notes and References

  3. For a clear exposition: R. A. Boakes and M. S. Halliday, ‘The Skinnerian analysis of behaviour’, in R. Borger and F. Cioffi (eds), Explanation in the Behavioural Sciences (Cambridge: Cambridge University Press, 1970).
  5. For a discussion see D. C. Dennett, Brainstorms (Brighton: Harvester, 1979) pp. 122–4.
  10. For example: D. Bobrow, ‘Dimensions of representation’, in D. Bobrow and A. Collins (eds), Representation and Understanding (New York: Academic Press, 1975).
  14. H. A. Simon, Models of Thought (New Haven: Yale University Press, 1979) pp. 63–4.
  20. For a thorough exposition of this see F. Jacob, The Logic of Living Systems (London: Allen Lane, 1974).
  25. For a similar argument see R. Harris, ‘Discussion’, in S. C. Brown (ed.), Philosophy of Psychology (London: Macmillan, 1974) pp. 274–6.
  29. Fodor, ‘The mind-body problem’, Scientific American, Jan. 1981, p. 131.
  32. Fodor, ‘Methodological solipsism’, Behavioural and Brain Sciences, 1980, vol. 3.
  45. S. Körner, Kant (Harmondsworth: Penguin, 1955) p. 85.
  49. T. Nagel, ‘What is it like to be a bat?’, Philosophical Review, 1974, 83, pp. 435–51.
  57. See D. W. Hamlyn, ‘Human learning’, in S. C. Brown (ed.), Philosophy of Psychology (London: Macmillan, 1974).
  59. P. F. Strawson, Individuals (London: Oxford University Press, 1959).
  61. P. L. Berger and T. Luckmann, The Social Construction of Reality (Harmondsworth: Penguin, 1971).
  63. D. W. Hamlyn, ‘Person perception and understanding others’, in T. Mischel (ed.), Understanding Other Persons (Oxford: Blackwell, 1974).
  64. D. W. Hamlyn, ‘Cognitive systems, folk psychology, and knowledge’, Cognition, 1981, 10, pp. 115–18, extract from p. 118.
  65. For example, G. A. Miller, E. Galanter and K. H. Pribram, Plans and the Structure of Behaviour (New York: Holt, Rinehart & Winston, 1960).
  66. G. Miller, comments on Pylyshyn’s paper, The Behavioural and Brain Sciences, 1980, 3, p. 146.
  68. G. Miller, ‘Trends and debates in cognitive psychology’, Cognition, 1981, 10, pp. 215–25, extract from p. 222.
  69. H. C. Longuet-Higgins, ‘Artificial intelligence – a new theoretical psychology?’, Cognition, 1981, 10, pp. 197–200.

Copyright information

© James Russell 1984

Authors and Affiliations

  • James Russell
    1. University of Liverpool, England
