This is a draft that was written a long time ago, in 1990-91 when I was Mellon Postdoctoral Instructor in Philosophy at The California Institute of Technology, and at a time when very few people were seriously discussing consciousness. (An even earlier version was presented at the Fourth Annual Conference of the History and Philosophy Section of the British Psychological Society - Lincoln, U.K., April 1990.) Both the abstract and the paper itself are essentially as I left them in 1991. I am no longer sure that I would want to defend everything I say in them, and I would certainly express and organise things somewhat differently today. However, I think that the basic idea is still viable, and I hope eventually to return to it and rework it.
The principal temptation toward substance dualisms, or toward otherwise incorporating a question begging homunculus into our psychologies, arises not from the problem of consciousness in general, nor from the problem of intentionality, but from the question of our awareness and understanding of our own mental contents, and of the control of the deliberate, conscious thinking in which we employ them. Dennett has called this "Hume's problem". Cognitivist philosophers have generally either denied the experiential reality of thought, as did the Behaviorists, or have taken an implicitly epiphenomenalist stance, a form of dualism. Some sort of mental duality may indeed be required to meet this problem, but not one that is metaphysical or question begging. I argue that it can be solved in the light of Paivio's "Dual Coding" theory of mental representation. This theory, which is strikingly simple and intuitive (perhaps too much so to have caught the imagination of philosophers), has demonstrated impressive empirical power and scope. It posits two distinct systems of potentially conscious representations in the human mind: mental imagery and verbal representation (which is not to be confused with 'propositional' or "mentalese" representation). I defend, on conceptual grounds, Paivio's assertion of precisely two codes against interpretations which would either multiply image codes to match sense modes, or collapse the two, admittedly interacting, systems into one. On this basis I argue that the inference that a conscious agent would be needed to read such mental representations and to manipulate them in the light of their contents can be pre-empted by an account of how the two systems interact, each registering, affecting and being affected by developing associative processes within the other.
A recurrent problem with trying to make scientific, materialistic accounts of the mind plausible is that they so often seem to fail to address one of the most salient features of mentality, our actual conscious experience, or awareness, of thinking. It is surely at least in part because of this failure that substance dualisms - which essentially remove the mental from the realm of natural science - retain a perennial appeal. In this paper I want to try to tackle this problem, but I should make it clear that I am not attempting to suggest a solution to 'the problem of consciousness' as a whole. In fact I am sympathetic to those recent authors1 who have suggested that "consciousness" does not name one, but several, perhaps only loosely related, problems.
One of these is simply how experience is possible at all. In Nagel's2 celebrated phrase, it is very puzzling how it can be "like something to be" oneself. I have no solution to offer to this very deep problem. What I will observe, however, is that substance dualisms do not seem to be any better placed to deal with it than does materialism. It is just as hard to understand why it should be "like something to be" an immaterial soul, as it is for an entirely material organism. The mysterious nature of soul itself may foster the illusion that dualism helps with this problem, but all it really provides us with is a compounding of mysteries.
Awareness or consciousness does seem to imply the necessity of a subject, of someone or something to be aware or conscious. But in the case of simple perceptual consciousness of our surroundings dualism is still not called for. The subject can plausibly be taken to be the whole person or organism, perhaps as registering the things in its environment in internal representations. That, of course, begs the question of how these representations get to be representations, how they acquire their intentionality3, but in what follows I am simply going to assume that this problem is somehow soluble; after all, lots of very clever people are busy studying it. All I will add is that, once more, dualism does not really help with the matter. To say that symbols in the brain represent because the soul takes them as representing (as symbols in the external world represent because people take them so to do) simply transforms the problem into that of how the soul acquires its intentionality - and we understand souls even less than brains.
The legitimate temptation to dualism arises over our awareness of the internal representations themselves4, which becomes an issue when we want to understand thinking rather than mere perception. Descartes, after all, arrived at his dualism by affirming his awareness of his thoughts whilst holding his awareness of his surroundings in question5. I (a whole person) am conscious of the things around me, but who is there to be aware of the things in me, to register my thoughts, and who manipulates my mental representations when I am actually thinking with them? Dennett6 has called this "Hume's problem". How can some inner I, some homunculus, be avoided? It will not do to think of the same whole person looking inward at representations just as he may look outward at objects, because the representations in question, and their vicissitudes, partly constitute this whole person. A person is not just the container for mental representations. Some sort of duality may indeed be demanded here - a representation can scarcely be aware of itself7 - and, perhaps because of this, most materialistically minded philosophers and psychologists tend to avoid the issue of conscious thought these days8. I want to sketch a sort of 'dualism' which can meet the problem without landing us with question begging homunculi9 or metaphysically otiose 'immaterial substances'10.
The deliberate avoidance of conscious thought was, of course, quite open during the reign of Behaviorism, which ruled the inner life out of science altogether. Surely an important part of the original appeal of cognitivism was that it seemed to take our experience of mental life much more seriously, and even promised to furnish a scientific account of it11. As against the Behaviorists, the Cognitivists seemed to be saying that people really do have conscious experiences, thoughts, ideas, wishes and the like, and that these really are causally efficacious in producing our behavior - just as ordinary people had thought all along. However, as cognitive science has developed, especially as it has generally come to be characterized by philosophers, it seems to have begun to lose touch with the roots of its appeal in common mental experience.
This may seem particularly clear for eliminative materialist characterizations, which project the future of the science of the mind as progressively moving away from the explanatory categories of "folk psychology"12. However, even philosophers, such as Fodor, who envisage "folk" categories as continuing to play a large role in Cognitive Science do not give much of a role to the mental representations that we actually experience. It is "folk" explanations, not the experiences of folk, which Fodor thinks are valuable. After all, he likes to think of the beliefs, desires, etc. which are supposed to determine our behavior as being represented in "mentalese", the "language of thought", a sort of machine code of the neural computer13, but in no way would he suggest that we experience our thoughts as being in mentalese. Presumably the internal representations we do experience, mental images, for example, and beliefs and desires 'silently spoken' in English, or whatever is our native natural language, must be regarded as non-functional, and thus scientifically rather uninteresting, epiphenomena of underlying computations in mentalese14. These remarks about Fodor would seem to apply equally to all those who regard mental representation as essentially computational in nature, whether or not they have such a strongly linguistic conception of it, and whatever their regard for "folk psychology". For example, the "s-representations" which Cummins15 thinks are required by a computational theory of the mind are, if anything, even less like ordinary conscious thoughts than are mentalese expressions - they do not even carry intentionality. The distinction between computational cognitive representation (which surely ought not to be called mental representation) and conscious representation is, I think, well taken; after all, conscious thinking (unless, perhaps, when we are doing mental arithmetic) certainly does not seem like computing. 
Computational processes are automatic and operate entirely with the syntactic properties of the symbols involved16. Semantic properties are, at best, carried along for the ride17. Deliberate conscious thought, on the other hand, seems to depend on awareness of the semantic properties of the representations involved, and deliberate manipulation of them in the light of their actual content. It is quite a standard position in cognitive psychology to associate consciousness both with the "short term memory store" in which currently active representations are held, and with the "central executive" function which manipulates them whilst they are there18. This picture is routinely combined with the computational metaphor, but it is never made clear why the "central executive", the 'main loop' of the cognitive program, as it were, should be conscious of what it is doing whilst other program modules are not. My point is not to deny that, at some level, there might be a useful story to be told about cognitive processes in computational terms, either serial or connectionist. What I do want to deny is that this will suffice for an account of the deliberate, conscious thinking we experience. The temptation into which cognitive science has tended to fall has been to implicitly dismiss conscious thought as a delusive play of epiphenomena. But this still leaves the question: who is being deluded? And who today sees epiphenomenalism as a serious improvement over Cartesian dualism?19
To express the problem as we have been, as a matter of who is conscious of representations, who manipulates them, is, of course, to beg the question in favor of homuncularism. Unfortunately it is very awkward to express the matter otherwise; a dualistic 'folk' theory of thought, in which a conscious inner agent understands and manipulates inner representations just as a whole person may understand and manipulate outer ones, seems to be virtually built into our language. Philosophers, of course, are well versed in the question begging nature of conscious homunculi in theories of mind, and I suspect that the failure of cognitivist philosophers to confront the experience of deliberate conscious thought, one with which they ought to be familiar, has been conditioned by a well motivated fear of the little people. However, truth to experience, although it does seem to call for conscious representations, does not call directly for the inner agent. As Hume20 pointed out, there is never any experience of the inner agent (there could not be; the 'mirror of nature' cannot reflect itself). Its existence is inferred analogically, and the inference can be resisted. If it has been difficult to resist, once the premises have been admitted, then that is because no clear alternative has hitherto been available. Developments in psychological theory, originally proposed to meet purely empirical psychological problems, may now have presented us with the basis for such an alternative.
Psychologists have somewhat different concerns from philosophers. If a theory is intuitively appealing and empirically powerful they may well not be in such a hurry to reject it just because it looks as if it may turn out to be metaphysically problematic. Consequently, not all cognitive psychologists have been as wary as cognitivist philosophers of allowing conscious mental representations into their theories. One such psychologist is the Finnish Canadian, Allan Paivio, who has put forward, and, over many years defended, what he calls the Dual Coding theory of mental representation21.
Dual coding theory is in essence very simple and intuitive - perhaps too much so to have caught the imagination of the philosophical community22. Nevertheless, it is taken very seriously by psychologists, and its obviousness should not be allowed to blind us to its merits. Like many of the best ideas it is what we all already sort of knew (how could an account of conscious experience be otherwise?) made, at last, concrete and explicit enough for us to be able to get sufficient purchase on it for experimental testing, criticism, and the development of its implications: "What oft was thought but ne'er so well express'd"23. Essentially what it asserts is that there are two distinct and quasi-independent formats for mental representation, two 'codes' or representational systems. These are verbal representation, and representation in mental imagery. The theory seems to have been developed initially to explain the powerful effects of imagery in verbal learning experiments, and it was probably Paivio's work in this area which did more than anything else to re-establish the legitimacy of the mental image as a psychological concept24 after its eclipse by Behaviorism25. The two basic effects are that the deliberate formation of mental images relevant to verbal material greatly enhances memory for the words26, and that, even without deliberate imaginative effort, words which easily arouse imagery are better remembered than ones which do not. These are two of the most powerful and best attested effects in verbal memory research, and are readily accounted for in Dual Coding terms by the hypothesis that, when imagery is aroused, there are relevant memory traces, associatively linked together, laid down in each of the representational systems, but where imagery is not aroused only one, verbal, trace is established. 
This very simple27 theory has stood up very well to a great deal of empirical work over the past twenty years or so28, and has been successfully extended to other areas such as picture memory29 and chronometric studies of mental comparisons of sizes, distances and other dimensions of variation30. It would also seem to be strongly supported by experiments which seem to show that visuo-spatial tasks interfere far more with visuo-spatial image representations than do verbal tasks, and vice-versa31. Also relevant is Kosslyn's work showing that, when imagery is involved, relative size is a dominant factor influencing speed of reporting of features of objects from memory, whereas, in the absence of imagery a different factor, verbal association strength, supplants it32. The many lines of evidence which have been assembled in support of mental imagery as a distinct cognitive function must also count in its favor33.
Of course, the theory is by no means universally accepted, but so far as I can tell there is no strong empirical motive for rejecting it. The various attempts over the years to provide some alternative explanation for the empirical results seem, rather, to have been motivated by just the sort of metatheoretical doubts about the scientific status of conscious mental representations which we have already discussed. This is quite explicit in Richardson's book34. His own attempt to substitute "concreteness" for Paivio's parameter of "imagery value" as a factor in the memorability of words proved, as he acknowledges, an empirical failure, but he continues to hope that some other alternative parameter will be found. Paivio notes that over twenty such putative alternatives had already been eliminated before 197135.
Dual Coding theory is certainly not just intended as a hypothesis accounting for some rather recondite (if unusually robust) results from the psychological laboratory. Paivio clearly thinks of it as the basis for a general account of cognitive processes, and has tried to illustrate its fruitfulness in discussions of its relevance to such matters as creativity, the different styles of thinking involved in various tasks36, and various issues in the psychology of language37. The present essay may be considered an additional attempt along these lines, but before we can apply the theory to the problem of conscious thought some potential misunderstandings and ambiguities must be cleared up. In the first place it may be thought that Paivio is actually proposing too small a number of 'codes'. I know of at least one psychologist, Walter Kintsch38, and one philosopher, Owen Flanagan39, who have said so. They take it that there ought to be a distinct imagery code associated with each sense modality, so, taking the traditional count of five senses, Flanagan argues that Paivio ought to be committed to a "six-code" theory. Whilst certainly not embracing it, Paivio does not reject such a notion quite as firmly as I think he could and should40. The view would seem to derive from an understanding of imagery which both he and I would wish to reject. This is the Empiricist notion that mental images (or "ideas") are simply remnants, echoes or reproductions of former sensations, and that representational systems are to be individuated on the basis of their sensory, experiential quality rather than on theoretical or operational grounds, as Paivio has always insisted41. Perhaps Paivio is vulnerable to this sort of misunderstanding because he has never put forward a detailed theory of the underlying nature of imagery.
However, he has explicitly rejected what he calls the "wax tablet" metaphor, from which, of course, Hume's talk of "impressions" and of "ideas", "the faint images of these in thinking and reasoning"42, derives. Paivio in fact regards imagery as "a dynamic process more like active perception than a passive recorder of experience" and as "a dynamic symbolic system capable of organizing and transforming the information we receive", and he warns us against taking the metaphor of "mental pictures" too seriously43, a point with which I heartily concur44. Of course, the term "image" is itself a metaphor, and a visual one at that, but this may be no more than a reflection of the fact that, for human beings, sight is normally the most salient aspect of perceptual experience. I can see no reason, apart from Empiricist prejudice, why our mental images should not be regarded as representing total, multimodal perceptual experiences45. This, after all, is what our actual perceptual experience is like. Furthermore it is also a quite traditional understanding of imagery. In Aristotelian46, medieval and post-medieval47 psychologies the imagination was either identified with, or located downstream from, the faculty of the "common sense" where the deliverances of the various external senses are brought together and integrated into a meaningful whole. There are no good grounds for multiplying the number of imagery codes.
George Baylor48 and Stephen Kosslyn49, speak favorably of Dual Coding theory, but they regard the verbal code as being what is often called a "propositional" representation, that is a representation in something like the innate, unconscious "language of thought" which Fodor proposes. This is not Paivio's view50. (A related view, proposed by Marschark and others51, is that both the imagery and the verbal representations which we consciously experience are produced from long term memory representations in a more basic, underlying 'propositional' code. Although neither I nor Paivio actually believe this52, from the point of view of our current argument it is quite acceptable in that it places both the image and the verbal representations on the same level.) I, also, want to reject any such interpretation of dual coding, in which the duality is understood as being between images, and quasi-linguistic representations in the format of a hypothesized underlying 'propositional' system. We are, after all, concerned here with mental representations of which we are conscious, and 'propositional' or 'mentalese' representations are, as already noted, not something of which we are ever directly aware. Paivio is always quite clear that what he is talking about are verbal representations, representations in English or whatever language one happens to speak.
However, neither Paivio53 nor I54 hold that we are necessarily, ipso facto, conscious (at least in the sense under consideration) of every imaginal or verbal representation which arises during our cognitive functioning. (If we were, much of the central problem of this paper - the aspect of consciousness, if not of deliberate thought - would be solved by fiat.) Mental representations are to be picked out as such, in the first place, as explanatory scientific constructs, not as givens of experience55. (That Richardson explicitly takes the opposite view56 seems to largely determine his rejection of Dual Coding theory, somewhat in the teeth of the empirical evidence he reviews.) The point is that we are sometimes conscious of our mental images and mental words (just when will emerge later) and that all of them are the sort of representation of which it is possible to become conscious57.
I doubt that anyone would seriously deny that we do experience covert verbal representation - 'inner speech'. Even J.B. Watson failed to disbelieve in it58. What might seem questionable is its status as functionally significant, and still more as an independent representational system. For, in fact, Paivio regards all mental representations as being sensorially derived, with verbal mental representations themselves consisting of images of public words, probably auditory images of spoken words59. We might now seem to have arrived at the opposite extreme from Flanagan: a single-code rather than a six-code theory. I want to insist, however, that Paivio has been quite right to fix on the number two, even if he has not always managed to articulate the most satisfactory reasons for doing so. There is a conceptual distinction to be drawn between the two representational systems, not in terms of sensory mode or any other experiential quality, nor even (as Paivio attempts) on the somewhat obscure basis of the systems' functional integration60, but rather in terms of the way they do their actual representing, of how they refer to their objects. Quite simply, a mental image represents what it is an image of, a mental word represents whatever it is a word for. The mental image of a weasel, be it visual, tactile, auditory, olfactory, or all four, represents a weasel; the mentally represented word "weasel" is the image (auditory and/or kinaesthetic) of the spoken word "weasel", but it does not normally represent the spoken word "weasel", it represents a weasel61. We thus have a logical distinction between two types of representational system, which provides a far firmer basis for the duality of coding than any purely contingent, empirical distinctions62 could ever give us.
Incidentally, this distinction between codes on the grounds of how they refer is quite independent of the question of where and how the intentionality or meaningfulness gets into our representational systems. There seem to be several possibilities: (i) images and words hold their meanings independently63; (ii) both images and words derive their intentionality from that of an underlying, non-conscious, perhaps 'propositional', system64; (iii) words ultimately derive their intentionality from associated images65; (iv) imagery derives its meaning from that of language66. On any of these views, the difference in the referential relationships between images and words and their objects remains, and our main argument is unaffected. As I have said, I do not propose to offer any solution to the problem of intentionality in this paper.
What I have promised to give you is a suggestion as to how we can be aware of our mental representations, and how they can be deliberately used, not as meaningless counters but as representations, in our thinking. I have promised to do this without the aid of an inner agent, a ghost in the machine, to be aware of them and of their meanings, and to manipulate them accordingly. To put it very bluntly, my proposal is that instead of there being an inner conscious agent to register and manipulate our mental representations, the contents and events within each of our representational systems are registered and partially controlled by the other system. Consciousness is not the overseer of the mind, but emerges from the spontaneous cooperation of two independent intentional systems of mental representation.
Of course, two homunculi are worse than one, but no homunculi are called for here. As Dennett67 notes, Hume's solution to "Hume's problem", the problem of semantically driven thought68, lay in the mechanism of the association of ideas, the ideas entraining one another. Hume got rid of the homunculus alright; the reason why his theory is, in Dennett's words, "a notorious non-solution" to the problem is not that it is question begging, or even that it is false. Surely association does take place; we are often aware of it. The trouble is that it is seriously insufficient to explain the facts it confronts. On the one hand it cannot account for consciousness because, without a homunculus, there is nothing outside the 'imagination' in which the Humean ideas float to register their presence there, and, on the other, while thought may involve trains of association, that is certainly not the whole story. Not all thinking is daydreaming; sometimes our thought is considerably more structured and directed. But as two legs would seem to be the minimum requirement for walking, two associative systems working together may be able to achieve a more than quantitative advance over the potentialities of one69. The mistake of Empiricism (well, one of them, anyway) was to treat essentially all thought as imagistic, with language being treated, for the most part, as just an input/output mechanism. The twentieth century's pervasive 'lingualism'70 seems to me to be a falling into the converse error.
Paivio has always insisted that his two "codes" are "richly interconnected". This, indeed, is a further reason why I think he needs something like my conceptual distinction between the codes if he is to avoid them collapsing into one. These connections may themselves be merely associative, but the point is that the image of a weasel can arouse the word "weasel", and the internal representation of the word "weasel" can likewise arouse the image of a weasel. At a rather higher level we can describe our mental images to ourselves just as we can describe a real scene out loud, and likewise we may form a mental image on the basis of a verbal description.
Associative processes within a single representational system probably go on even more smoothly than those between systems. Undoubtedly, in Humean fashion, we can have associative trains purely of mental imagery. However, I would suggest that if this were all that we had going on in our minds we would not have any awareness of it, at least in the sense with which we have been concerned. One image follows another and the previous ones are lost. The succeeding image does not register or in any way represent the one before; at best it is caused by it. However, if an independent representational system is registering what is going on, if a verbal description of the images, or even of the succession of images, is being associatively produced, then we have an awareness of our thoughts, or even of our train of thought. Likewise, a train of verbal thoughts can be registered, integrated and fixed for us in an appropriate image71.
Furthermore, on the basis of this sort of awareness, dual coding can account for how we are able to think in a deliberate, directed way, how we are able to keep our thoughts on the subject and 'manipulate' our representations. Simple associative trains of imagery are liable to drift off in any direction, but if, for instance, an image representing something that particularly concerns us becomes fixed in our minds, the verbal representations which are produced are likely to remain more or less relevant to that image, until, perhaps, one of them provides a solution to the problem which the image embodies. In a similar way, the imagery system may fall under the control of the verbal. Relevant images may accrete around some particular verbal formula that is held in mind, that we are 'rehearsing', as psychologists say (there is reason to think that images may be rehearsed, and thus temporarily 'fixed', retained in short term memory, just as verbal material can72). Perhaps in truly successful cases of directed thinking, control passes back and forth between the two systems, producing a developing but coherent line of thought. An analogy for this might be a mountain climber hauling himself hand over hand up a rock face, hanging on alternately with either hand as the other gropes about for a new place to hold on to. Two hands are needed for this. Or perhaps we should think of the climber working his way up a vertical crack, a chimney, bracing himself first against one side and then the other. But it is important not to identify the climber with the thinker; rather his path represents the developing chain of thought. The mind is the mountain itself.
An interesting consequence of our Dual Coding account of thinking would be that non-language-using animals, although they may, as Aristotle says, have imaginations73, would be incapable of directed thought. The intellectual difference between them and human beings would be qualitative rather than quantitative. This is not, perhaps, a surprising result, but it would not seem to fall so naturally out of computational (including connectionist) accounts of cognition. We might find it a more surprising consequence that such animals, in an important sense (though not every sense), would not be conscious. Descartes, of course, would not have been surprised. He was very sensible of the connection between language and consciousness74, he just got the dependency reversed.
But returning to humans, perhaps the very fact of a more than associative coherence within the sequence of representations in one system would itself be available for registration within the other, and this could be a powerful contributory factor towards the homunculus error. The signs of something being in control of events within each system could well come to be represented in the other, yet, as neither system is a homunculus, capable of self-awareness, neither can be aware of itself registering and controlling the other. The signs of 'someone', some controller, being there are unmistakable, but it is never seen. It is an easy but faulty inference that there is a single, unobserved and unobservable agent responsible for the ordering of mental representations in general.
If, as is usual, we regard our mental representations as all belonging together in one undifferentiated system then it is hard to explain the coherence of (some of) our thought processes, and hard to explain even our awareness of them except in the question begging terms of an agent who is able to know them and to use them to construct arguments and conclusions. Rejecting that, the fashion recently has been to explain thinking as the work of a computer. Computers contain no homunculi, but neither do they work in images or English, nor do we have any understanding of how they might be conscious of anything, in any sense. Even if we should someday manage to comprehensively account for the relationship between human inputs and outputs in computational terms, even if we can map this account onto neural processes, we will still have failed to give any account of human mental life75. I am not saying that brains could never be mimicked by computers, nor that 'artificial intelligences', artificial minds, are necessarily an impossibility76. What I am saying is that the program and architecture of such a computer would not amount to a psychology77. Rather it would provide a substrate or medium for a psychology, as brains do for human psychology. Understanding thought and our experience of it requires an explanation at the higher, psychological, level, and what I have tried to do in this paper is to delineate the problem and to sketch how a genuinely psychological theory might have the conceptual resources to tackle the job.
1. E.g. T. Natsoulas: (1978), "Consciousness." American Psychologist, 33, 906-914; (1983), "Addendum to 'Consciousness'." American Psychologist, 38, 121-122; (1986-87), "The Six Basic Concepts of Consciousness and William James's Stream of Thought." Imagination, Cognition and Personality, 6, 289-319. N. Nelkin: (1987), "What is it Like to be a Person?" Mind and Language, 2, 220-241; (1989) "Unconscious Sensations." Philosophical Psychology, 2, 129-141.
2. T. Nagel, (1974), "What is it like to be a bat?", Philosophical Review, 83, 435-450.
3. Some (e.g. R. Cummins, (1989), Meaning and Mental Representation. MIT Press: Cambridge, MA.) may think these are separate problems; I follow the herd in thinking they are the same.
4. I think this is more or less equivalent to what Natsoulas, op. cit., calls consciousness4 (see further, T. Natsoulas, (1983), "A Selective Review of Conceptions of Consciousness with Special Reference to Behavioristic Contributions." Cognition and Brain Theory, 6, 417-447), and what Nelkin, op. cit., calls C2.
5. F.E. Sutcliffe (trans. & ed.), (1968), Descartes: Discourse on Method and the Meditations. Penguin: Harmondsworth.
6. D.C. Dennett, (1978), Brainstorms. Harvester: Hassocks, pp.101, 122.
7. Dennett's remarks (1978, op. cit. pp.102, 124) about "self-understanding" data structures notwithstanding.
8. D.C. Dennett (1989, The Intentional Stance. MIT Press: Cambridge, MA, p.x) has noted this remarkable state of affairs in contemporary philosophy of mind. He can claim to be an exception (see, e.g., Dennett: 1969, Content and Consciousness. Routledge & Kegan Paul: London; 1978, op. cit., part 3; 1981, "Wondering Where the Yellow Went." The Monist, 64, 102-108; 1982a, "How to Study Human Consciousness Empirically: or, Nothing Comes to Mind." Synthese, 53, 159-180; 1982b, "Why We Think What We Do About Why We Think What We Do." Cognition, 12, 219-227), and he promises us more on the topic, but I am not sure that Dennett's problem of consciousness is precisely the one we are going to be concerned with here.
9. It has been pointed out (by F. Attneave, (1961), "In Defence of Homunculi." in W.A. Rosenblith (ed.), Sensory Communication. MIT Press & Wiley: New York; and Dennett, (1978), op. cit.) that homunculi in psychological explanation need not always be question begging, but the use of conscious homunculi to explain consciousness would certainly seem to be so.
10. The advantage of the metaphysical move is that it camouflages the begging of the question, and shuts off the threatening regress with a mystery.
11. See, e.g., M.A. Boden, (1977), Artificial Intelligence and Natural Man. Harvester: Hassocks, pp. 393ff.
12. Paul M. Churchland (Scientific Realism and the Plasticity of Mind. Cambridge University Press: Cambridge, 1979) seems to envisage that psychological experience itself will be transformed as psychological science develops and replaces "folk" notions. In that case utopian psychology would give an account of mental experience, just not of our (present day) mental experience. But even in this scenario the problem of who is having the mental experiences would remain. I want to try and sketch a solution to it drawing on the conceptual resources of current psychology, rather than waiting on the millennium. If substance dualism is going to be necessary anyway then the whole Churchland picture of the "co-evolution" (P.S. Churchland, 1986, Neurophilosophy. MIT Press: Cambridge, MA.) of psychology and neuroscience away from "folk" conceptions becomes much less compelling - brains just won't be that central.
13. J.A. Fodor, (1975), The Language of Thought. Crowell: New York.
14. Fodor, op. cit. chap.4, does seem to allow that mental images might sometimes do some real cognitive work, but it turns out that their capacity to function as representations at all is going to depend on some mentalese description in the light of which they are interpreted.
15. Cummins, (1989), op. cit.
16. As theorists otherwise as far apart as Searle, Fodor, Stich and Dennett seem to agree: J.R. Searle, (1980), "Minds, Brains and Programs." The Behavioral and Brain Sciences, 3, 417-424; J.R. Searle, (1990), "Is the Brain a Digital Computer?" Proceedings and Addresses of the American Philosophical Association, 64(3), 21-37; J.A. Fodor, (1981), Representations. MIT Press: Cambridge, MA; S.P. Stich, (1983), From Folk Psychology to Cognitive Science: the case against belief. MIT Press: Cambridge, MA; Dennett, (1989), op. cit.
17. As on the upper deck of a London bus, perhaps - cf. Cummins, op. cit.
18. See T.H. Carr, (1979), "Consciousness in Models of Human Information Processing: Primary Memory, Executive Control and Input Regulation." in G. Underwood & R. Stevens (eds.), Aspects of Consciousness, Vol.1: Psychological Issues. Academic Press: London. A version of this approach particularly relevant to the current paper is to be found in P.E. Morris & P.J. Hampson, (1983), Imagery and Consciousness. Academic Press: London. I recommend their model, gratis, to the attention of radical analysts of the ideological status of science, who should find it of great value - see, particularly p.58.
19. The answer to this latter question is Keith Campbell: (1970), Body and Mind. Macmillan: London.
20. L.A. Selby-Bigge & P.H. Nidditch (eds.), (1978), David Hume: A Treatise of Human Nature. Oxford University Press: Oxford. pp. 232f. (Original, 1739).
21. A. Paivio: (1969), "Mental Imagery in Associative Learning and Memory." Psychological Review, 76, 241-263; (1971), Imagery and Verbal Processes. Holt, Rinehart & Winston: New York; (1986), Mental Representations: a dual coding approach. Oxford University Press: New York. A. Paivio & I. Begg, (1981), Psychology of Language. Prentice-Hall: Englewood Cliffs, NJ.
22. Philosophers may also have been put off by the unfashionably, and, it must be said, naively, empiricist nature of Paivio's views about (amongst other things) the sources of his representations' meaningfulness (i.e. intentionality). However, this matters much more to them than it does to him, and his views on it can be detached from the rest of the theory without significant loss, and without affecting the present argument.
23. Alexander Pope, (1711), "An Essay on Criticism." ln. 298. See also lines 424-9.
24. See N.J.T. Thomas, (1987), Psychological Theories of Perception, Imagination and Mental Representation, and Twentieth Century Philosophies of Science. Unpublished Doctoral Thesis, University of Leeds: Leeds U.K. (ASLIB Index to Theses, 37-iii, No. 37-4561), sec. I.C.1.
25. See N.J.T. Thomas, (1989), "Experience and theory as determinants of attitudes toward mental representation: The case of Knight Dunlap and the vanishing images of J.B. Watson." American Journal of Psychology, 102, 395-412.
26. This, of course, was known from antiquity - F.A. Yates, (1966), The Art of Memory. Routledge & Kegan Paul: London.
27. Needless to say, Paivio, op. cit., has elaborated the theory in considerably more detail. However, the elaborations are irrelevant to my present purposes, and I would not, in any case, be anxious to defend all of them.
28. See A. Paivio, op. cit., and (1983), "The Empirical Case for Dual Coding." in J.C. Yuille (ed.) Imagery Memory and Cognition: essays in honor of Allan Paivio. Erlbaum: Hillsdale, NJ., and other articles in the same volume. J.T.E. Richardson (1980, Mental Imagery and Human Memory. Macmillan: London) gives an extensive review which comes down, on balance and somewhat unpersuasively (see R.A. Finke's (1981) review: "Imagery Mnemonics - Spatial and Structural Aspects." Contemporary Psychology 26, 610-611), against Dual Coding theory, but see below.
29. Richardson, (1980), op. cit., chapter 5, reviews this evidence and concludes in favor of Dual Coding Theory here!
30. E.g.: R.S. Moyer, (1973), "Comparing Objects in Memory: evidence suggesting an internal psychophysics." Perception and Psychophysics, 13, 228-246; A. Paivio, (1975), "Perceptual Comparisons Through the Mind's Eye." Memory and Cognition, 3, 635-647; S.M. Kosslyn, G.L. Murphy, M.E. Bemesderfer & K.J. Feinstein, (1977), "Category and Continuum in Mental Comparisons." Journal of Experimental Psychology: General, 106, 341-375; A. Paivio, (1978a), "Comparisons of Mental Clocks." Journal of Experimental Psychology: Human Perception and Performance, 4, 61-71; A. Paivio, (1978b), "Mental Comparisons Involving Abstract Attributes." Memory and Cognition, 6, 199-208.
31. E.g.: L.R. Brooks, (1967), "The Suppression of Visualization by Reading." Quarterly Journal of Experimental Psychology, 19, 287-299; L.R. Brooks, (1968), "Spatial and Verbal Components of the Act of Recall." Canadian Journal of Psychology, 22, 349-368; G. Atwood, (1971), "An Experimental Study of Visual Imagination and Imagery." Cognitive Psychology, 2, 290-299; A.D. Baddeley, S. Grant, E. Wright & N. Thompson, (1975), "Imagery and Visual Working Memory." in P.M.A. Rabbitt & S. Dornic (eds.), Attention and Performance: Vol. 5. Academic Press: London; W.H. Janssen, (1976a), On the Nature of Mental Imagery. Institute for Perception TNO: Soesterberg, Netherlands; W.H. Janssen, (1976b), "Selective Interference in Paired Associate and Free Recall Learning: Messing up the Image." Acta Psychologica, 40, 35-48; A.D. Baddeley & K. Lieberman, (1980), "Spatial Working Memory." in R.S. Nickerson (ed.), Attention and Performance: Vol.8. Erlbaum: Hillsdale, NJ. See Thomas (1987), op. cit., sec. I.C.2 for discussion.
32. S.M. Kosslyn: (1975), "Information Representation in Visual Images." Cognitive Psychology, 7, 341-370; (1976a), "Can Imagery be Distinguished from Other Forms of Mental Representation? Evidence from Studies of Information Retrieval Times." Memory and Cognition, 4, 291-297; (1976b), "Using Imagery to Retrieve Semantic Information: a Developmental Study." Child Development, 47, 434-444. See Thomas, (1987), op. cit., sec. I.C.5 for discussion.
33. I mean the evidence for 'mental rotation' (R.N. Shepard & L.A. Cooper, (1982), Mental Images and Their Transformations. MIT Press: Cambridge, MA), 'mental scanning' and the like (S.M. Kosslyn, (1980), Image and Mind. Harvard University Press: Cambridge, MA; C. Bundesen & A. Larsen, (1975), "Visual Transformation of Size." Journal of Experimental Psychology: Human Perception and Performance, 1, 214-220; A. Larsen & C. Bundesen, (1978), "Size Scaling in Visual Pattern Recognition." Journal of Experimental Psychology: Human Perception and Performance, 4, 1-20), and for involvement of common mechanisms in imagery and in perceptual processes (R.A. Finke, (1980), "Levels of Equivalence in Imagery and Perception." Psychological Review, 87, 113-132; R.A. Finke, (1985), "Theories Relating Mental Imagery to Perception." Psychological Bulletin, 98, 236-259; M.J. Farah, (1988), "Is Visual Imagery Really Visual? Overlooked Evidence from Neuropsychology.", Psychological Review, 95, 307-317). For more general reviews see: Thomas (1987), op. cit.; R.A. Finke, (1989), Principles of Mental Imagery. MIT Press: Cambridge, MA; Morris & Hampson, (1983), op. cit.; S.M. Kosslyn, (1983), Ghosts in the Mind's Machine: creating and using images in the brain. Norton: New York.
34. Richardson, 1980, op.cit., and see below. Doubts about dual coding theory are usually expressed as doubts about imagery, but the "dual coding/common coding" dispute with which we are here concerned should be disentangled from the better known "analog/propositional" dispute. It is possible to combine the view that mental images are computationally instantiated as propositional descriptions with a version of dual coding theory (see: G.W. Baylor, (1973), A Treatise on the Mind's Eye. Unpublished Ph.D. thesis, Carnegie-Mellon University (U.M. 72-12,699); D. Kieras, (1978), "Beyond Pictures and Words: alternative information-processing models for imagery effects in verbal memory." Psychological Bulletin, 85, 532-544). Dual Coding theory was initially developed against the background of a tradition which saw memory as entirely verbal. The dual/common coding dispute became effectively transformed into one between Dual Coding theory and the hypothesis of a single, amodal, 'propositional' representational system with the publication of J.R. Anderson & G.H. Bower, (1973), Human Associative Memory. Winston/Wiley: Washington D.C./New York. It is true that, although it drew much inspiration from Baylor, the main opening salvo in the analog/propositional debate (Z.W. Pylyshyn, (1973), "What the Mind's Eye Tells the Mind's Brain: a critique of Mental Imagery." Psychological Bulletin, 80, 1-25) also directed itself mainly against Paivio. However, Pylyshyn soon found more appropriate targets in Shepard and, especially, Kosslyn.
35. Paivio, (1983), op. cit.
36. E.g. A. Paivio: (1975), "Imagery and Synchronic Thinking." Canadian Psychological Review, 16, 147-163; (1983), "The Mind's Eye in Arts and Science." Poetics, 12, 1-18.
37. Paivio & Begg, (1981), op. cit.
38. W. Kintsch, (1977), Memory and Cognition. Wiley: New York.
39. O.J. Flanagan, Jr., (1984), The Science of the Mind. MIT Press: Cambridge, MA. Flanagan does not explicitly mention Paivio, but must have him (or some derivative view) in mind.
40. See Paivio, (1986), op. cit., p.58.
41. See Paivio, (1986), op. cit., p.73.
42. Hume in Selby-Bigge & Nidditch, (1978), op. cit., p.1.
43. A. Paivio, (1977), "Images, Propositions, and Knowledge." in J.M. Nicholas (ed.), Images, Perception and Knowledge. Reidel: Boston. Paivio (1986), op. cit., is openly critical of Kosslyn's, (1980), op. cit., "quasi-pictorial" theory of imagery.
44. For the rudiments of a non-pictorial account of imagery which yet does not collapse it into the linguistic or 'propositional' see U. Neisser, (1976), Cognition and Reality. Freeman: San Francisco. For more detail, critique and defense, arguments as to why this sort of theory is needed, and accounts and citations of other versions, see Thomas, (1987), op. cit.. On an early attempt at such a theory (and its somewhat unfortunate sequel) see Thomas, (1989), op. cit..
45. For the positive grounds, empirical and conceptual, why we should do this, and accounts of theories of imagery along these lines, see Thomas, (1987), op. cit.. On the weakness of the grounds for the Empiricist position see M.J. Morgan, (1977), Molyneux's Question: vision, touch and the philosophy of perception. Cambridge University Press: Cambridge. On the importance of non-visual aspects of our imagery experience see N. Newton, (1982), "Experience and Imagery." Southern Journal of Philosophy, 21, 475-487.
46. W.S. Hett (trans. & ed.), (1957), Aristotle VIII: On the Soul, Parva Naturalia, On Breath, Harvard University Press/Heinemann: Cambridge, MA/London. See: J.I. Beare, (1906), Greek Theories of Elementary Cognition: from Alcmaeon to Aristotle. Oxford University Press: Oxford; D.W.K. Modrak, (1987), Aristotle, the Power of Perception, University of Chicago Press: Chicago. Aristotle also used the 'wax impression' metaphor (De Memoria, 450a-b), however, and spoke of imagination as "a feeble sort of sensation" (Rhetorica, 1370a), so he can also be regarded as a forerunner of the Empiricist conception of imagery. M.C. Nussbaum, (1978), Aristotle's De Motu Animalium. Princeton University Press: Princeton, NJ (essay 5), argues that Aristotle's thought thus involves two, not entirely consistent, conceptions of imagination, and I fear we may still be living with the consequences of this today. See Thomas, (1987), op.cit., for additional discussion and secondary sources.
47. E.R. Harvey, (1975), The Inward Wits: Psychological Theory in the Middle Ages and the Renaissance, Warburg Institute, University of London: London; E. Clarke & K. Dewhurst, (1972), An Illustrated History of Brain Function, Sandford: Oxford; H. Caplan (trans. & ed.), (1930), Gianfrancesco Pico della Mirandola: "On the Imagination" (original Latin, c. 1500), Yale University Press: New Haven, CT. The notion survives in Descartes (1664, Treatise of Man: T.S. Hall (trans. & ed.), Harvard University Press: Cambridge, MA, 1972, p.86) and into the 18th century (see Z. Mayne, (1728), Two Dissertations Concerning Sense, and the Imagination, with an Essay on Consciousness. Tonson: London, p.70).
48. Baylor, (1973), op. cit.
49. S.M. Kosslyn, K.J. Holyoak & C.S. Huffman, (1976), "A Processing Approach to the Dual Coding Hypothesis." Journal of Experimental Psychology: Human Learning and Memory, 2, 223-233.
50. Paivio, (1986), op. cit. Paivio's allegiance to the term "code" may have encouraged such confusions. It is of little significance.
51. M. Marschark, C. Richman, J.C. Yuille & R.R. Hunt, (1987), "The Role of Imagery in Memory: on shared and distinctive information." Psychological Bulletin, 102, 28-41. Relevantly similar positions are taken by: J.R. Anderson, (1983), The Architecture of Cognition, Harvard University Press: Cambridge, MA; and Morris & Hampson, (1983), op. cit.. Interpretation is a little difficult here, as there is a good deal of ambiguity in the notion of 'propositional' representation. It may not always imply a Fodorean mentalese. The pioneering 'propositionalist' proposals of Anderson & Bower, 1973, op. cit., appear to envisage an English vocabulary, structured by an alternative, non-linear, syntax. Anderson's more recent position may well really be closer to that which I have ascribed to Kosslyn and to Baylor, and Baylor's may well be closer to Marschark's. "Propositional representation" is, in any case, an oxymoron in the original, philosophers' sense of "proposition" (see R.M. Gale, 1967, "Propositions, Judgements, Sentences and Statements." in P. Edwards (ed.), The Encyclopedia of Philosophy: Vol.6 Macmillan/Free Press: London/New York, pp.494-505). Interested philosophers have thus usually preferred the expression "sentential representation", but this fails to bring out the very syntactical differences which the original distinction between 'propositional' and natural language codes was meant to suggest. Sloman's expression, "Fregean representation" (A. Sloman, 1978, The Computer Revolution in Philosophy. Harvester: Hassocks) may be the most satisfactory suggestion, but it has not caught on.
52. In rejecting such a notion, Paivio, (1986), op. cit., p.58, mistakenly in my view, equates it with the Aristotelian conception of the 'common sense'. I think a coherent Dual Coding theory requires something like the 'common sense', and it is notable that it is on this same page that Paivio seems to make some ill-advised concessions toward 'six-code' theory.
53. Paivio, (1986), op. cit., p.73.
54. See Thomas, (1989), op. cit.
55. Paivio: (1971), op. cit.; (1986), op. cit. Cf. U. Neisser: (1970), "Visual Imagery as Process and as Experience." in J.S. Antrobus (ed.) Cognition and Affect, Little, Brown: Boston MA; (1972), "Changing Conceptions of Imagery." in P.W. Sheehan (ed.), The Function and Nature of Imagery, Academic Press: New York. Dennett's (1978, op. cit., chap. 10) arguments would also seem to lend weight to such a view.
56. Richardson, (1980), op. cit. He justifies this stance through an appeal to Wittgensteinian criteria, but the argument is not persuasive, and I doubt that Wittgenstein would have approved.
57. Dual Coding theory thus seems to provide a basis for the Freudian distinction between conscious and unconscious mentation. The alternative distinction, between inherently non-conscious computational processes and conscious epiphenomena would not appear to parallel Freud's, although I have no doubt that a more suitable distinction could be built into a computational cognitive model in an ad hoc manner. I hold no particular brief for psychoanalytic theory, or any of its progeny, but the basic Freudian distinction is surely plausible.
58. He succeeded in persuading himself, and then others, to disbelieve in mental imagery, despite his experience (see Thomas, (1989), op. cit.).
59. Paivio, (1971), op. cit. He also (1986, op. cit., p.57) countenances visual images of written words and haptic writing patterns in this connection. My own suspicion would be that vocal-kinaesthetic images of spoken words are generally the most important. It is quite conceivable that all or any of these alternative possible realizations of verbal representation may be operative to differing extents in different individuals at different times, and individuals may differ systematically in their employment of them. The empirical questions thus raised, however, have no bearing on our present argument.
60. Paivio, (1986), op. cit., pp.56-58. Since Paivio has always insisted that the two systems are richly interconnected and interacting, as we shall indeed need to insist, there appear to be no grounds here for a clear cut distinction between them (which our purposes will also require). On the other hand, if the distinction is allowed merely to be a matter of degree, then six-code theory cannot be decisively rejected, as no doubt there is more cohesion within each sensory mode than there is between modes, 'common sense' or no.
61. It can, of course, and often will, be used to represent weasels in general, rather than a particular weasel. This does not introduce any asymmetry between verbal and imaginal representations as they are currently being conceived. An image of a weasel can be used to represent weasels too.
62. No doubt some exist: in the site of neural realization, for example. Paivio, (1986), op. cit., and Paivio & Begg, (1981), op. cit., attempt to press the evidence on cerebral lateralization into service as support for dual coding theory. However, there is evidence of significant left hemisphere involvement in visual imaging, e.g.: M.J. Farah, (1984), "The Neurological Basis of Mental Imagery: a Componential Analysis." Cognition, 18, 245-272; S.M. Kosslyn, J.D. Holtzman, M.J. Farah & M.S. Gazzaniga, (1985), "A Computational Analysis of Mental Image Generation: Evidence from Functional Dissociations in Split-Brain Patients." Journal of Experimental Psychology: General, 114, 311-341; D.F. Marks, (1986), "The Neuropsychology of Imagery." in D.F. Marks (ed.), Theories of Image Formation. Brandon House: New York; G. Goldenberg, I. Podreka, K. Hoell & M. Steiner, (1986), "Changes of Cerebral Blood Flow Patterns Caused by Visual Imagery." in D.G. Russell, D.F. Marks & J.T.E. Richardson (eds.), Imagery 2. Human Performance Associates: Dunedin, New Zealand; S.M. Kosslyn, (1988), "Aspects of a Cognitive Neuroscience of Mental Imagery." Science, 240, 1621-1626; M.J. Farah, L.L. Weisberg, M. Monheit & F. Peronnet, (1989), "Brain Activity Underlying Mental Imagery: Event-related Potentials During Mental Image Generation." Journal of Cognitive Neuroscience, 1, 302-316; D.F. Marks, (1989), "On the Relationship Between Imagery, Body and Mind." in P.J. Hampson, D.F. Marks & J.T.E. Richardson (eds.), Imagery: Current Developments. Routledge, Chapman & Hall: London. On the other hand, this evidence will demand to be reinterpreted in the light of the major thesis of this paper, which entails that both consciousness of and manipulation of image representations necessarily involve verbal processes.
An example of a loss of conscious imagery (in all modes) after a left hemisphere lesion is, indeed, suggested by the investigators to be best attributable to a disconnection between the imagery and verbal systems: A. Basso, E. Bisiach & C. Luzzatti, (1980), "Loss of Mental Imagery: a case study." Neuropsychologia, 18, 435-442.
63. Either 'intrinsically', in some way, or derivatively from something non-representational, such as 'use'.
64. This would presumably be the position of Fodor, and of Marschark et al., op. cit., and those who think like them.
65. This is, in fact, essentially Paivio's view (1971, op. cit., 1986, op. cit.), and was the standard philosophical position before the present century. It too goes back to Aristotle (De Anima, 420b; De Interpretatione, 16a) and is thus entangled with the Aristotelian confusions mentioned above. Of course, it is in extremely bad odor amongst contemporary philosophers.
66. G. Kaufmann (1980, Imagery, Language and Cognition. Universitetsforlaget: Oslo; 1986, "The Conceptual Basis of Cognitive Imagery Models: a critique and a theory." in D.F. Marks (ed.), Theories of Image Formation. Brandon House: New York) combines such a view with a Wittgensteinian theory of linguistic meaning.
67. Dennett, (1978), op. cit., pp.101, 122.
68. His implicit solution to the problem of the consciousness of representations would seem to be the solution by fiat mentioned above. It is just of the nature of Humean ideas to be consciously experienced (by nobody). Gilbert Ryle (1949, The Concept of Mind. Hutchinson: London) is, rightly in my view, scathing about any such notion of "self-intimating" mental contents. However, rejecting a bad theory of a phenomenon does not make the phenomenon go away. Nor does the unsubstantiated insinuation that only philosophers and those foolish enough to listen to them ever experience it. Francis Galton (1883, Inquiries into Human Faculty and its Development. Macmillan: London) found people in "general society", including children, noticeably more willing to admit to experiencing imagery than were intellectuals (see Thomas, 1989, op. cit.).
69. I have no particular commitment to the view that the cognitive processes within each of the representational systems must be merely (or at all) associative. However, I have no other mechanisms to offer, and I think association may be enough. Anyway, it is enough to illustrate the rest of my argument. Perhaps, in order to account for syntactic structure, the verbal system needs to be more than associative. Indeed, following up this Chomskyan insight has been one of the major inspirations behind cognitivism. However, as I have already tried to argue, purely syntactic processes are not enough to account for thinking. Just as thought is not merely associative trains of images, neither is it just a sequence of merely syntactically-well-formed sentences. But, in any case, I am not entirely convinced that grammatical structure has to arise from processes within the verbal system, as conceived here. I would like to throw out the suggestion that it may depend on the fitting of verbal material into structural frameworks encoded in the imagery system (please recall that images need be neither visual nor actually conscious). This could be a recursive process, building words into phrases, phrases into sentences, and so on.
70. The term is from Ian Hacking, (1975), Why Does Language Matter to Philosophy? Cambridge University Press: Cambridge.
71. Recall that we have already rejected the notion of the mental image, even its visual aspect, as an internal picture. Imagination is not like looking at photographs, and a fortiori not like looking at still photographs.
72. Janssen, (1976a), op. cit., p.25; T.M. Graefe & M.J. Watkins, (1980), "Picture Rehearsal: an effect of selectively attending to pictures no longer in view." Journal of Experimental Psychology: Human Learning and Memory, 6, 156-162; M.J. Watkins, Z.F. Peynircioglu & D.J. Brems, (1984), "Pictorial Rehearsal." Memory and Cognition, 6, 553-557; Finke, (1989), op. cit., p.28.
73. E.g. De Anima, 434a. He had his doubts, however, about ants, bees and grubs (De Anima, 428a) - or, according to H. Lawson-Tancred (1986, (trans. & ed.), Aristotle: De Anima (On the Soul). Penguin: Harmondsworth) just about grubs.
74. See Descartes in Sutcliffe, (1968), op. cit., pp.74f.
75. The same point would apply to any complete, but bare, account of neural functioning we might obtain.
76. I think that they will only ever be found in robots, not 'sessile' computers or neural nets, but that is another story; that is to do with intentionality.
77. On the one hand, an account of the program and architecture, however generalized away from incidental details of realization, would not amount to a psychological theory (even of robot psychology); on the other hand, having the right architecture and running program would not be sufficient for having a mind. Nor, I think, is having a functional brain sufficient for mentality. Suitable interfacing with the external world is also necessary. Without this there would be no mental content (images or language), and without content there is no mind.