Page 97 - Encyclopedia of Philosophy of Language

 scheme to encode the relevant information. Depending on the mapmaker's skill, the map may or may not be easy to use. Cognitive scientists have to assess the purpose of a piece of mental apparatus and to postulate a representational scheme that, together with processes to operate on it, satisfies that purpose. Then they must try to find evidence that that scheme is used. This process is a complex one, not only because cognitive scientists cannot look and see what the elements of the representational scheme are, but also because they have to make inferences about processes as well as representations. The representational scheme needed to perform a task depends on what processes act upon the representations allowed by the scheme.
The philosopher Fred Dretske (1988) contrasts mental representations with maps by classifying them as a type, indeed the most important type, of natural representation system. He claims that natural representation systems are the source of intentionality in the world. Intentionality is the 'aboutness' which philosophers take to be a defining characteristic of mental phenomena. A map is 'about' the terrain it represents, but only derivatively. Its aboutness derives from the fact that people interpret it as being about a certain part of the world. The aboutness of mental representations is not derivative, and it is for this reason that mental representations are so important and so difficult to understand. It is also for this reason that the study of maps can only take us so far in understanding the concept of mental representation.
Dretske analyzes the notion of representation in terms of indication. One thing indicates another if its occurrence provides information about what it indicates. Because of the rich correlational structure of the world, there are many instances of indication. A bear's paw prints in the snow indicate that a bear has passed this way. The ringing of a door bell indicates that someone is at the door (and also, for example, that current is flowing in the door bell's electric circuit). Certain patterns of activity in a person's visual cortex indicate that they have seen a chair (to anyone or anything that can register them). For Dretske, there is no such thing as misindication: if a sign carries information at all, it carries correct information. If signs are misinterpreted, they are being used as representations.
An indicator becomes a representation if it is given the function of indicating the state of something else. Now misrepresentation is possible. If a car's fuel gauge jams, it does not really indicate that the tank is full. But since it has been given the function of indicating how much fuel is in the tank, it misrepresents how full it is. Misrepresentation can be a nuisance, or worse. However, the possibility of misrepresentation goes hand in hand with a very useful property of representational schemes: their elements can be recombined at will. Maps of imaginary countries can be drawn. I can mentally represent not only what the world is like, but how I want it to be, how I think you falsely believe it to be, and so on. Thus, although natural systems of representation derive from natural indicators of things in the real world, particular representations can be decoupled from the world. They need not be caused by what they represent.
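Dretske's distinction between mere indication and representation can be made concrete with a minimal sketch (an illustration added here, not part of Dretske's own text; the class and method names are hypothetical). A working gauge's reading simply covaries with the fuel level; it is the assigned function of indicating that level which makes a stuck needle count as a misrepresentation rather than as nothing at all.

```python
# Illustrative sketch of Dretske's indicator/representation distinction,
# using his fuel-gauge example. Names here are invented for exposition.

class FuelGauge:
    """A gauge that has been given the FUNCTION of indicating fuel level."""

    def __init__(self):
        self.jammed_at = None  # if set, the needle is stuck at this value

    def reading(self, actual_level):
        # The indicator state: what the needle actually shows.
        return self.jammed_at if self.jammed_at is not None else actual_level

    def misrepresents(self, actual_level):
        # Misrepresentation is possible only because the gauge has the
        # function of indicating fuel level: a stuck needle still "says"
        # something about the tank, and says it falsely.
        return self.reading(actual_level) != actual_level


gauge = FuelGauge()
print(gauge.misrepresents(0.5))   # False: a working gauge indicates correctly

gauge.jammed_at = 1.0             # the needle jams on "full"
print(gauge.misrepresents(0.5))   # True: it now misrepresents the tank as full
```

The sketch also captures the decoupling point: the jammed reading is no longer caused by the fuel level it is about.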
2. Neural Substrates and Connectionism
Cognitive scientists assume that the mind is a mechanism, in the very general Turing machine sense, and that its physical substrate is the brain. Thus, every mental state is associated with a corresponding brain state. If that mental state is a complex representational one, each element of the representation is associated with some aspect of that brain state. For most of our cognitive abilities, no more can be said at present. It is not even known whether equivalent mental states are always associated with the same brain state. And even when a good deal is known about the underlying neural substrate (as in the case of low-level visual processing, for example), it has been argued (e.g., by David Marr 1982) that questions about representational schemes and the processes that act on them can often be addressed independently of questions about neural substrates, via an information processing analysis of the relevant ability.
It is, of course, possible to take a purely functionalist approach to cognition in general and to mental representations in particular. Functionalism holds that the correct, or best, theory of a particular mental ability is the one that best explains the psychological data. The mental representations people use are the ones postulated in that theory. On one interpretation this view is vacuous, because the decision about which explanation is best may be influenced by non-psychological factors, such as compatibility with what is known about brain structure. On another interpretation functionalism is a substantive, though almost certainly false, doctrine. On this interpretation, considerations about brain structure are irrelevant to choosing the best psychological theory.
Since about 1980 the substantive version of functionalism has been challenged by people working in the parallel distributed processing (PDP) or connectionist framework. Connectionists attempt to reproduce human behavior using networks of simple processing elements whose properties resemble those of brain cells or clusters of them. The behavior of a connectionist machine may suggest that it is following a set of rules (couched in a language of thought). However, nothing in the machine corresponds to the rules in the way that a piece of code in a traditional computer model of the mind does.
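The point that a connectionist machine can behave as if it follows a rule without any symbolic statement of that rule appearing in it can be illustrated with a minimal sketch (added here for exposition; the function name and weight values are invented). A single threshold unit with hand-chosen weights reproduces the logical rule AND, yet the rule exists nowhere in the code as a structure; the "knowledge" is distributed across the numeric weights.

```python
# Illustrative sketch: a single connectionist processing element.
# Its input/output behavior matches the rule "output 1 iff both
# inputs are 1" (logical AND), but no symbolic encoding of that
# rule appears anywhere; only weights and a threshold do.

def unit(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs exceeds the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total > threshold else 0


# Hand-chosen weights under which the unit behaves like AND:
# only two active inputs push the sum (1.2) over the threshold (1.0).
weights = [0.6, 0.6]
threshold = 1.0

for a in (0, 1):
    for b in (0, 1):
        print((a, b), unit([a, b], weights, threshold))
```

Contrast this with a traditional program, where a line such as `return a and b` would itself be the rule; here the same behavior emerges from the arithmetic of the weights.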
The correct interpretation of connectionist models has been a matter of intense debate. It is known that connectionist machines can simulate traditional serial computers (von Neumann machines), just as von Neumann machines can simulate connectionist machines. However, connectionist machines as they are used in
Representation, Mental