Minds and Machines
Wednesday, 02-22-12: Robot Intentionality III: Dretske's Response
- Fred Dretske, "If You Can't Make One, You Don't Know How It Works"
- Fred Dretske, "Minds, Machines, and Money: What Really Explains Behavior"
- John Searle, "Is the Brain a Digital Computer?" (from Monday's lecture)
We began today by revisiting Boden's reply to Searle's Chinese Room Thought Experiment to set the stage for Searle's subsequent analysis of the brain. Even if we do not endorse Boden's argument that the Chinese Room qua Robot exhibits some minimal intentionality, it is harder to dismiss her challenge to Searle's position that the brain is an organ which has evolved to realize intentional states. Thus, for Searle, biochemical processes underwrite intentionality in a way that the rule-governed manipulation of strings of symbols cannot.
Boden's challenge to this view is that what is important about the brain is not its biochemical processes per se but what those biochemical processes do in transmitting information. In a manner of speaking, Searle recapitulates the Type-Physicalist's error (at least, according to Putnam's Multiple Realizability Argument). If intentionality is the mark of the mental, as Brentano reminds us, then restricting intentional states to our peculiar biochemical processes restricts minds to only those things that enjoy a similar composition.
Searle, naturally, wants to meet the charge of excessive anthropocentrism. To do this he carefully clarifies and defends his position; see our notes for more on his argument. The broad outline is that syntax is not intrinsic to physics; it is only read into physics by observers, who confer at most derived intentionality by way of their own original intentionality. Yet if syntax is not intrinsic to physics, then syntax enjoys no causal powers beyond those conferred by observers. As a result, the rule-governed manipulation of strings of symbols, in whatever physical way it is realized, cannot bear the causal relationships it must to exhibit original intentionality. Viewing the brain computationally, then, is only possible if one freely commits the homunculus fallacy--in effect begging the question by hypothesizing intentionality-rich sub-minds. Further, while we can view the brain as a computer in the same sense in which nearly any physical process can be harnessed to suit computational needs, it is not intrinsically an information processor, which it would have to be to cash the check machine functionalism has written.
Having spelled out Searle's response, we next turned to the task of laying the groundwork for understanding Dretske's answer to the Chinese Room Thought Experiment. In particular, we distinguished between 'intention'--as in, acting with a goal or purpose--'intentionality'--as in, the aboutness or directedness sentences and mental states enjoy--and 'intensionality', which simply refers to the failure of substitutivity of co-referring terms or co-extensional predicates salva veritate. Granted, intensionality-with-an-s is a technical notion. We will revisit the point next time and learn why Dretske thinks intentionality-with-a-t is actually pretty cheap--about $1.95 at the hardware store.
Let me close by saying that I'm increasingly disappointed in attendance. For more on this, please review the synopsis from last time. Let me just say at this point that the correlation between students who fail this course and students who miss a lot of class is strikingly high--just as high, indeed, as the correlation between students who pass the class and students who attend regularly. Maybe I should take attendance and give points or something. My usual stance is that you're all adults and are fully capable of making adult decisions about attending.
I begin to wonder if I'm mistaken in thinking this.