TheGrammarBolshevik wrote:(If the ridiculous interpretive shenanigans here hold, then we should be able to say that, since a computer only runs programs, running programs is sufficient for anything that any computer can do. And since every computer can run programs, every computer has the sufficient conditions for being able to do all of those things.)
This is both true and false: true for the computer, false for the user. That is the problem with the example above. It's about point of view.
As a thought experiment, the Turing Machine illuminates the workings of the CPU. Since, by definition, all aspects of the Turing Machine are finite, there is no ambiguity in the CPU. The CPU doesn't play Crysis; the CPU does Boolean operations, and that's all it does.
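To make that concrete (my own sketch, not anything from Searle or the original post): even integer addition, the most basic arithmetic a CPU performs, reduces to Boolean operations. Here is a ripple-carry adder built from nothing but AND, XOR, and OR:

```python
def add_bits(a: int, b: int, width: int = 8) -> int:
    """Add two unsigned integers using only bitwise Boolean operations.

    This mirrors the hardware: each bit position gets a full adder made
    of XOR (sum) and AND/OR (carry). No '+' operator is used.
    """
    result = 0
    carry = 0
    for i in range(width):
        x = (a >> i) & 1
        y = (b >> i) & 1
        s = x ^ y ^ carry                             # sum bit
        carry = (x & y) | (x & carry) | (y & carry)   # majority = carry out
        result |= s << i
    return result

print(add_bits(13, 29))  # 42
```

From the CPU's point of view there is no "addition", only these finite, explicit Boolean steps; "addition" is the user's interpretation of the output.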
This is the problem with Searle's experiment. Where he proposes to look, all he can see is Boolean operations, that is, the Rule Book. But this is a finite system. It is absolutely deterministic. The CPU, the operations it does, everything that happens in a digital computer, is absolutely explicit. The implication is that inside this process there is no error. All conditions are true or false. This is acknowledged in the truism of GIGO: garbage in, garbage out. A digital computer will do exactly what it is told to do, no more, no less. So Searle can never see the program, because the program is not finite. The program is a representation of the programmer. It can't be finite because the programmer isn't finite. The only time the program can return a reflection of the programmer's understanding, and therefore the program's understanding, is in the output, outside the CPU. In the Chinese Room experiment, the only person who can judge the meaning happening inside the Chinese Room is someone not in it. QED
As a practical example of the above, there is an area of expertise in mathematics called Numerical Analysis. One of the things it does is examine the way this transition between [finite] and [not finite] introduces errors. A CPU can only perform integer operations, for instance; all other operations involve differing degrees of error. It also demonstrates why errors can be so difficult to find: all errors are errors of logic, an imprecise understanding of what you are telling the computer to do. Feel free to hammer on this.
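A minimal illustration of the kind of error Numerical Analysis studies (my example, not from the post): the hardware cannot represent the decimal 0.1 exactly in finite binary, so repeatedly adding it drifts. The machine executes its instructions flawlessly; the error lives in the translation from our decimal intent to its finite representation:

```python
from fractions import Fraction

# Ten additions of 0.1 in binary floating point: each 0.1 is already a
# rounded approximation, and the rounding errors accumulate.
total = sum(0.1 for _ in range(10))
print(total == 1.0)   # False
print(total)          # 0.9999999999999999

# The same sum in exact rational arithmetic has no error at all:
# the "garbage" was in what we told the machine, not in its execution.
exact = sum(Fraction(1, 10) for _ in range(10))
print(exact == 1)     # True
```

The computer did exactly what it was told, no more, no less; the judgment that the first answer is "wrong" can only be made from outside, by someone who knows what the sum was supposed to mean.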