Finally added an about page: Assumptions: Searle is right when he says symbols are semantically vacant (they have no meaning in themselves and are not about anything per se); hence symbols emitted by digital sensors carry with them no indication of what was sensed; digital computers of the sort we have today will one day perceive, think, […]
Posts in category Chinese room argument
Why is the Chinese room not a compute...
The Chinese room is supposed to be a computer trying to be a brain (one that understands Chinese). Searle argues that, because of the fundamental nature of the computer as presented in his Chinese room thought experiment, no computer could think. There are two core conclusions in this argument, and one can get to the second […]
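The room-as-computer picture can be sketched in a few lines: an operator who follows a rulebook mapping input symbol strings to output symbol strings, consulting only the shapes of the symbols. This is only an illustrative toy (the rulebook entries below are hypothetical placeholders, not real Chinese dialogue, and `operate` is my own name for the operator's procedure):

```python
# A toy "Chinese room": the operator (this function) maps input symbol
# strings to output symbol strings purely by rulebook lookup.
# The entries are hypothetical placeholders, not a real conversation.
RULEBOOK = {
    "你好吗": "我很好",  # rule: when this shape comes in, pass that shape out
    "你是谁": "我是人",
}

def operate(input_symbols: str) -> str:
    """Follow the rulebook; only shape is consulted, never meaning."""
    return RULEBOOK.get(input_symbols, "不知道")

print(operate("你好吗"))  # a fluent-looking reply, produced without understanding
```

Nothing in `operate` has access to what any symbol means, which is the feature Searle's argument turns on.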
The Turing test tests language compre...
The common AI view says that if a machine passes the Turing test then it can reasonably be said to have human-like intelligence. But it might pay to be more specific. The Turing test: the Turing test (TT), described by Turing in 1950, tests language comprehension. An interrogator asks hidden contestants questions. The interrogator […]
The Turing test is asymmetrical
Starting from the position that symbols are tokenized shapes that have meanings (i.e., symbols as per the Chinese room argument), the assumption is that computers do not process symbols. What implications does this have for the Turing test? Typical description of the Turing test: a man, the Interrogator, sits in a room at a desk […]
Chinese room argument (alleged) rebut...
I’ve just published my page on John R. Searle’s Chinese room argument (CRA). A gazillion enthusiasts have offered (alleged) rebuttals of the CRA. Yet despite so much commentary, partly or fully for or against, the jury is still out on the CRA. The different sides (mainly AI and those-other-than-AI) have retreated to entrenched positions. A […]
Why doesn’t the Chinese room le...
This, below, is my latest post on philosophy.stackexchange. I’m still trying to generate a conversation about why there is no structure in the Chinese room. The lack of structural elements in the room’s ontology is, in my view, the key error in Searle’s picture of the Chinese room. I just can’t see how John […]
Do relationships rebut the Chinese ro...
Searle says syntax is neither sufficient for nor constitutive of semantics; all a computer gets (e.g. from sensors) is syntax (tokenised shapes); therefore computers will never understand the world. Searle: “There is no way to get from syntax to semantics” (Minds, Brains and Science, p. 34); “digital computers insofar as they are computers have, by definition, […]
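The "all a computer gets from sensors is syntax" step can be made concrete with a small sketch. The point is that two quite different causes in the world can produce bit-for-bit identical tokens, so the token itself carries no trace of what was sensed. This is my own illustration under assumed names (`digitise` and the sunset/lamp scenario are hypothetical, not from the post):

```python
# Sketch of the claim that sensors deliver only syntax: a digital light
# sensor quantises an analogue reading into an 8-bit token (a shape).
# A sunset and a lamp at the same brightness yield identical tokens;
# nothing in the token says which one was sensed.
def digitise(reading: float) -> str:
    """Quantise a 0.0-1.0 analogue reading into an 8-bit binary token."""
    level = max(0, min(255, int(reading * 255)))
    return format(level, "08b")

sunset_token = digitise(0.72)  # light from a sunset
lamp_token = digitise(0.72)    # light from a desk lamp
print(sunset_token, lamp_token, sunset_token == lamp_token)
```

Identical shapes, different causes: whatever semantics the reading had in the world is not in the token the computer receives.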
Is the Chinese room a computer?
More from philosophy.stackexchange.com. John Forkosh commented on my question, Does it make sense to define a computer as a symbol-manipulating device? My response to his comment was: Thanks John. When you say: 1. SYMBOLS. “…for concreteness, let’s please do away with this unnecessarily vague “voltage level” terminology, which you’ve used here and in preceding comments […]
Does the Chinese room need 2 symbol-p...
I’ve been posting questions on philosophy.stackexchange.com about the Chinese room. The replies have been great. One of my questions was: Does it make sense to define a computer as a symbol-manipulating device? A comment by Phil_132 suggested replacing the program with wiring. (The Chinese room is supposed to be an electronic computer trying to be […]