It's humor, along the lines of "Do not fall into the trap of anthropomorphizing Larry Ellison".
My point is that humans are not quite as special as we like to think. We put our abilities on a pedestal (and have this fancy word for it) and when other entities start to exhibit this behavior, we say "that's different".
The obvious retort to Searle is that "the room understands Chinese". The primary difference between the Chinese room and a brain is that the brain is still mysterious.
What I was getting at, more than anything, is that, as Searle pointed out, you can't necessarily infer a black box's internal mechanisms from its outwardly observable behavior.
Searle was specifically criticizing the Turing test as inadequate. I don't follow him as far as the claim that this refutes the idea that human minds are a product of computational systems; to me that relies on an untenable definition of "computation". But the weaker conclusion, that you can't simply assume two completely different systems exhibiting the same observable behavior must use the same mechanism to achieve it, does strike me as compelling.
Thinking that the way human brains do certain things is the only way it can be done strikes me as far more human-exceptionalist than the idea that human-like intelligence might not be the only form of intelligence, or the idea that anything that can accomplish a task humans use intelligence for must itself be intelligent. Intelligence (depending on how you define it - I'm not sure I want to open that can of worms - but let's assume it involves some form of "strange loop" sentience, since that's what most folks tacitly seem to be after) might itself be overrated.
> But the weaker conclusion that you can't simply assume that two completely different systems that exhibit the same observable behavior must use the same mechanism to achieve that behavior does strike me as compelling.
Functionalists claim the internals don't matter: if a system exhibits behavior practically indistinguishable from human intelligence, then it has human intelligence.
In fact, the whole reason Searle was brought up in this discussion at all is, ironically, that current SoTA LLMs fail at tasks thought to be trivial for humans.