
This is a talk by John Searle, one of the "grey beards" in the AI field. It is remarkable that this was delivered in 2015 and is now charmingly obsolete. "Nobody has begun to think ..." His reliance on the idea that computers are all Turing machines is dated. He is over-confident in his answers to the "young Turks," who turn out to be building precisely what he thinks is impossible. Like other philosophers, he relies on a "101" understanding of a computer. For example, he highlights the idea that you need to be conscious to frame a question.

Something it "feels like" is the essence - as in Damasio.

Philosophy ...

Inventor of some ideas popularized by Hofstadter.

His crucial element is "semantics" over "syntax," with the claim that computers can never do anything more than play with syntax (language). I would say that this misunderstands the nature of language; Hofstadter describes language as an analogy engine.

Some hints about my notation ideas - an "as if" notation for what a machine does.

For most purposes, it doesn't matter, except when we attribute consciousness, which is not itself observable ... intrinsic.

I think it's complex motivated reasoning - about definitions and words - typical of philosophy in general. Examples of dogs' consciousness rely on this. A background presupposition.

He notes that our "apparatus" for discussing these issues is obsolete. Indeed! Useful to note that humans can be (and once were) described as computers. A lovely example of a falling object computing acceleration.

His arguments are similar to the "God of the gaps."

Computation is a matter of interpretation -- the eye of the beholder.
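
To make that point concrete, here is a minimal sketch - my own illustration in Python, not anything from Searle - of how the same physical process only counts as a computation once an observer supplies the interpretation:

```python
# Standard kinematics: a dropped object's position over time (g ~ 9.8 m/s^2).
G = 9.8

def falling_positions(duration: float, dt: float = 0.1):
    """The physical facts: where the object is at each sampled instant."""
    steps = int(duration / dt) + 1
    return [(i * dt, 0.5 * G * (i * dt) ** 2) for i in range(steps)]

def read_as_computation(trajectory):
    """The interpretation: treat each (t, s) pair as the object 'computing' s = 1/2 g t^2."""
    return {round(t, 1): s for t, s in trajectory}

table = read_as_computation(falling_positions(2.0))
print(table[1.0])  # ~4.9, but calling this a 'computation' is the observer's attribution
```

The trajectory itself is just physics; it becomes a "computed function" only relative to the mapping an observer chooses, which is the sense in which computation is in the eye of the beholder.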

Kurzweil's comments unravel the problems with language. The lovely answer turns on the difference between the brain's causal framework as we describe it and the way the brain's causal network actually works.

NOTES ON "MIND, LANGUAGE AND SOCIETY"

Belongs to the school of "external realism."

"Background presuppositions."

The only fact that could make them into a mental state is that they are, in principle, capable of causing that state in a conscious form.

Searle, John R. Mind, Language And Society (Masterminds) (p. 86). Basic Books. Kindle Edition. 

How about 4321? Is it, in principle, capable of causing a state in a conscious mind by, say, being the result of a calculation? The same could be said of the sound of a mosquito flying, which could potentially be "noticed" and cause a state of mind. I don't think he handles "potentialities" correctly, and this goes very deep - down to the "potential" of a neuron to fire (1,000 brains). Physical "potential" plays a huge role in consciousness. The entire environment (umwelt) is full of "potential" entities, and there are an infinite number of them. To some extent, the "contents" of the "mind" are constructed "on the fly."

JS talks about mapping mental states to things in the world but doesn't use the mathematical concept of mapping. A shame. This leads to an unnecessarily complicated discussion of "conditions of satisfaction."
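
For what it's worth, here is a minimal sketch of that mathematical reading - my own, in Python, with hypothetical names rather than anything from Searle - in which an intentional state simply maps to its condition of satisfaction, and is satisfied when the world makes that condition true:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class IntentionalState:
    """Hypothetical illustration: a state paired with its condition of satisfaction."""
    description: str
    condition_of_satisfaction: Callable[[dict], bool]  # a predicate over a 'world'

def is_satisfied(state: IntentionalState, world: dict) -> bool:
    """The state is satisfied exactly when the world meets its condition."""
    return state.condition_of_satisfaction(world)

# Example: the belief "it is raining" maps to the condition that the world contains rain.
belief = IntentionalState("it is raining", lambda w: w.get("raining", False))
print(is_satisfied(belief, {"raining": True}))   # True
print(is_satisfied(belief, {"raining": False}))  # False
```

On this reading, "conditions of satisfaction" are just the values of the mapping.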

It is unclear what JS means by a mental "state," particularly an "intentional" state. What are other states? How does one state follow another?

I got totally bogged down and fogged out by his treatment of society, language, and meaning. I suppose these are not sufficiently mysterious to me. I was disappointed that he did not tell me something about how groups of people exhibit (or not) qualities of consciousness.

We can summarize these points in the following propositions. 

  • Consciousness consists of inner, qualitative, subjective states and processes. It has therefore a first-person ontology.
  • Because it has a first-person ontology, consciousness cannot be reduced to third-person phenomena in the way that is typical of other natural phenomena such as heat, liquidity, or solidity. 
  • Consciousness is, above all, a biological phenomenon. Conscious processes are biological processes. 
  • Conscious processes are caused by lower-level neuronal processes in the brain. 
  • Consciousness consists of higher-level processes realized in the structure of the brain. 
  • There is, as far as we know, no reason in principle why we could not build an artificial brain that also causes and realizes consciousness. 

But that is it.

Searle, John R. Mind, Language And Society (Masterminds) (p. 53). Basic Books. Kindle Edition. 

If we accept this description of consciousness, a group cannot be conscious. We are left with a suspicion that ideas like intentionality could still apply to a group. He identifies elements essential to consciousness, such as memory and continuity, that may apply to a group.

Searle's definition of consciousness seems limited to our own kind of experience, not the (perhaps) wider idea of sentience. Here is an aggressive idea of what sentience is and where we can find it.

In any case, it seems like Damasio, rather than Searle, is our guide to the "hive mind."
