Damasio and the Self

Pop fears about the AI apocalypse often feature the machine suddenly and inexplicably "coming to life." The "lights go on," and all of a sudden, the machine supposedly feels like us, has a consciousness like us and (therefore) starts to exhibit all of our worst attributes. In particular, it "wants" to live and suddenly "sees" us as a threat.

All of this plays on thousands of years of confusion about what it is to be human in the first place. What is it to "feel alive"? What is consciousness? What is "self"?

I think that thousands of years of superstition and religious "theology" shed no light at all on these questions. If you are one of those looking for a detailed treatment of the "soul" as distinct from the body, I wish you good luck. As for me, I see no reason to invent a separate realm of reality to house my mind - even as I admit that it is difficult to say exactly how a slab of meat between my ears creates the feeling we all know as "being here." That said, there is a lot we can say about these matters, and I tend to listen most carefully to actual brain scientists before the philosophers.

My favorite brain scientist is Antonio Damasio, and the best way to get into his way of thinking is his most recent book (at the time of writing), "The Strange Order of Things." I commend this book to the reader's attention. What follows is my own, no doubt imperfect, summary of his ideas. He has a lot to say about the issue at hand:

  • The "feeling" of "being here" is, at the very least, a feeling. What is a feeling? In short, to have a feeling, an organism must have an internal model of itself plus a model of the outside world. The organism need not be very complicated to meet this threshold. A feeling abstracts raw sensation and connects sensation to action in an indirect way. An organism that responds directly to sensation (say, sunlight) without any intermediation won't have feelings. For our purposes, we note that "feelings" are a very ancient attribute of our branch of the tree of life. It is conceivable that a machine may have a model of itself (although this is rarely the case in state-of-the-art AI), but it is debatable whether this would produce "feelings" of any kind, let alone the feeling of "being here." The problem with machines is that they lack a billion years of evolutionary experience as organisms in the world. Can this be "taught"?
  • A popular survival strategy of living things is to exist in nested hierarchies. Beehives, schools of fish and the multi-cellular organism are examples. The behavior of the organism tends to support the survival of the individual organism, but also the survival of the cells making up the organism's body and the survival of the "herd." In fact, many organisms, including humans, are rarely found in nature by themselves and are not equipped to live for long by their own wits alone. If we watch a flock of birds, a school of fish or a swarm of bees, we are struck by the sense that the flock itself is alive in some sense. Working from the bottom up, we are struck by the sense that we can look after the health of the billions of cells that make up our bodies by actions that make no sense at the cellular level. It would seem odd to think that this setup is somehow unique to humans. What we perceive as a "self" is a conglomerate that operates far below our awareness. In fact, one may think of our brains as presenting us with only the most simplified version of our true nature, unless some kind of emergency requires action at the "meta" level.
  • For me, Damasio's ideas suggest that our everyday experience of the "self" is misleading. We communicate with the group in ways that show us only "our end" of the communication, missing the way all these communications add up to very definite group behavior.
Many philosophers seem to insist that the "self" is either a phenomenon of the body or something on another plane entirely. Neither is true. Our perception of self easily transfers from our own body to the social situation we find ourselves in. It is perhaps a linguistic accident that we regard "I" as more fundamental than "we." It is only when we are near death that we perceive that our organs and cells are shutting down on their own schedules. The current rule is: when your brain is dead, you are dead. We rarely consider the complex system of cells that keeps us alive under "normal circumstances" by keeping itself alive. My favorite "meditation" on this subject is to think of my brain cells (neurons) as not that much different from any single-celled animal, except that they have found a way to survive comfortably in my head, producing the ideas that I put on this page as a byproduct of their metabolism. What do these cells do to justify the expense of their upkeep? They have feelings. They respond to stimuli by sending signals. This is not mere "wiring." There is a complex, indirect model that specifies the details of this reaction. The model - the "program" - is written in the DNA of every cell.
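Damasio's threshold for feeling - an internal model of the organism plus a model of the world, mediating indirectly between sensation and action - can be caricatured in a few lines of code. Everything below (the class names, the energy and light variables, the thresholds) is invented purely for illustration; no claim is made that either toy agent actually feels anything. The point is only the structural difference between a direct stimulus-response mapping and a model-mediated one.

```python
class ReflexAgent:
    """Responds directly to a stimulus - no internal model, so, on
    Damasio's account, no 'feeling'."""
    def act(self, light_level: float) -> str:
        # Direct mapping from sensation to action, like a plant
        # turning toward sunlight.
        return "approach" if light_level > 0.5 else "stay"


class ModeledAgent:
    """Keeps a model of its own state and a model of the world, and
    lets those models mediate between sensation and action."""
    def __init__(self):
        self.energy = 1.0          # self-model: own bodily state
        self.expected_light = 0.5  # world-model: what the world usually looks like

    def act(self, light_level: float) -> str:
        # Sensation first updates the world-model rather than
        # triggering action directly.
        surprise = light_level - self.expected_light
        self.expected_light += 0.1 * surprise
        self.energy -= 0.05        # living costs energy
        # Action depends on the relation between self-model and
        # world-model, not on the raw stimulus alone.
        if self.energy < 0.5 and light_level > self.expected_light:
            return "approach"      # seek energy where the model says it is
        return "stay"
```

Given identical bright light, the reflex agent always approaches, while the modeled agent stays put until its self-model registers an energy deficit - the same stimulus, intermediated differently.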

To be a "self," I require a body. But I need more than a body. I need a body that fits into a specific environment which includes other "bodies." Thinking of "mind" as something that happens in my head is not wrong; it's just incomplete. I have a lot more to say on this subject in a previous blog post.
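The flock-of-birds intuition above has a well-known computational analogue: Vicsek- and boids-style models, in which each agent follows only local rules, yet the group acquires a measurable collective order that no individual encodes. The sketch below is a toy under invented parameters (radius, turn rate, step size are all arbitrary choices for illustration):

```python
import math
import random

random.seed(0)

def step(birds, radius=0.3, turn=0.2):
    """One update: each bird nudges its heading toward the average
    heading of neighbors within `radius`. Purely local rules."""
    new = []
    for (x, y, theta) in birds:
        headings = [t for (nx, ny, t) in birds
                    if (nx - x) ** 2 + (ny - y) ** 2 < radius ** 2]
        mean = math.atan2(sum(math.sin(t) for t in headings),
                          sum(math.cos(t) for t in headings))
        theta += turn * math.sin(mean - theta)   # turn toward local average
        new.append(((x + 0.05 * math.cos(theta)) % 1.0,   # move, wrapping
                    (y + 0.05 * math.sin(theta)) % 1.0,   # around a unit square
                    theta))
    return new

def order(birds):
    """Group-level order parameter: 1.0 when all headings agree,
    near 0 when they point every which way."""
    n = len(birds)
    return math.hypot(sum(math.cos(t) for (_, _, t) in birds) / n,
                      sum(math.sin(t) for (_, _, t) in birds) / n)

# Fifty birds with random positions and headings.
birds = [(random.random(), random.random(),
          random.uniform(-math.pi, math.pi)) for _ in range(50)]
before = order(birds)
for _ in range(200):
    birds = step(birds)
after = order(birds)
```

After a couple hundred steps the order parameter rises well above its random starting value: the flock "comes alive" at the group level even though no bird knows anything beyond its immediate neighbors.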

So how do we endow a machine with a "mind"? Simple. We find a way for existing human minds to participate, in a structured, meaningful way, in creating a "higher" mind - a "supermind." This is nothing new, and it has nothing to do with "technology." It's the way human society has always worked. In fact, it is the reason anything like human society exists at all.
