Philosophical Questions - 2020-01-12

This is not just about technology.

As one who has spent a full career in "Information Technology", I can testify that "IT" is directly or indirectly about language. New languages and new development environments come and go, but they all boil down to the interpretation of text.

As Mitchell's book teaches us, even "deep learning" models that "recognize" faces ultimately depend on masses of text - even "languages" whose vocabulary is nothing but ones and zeros.

This touches on a problem that has been recognized for thousands of years, most notably by the Buddha and Lao Tzu (founders of Buddhism and Taoism respectively) and more recently by Ludwig Wittgenstein: "language", no matter how clever or deep, only stands for reality. It is about reality, but it is not reality itself. You might say this points to a fundamental problem we share with machines - "software" can never be more than a "description" of reality. Unlike machines, however, our knowledge of the world rests on the fact that we are flesh-and-blood participants in it. Billions of years of evolution have made us objects in the world that react and learn in ways that go far beyond explicit knowledge. To cite a simple example, hearts beat perfectly well for millions of years before anyone discovered the circulation of blood. The same goes for our "understanding" of the world, most of which seems to take place "under the surface".

Mitchell gives us a snapshot history of the futile effort to make a list of everything a human "knows" by "common sense". The effort mostly demonstrates the sad fact that computer programmers should read more philosophy. To mention just one example, Hofstadter's "Surfaces and Essences" sketches out the way that a virtually infinite store of "knowledge" can be spun out from the sparse base of human experience by analogy - the engine of thought. Computers have to be "taught" to recognize analogies - even something as basic as whether there is a car in a picture - and they are terrible at it. What you get, in the end, is a clunky attempt by the computer to "learn", from examples labeled by thousands of people, to "recognize" categories it is utterly unable to recognize on its own.

It's even worse than that. As philosophers endlessly point out, human categories are somewhat arbitrary - suited to the unique situation in which humans find themselves in the world. A chair is only a chair if you are a human who knows about sitting.
