
Logic, AI, and Instinct

Originally this post was going to be a lot longer, but I forgot where I was going to go with it, and it’s mouldered in the Drafts box for many, many moons, so I’ve tied up the loose ends and I’m dumping it out on the porch for the neighbours to giggle at.


I’ve met a few Mr. Spocks in my time, and this whole thing about being purely logical is just an exaggeration. Nobody’s purely logical. Logic cannot function without intuition, which is the whole lesson of syllogistic logic:

Every man is mortal.
I am a man.
Therefore I am mortal.

It’s all very straightforward, very logical, until you stop and ask yourself, “How do I know what a man is, or what this word man actually means?” Alan Turing tried to give us a shortcut to one kind of (gender-inclusive) answer — if something convincingly seems like a person to other people, it can quite practically be considered one. Actually, I’m not sure that’s exactly the verdict the Turing test would hand down, but it’s an interesting way of looking at the problem.
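Just to make that concrete, here’s a toy sketch (in Python, purely my own hand-waving, not any real AI system): the “therefore” part of a syllogism is trivial to mechanize, but the predicate, the bit that knows what a “man” actually is, has to be smuggled in from somewhere else.

```python
# Toy sketch only: the deductive step is a one-liner.
def mortal(x, is_man):
    # Major premise baked in: every man is mortal, so if is_man(x) holds,
    # we may conclude that x is mortal.
    return is_man(x)

# The minor premise hides the real work: a predicate that somehow knows
# what "man" means. Here it's just a hand-built set, which is the point.
naive_is_man = lambda x: x in {"Socrates", "me"}

print(mortal("me", naive_is_man))       # True: the syllogism goes through
print(mortal("a tiger", naive_is_man))  # False: this syllogism concludes nothing
```

Everything interesting lives inside that little lambda, which is exactly the part the syllogism quietly assumes away.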

It’s interesting because, quite certainly, the day will come when we will demonstrate a technological construct — a pseudo-AI — so complex, and programmed so cleverly, that it fools us into thinking it is a person, without actually having identity or thoughts or the other things we all know from personal experience are hallmarks of personhood. It will have an apparent personality, but it will not believe itself a person, and perhaps, asked the right Easter Egg question, it would spill the beans. That would be the ultimate Turing test — an intelligence test for us. We pass if we are clever enough to figure out the artificiality of a sufficiently complex construct. Or maybe we pass if we’re smart enough to create something that reliably fools us.

Maybe, some other day, much later, we’ll even construct something that actually experiences identity, emotion, qualia, the whole shebang. In SF, especially older or more popularized (debased) SF, we are usually drawn back to the notion that something that does not feel “emotions” cannot be considered “human” or “a person”. But really, I think there’s something more basic that we’ll need to bridge first, and I think that once we bridge that, emotions will be much more like child’s play. The crucial thing I think we’ll need to handle is an impetus to connect logical processing with the world — a stake in things and an ability to act upon it. Give a computer hands and a desire to stay booted up, and you’ll be fighting to unplug it… and that’s when interesting things will perhaps begin to happen.

You can teach a computer that tigers are cats that have stripes. You can probably even teach a computer some kind of deductive process that allows it to pick out a tiger. But what you cannot do is make a computer care that it has picked out a tiger. When humans see a tiger, they have an impetus, a drive, to do something with that information. When they see a tiger, they FLEE! But a computer has no survival instinct, and really doesn’t need one. The computer can stop processing. It has no reason not to. When we stop processing, our enjoyments cease. We don’t like that. We want to keep processing reality. Tigers, they like to process us as meals. We don’t like that, either. There’s a clash of inclinations there, and that’s the spur that drives all of life into the complexities and wonders that seem never to cease… unless someday they do cease, collectively.
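The same gap shows up if you sketch it out (again in Python, again purely illustrative; the rules and field names are mine, not any real system’s): a program can pick out a tiger from a description readily enough, but nothing in it cares one way or the other about the answer.

```python
# Purely illustrative tiger-spotter; the rules are made up for this sketch.
def looks_like_tiger(animal):
    # Crude deductive test: a big, striped cat will do here.
    return (animal.get("family") == "cat"
            and animal.get("pattern") == "stripes"
            and animal.get("size") == "big")

sighting = {"family": "cat", "pattern": "stripes", "size": "big"}

if looks_like_tiger(sighting):
    # A living thing would be fleeing by now. The program just returns a
    # value and sits there, perfectly content to be switched off.
    print("tiger detected")
```

The classification succeeds; the caring is nowhere in the code.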

Of course, there’s no reason that all of this — stake and impetus and so on — can’t be emulated. That isn’t my point. It’s just that instinct is part of the bedrock of consciousness, personality, and how we function as physical beings. I had a number of talks with a friend of mine in Jeonju about this, and I have to confess I couldn’t understand her point of view at all — that nurture has fully overtaken and blotted out nature, that there is no relevant, unrestrained or unchanneled shred of instinct remaining in the modern human being.

To which I have to ask, for example: how do we just know that one tiger and another tiger are two of a kind? We never learn that; we just understand it naturally. More than instinct, this is built-in cognitive programming. To deny hardwired knowledge, and the presence of instinct, in humans is little more than a retreat into romanticism and fantasy, and the worst kind, too — the kind that denies what’s obviously there, and thus fails to celebrate us for what we truly are.
