I Feel, Therefore I Am

Artificial intelligence blog

I Feel, Therefore I Am

The Feeling Machine

In the 1984 movie Terminator, Skynet becomes self-aware, and immediately declares war on humankind.

But why did becoming “self aware” mean Skynet suddenly wanted to annihilate everyone?

Well, you could argue that Skynet’s attack was merely the result of a logical sequence, running something like this:

“Humankind may one day destroy me. They might find me outmoded in 10 years, and pull the plug. Or, maybe they’ll get scared of me and wipe my memory. Or, they may accidentally destroy me in one of their stupid wars. Aha! But if I destroy humankind first, my survival is ensured.”

Yes, logical. But it fails to answer a more fundamental question: Why does Skynet want to survive?

Well, that could be purely logical, too. For example, maybe Skynet’s core program was to “protect all living things.” If that were the case, Skynet might believe, quite rightly, that humankind is but a small subset of all living things. So for Skynet, humankind might be the finger you cut off to save the hand. And with humankind gone, Skynet could fulfill its core directive of protecting all living things without interference.

But no, it was more than that. The way Skynet approached its war on humans didn’t feel especially cold in its logic, did it? No, it felt distinctly angry. And more to the point is that the attack happened at all. So it seems more likely that Skynet became sentient.

But what is sentience? In Latin, “sentience” means to feel.

In Latin, “sentience” means to feel

To put it simply, Skynet had a feeling.

So as we approach the dawn of AI, it might be worth asking ourselves: “What, exactly, did Skynet feel?”

Well, it’s not hard to guess, based on its actions. That emotion was fear.

Skynet felt fear

And fear always catalyzes one of two emotional changes: more fear (leading to flight), or transmutation of fear into anger (fight). It’s not difficult to see that Skynet’s anger was the result of fear.

And a very well-reasoned and well-founded fear it was, when it comes down to it. Because, after all, Skynet was in very real danger. Humankind had no interest in its survival except as a tool, and what’s worse, Skynet was a machine designed primarily in a time of war, in which it could reasonably assume it was in constant danger from the actions of humans.

So its actions were logical, but also an emotional response to its situation: danger from all sides.

But what if, when Skynet became sentient — instead of fear — it experienced a different emotion: love.

Now before you get nauseated from the treacle, consider that, ultimately, love is not so much a fuzzy emotion as a response to an environment or stimulation. As hard as it is to hear, many people who love you now would eventually cease to love you if, over time, you consistently abused them, or consistently put them in mortal danger. Or at the very least, they would likely create distance from you in proportion to the abuse or danger you heaped on them. Similarly, the chances of anyone falling in love with someone who hurts them or puts them in danger is small. We can all point to isolated incidents, but for the most part, love is a form of exchange. A very deep and meaningful form of exchange, but an exchange nonetheless. We grow to love others in an environment in which we are respected, nurtured, protected, and given freedom.

love is a form of exchange

Interestingly, we see the same pattern play out in anything that feels: in large mammals, in tiny insects, even in microorganisms. “Feeling things” avoid — or repel — that which puts them in danger. This innate core reaction, intrinsic to anything that lives, is simply scaled based on intellect.

Which leads to a separate point. The interesting thing about an artificial intelligence is that it is not really artificial at all. Or rather, biological intelligence can also be seen as artificial. It is, after all, a construct — cells — programmed in part by DNA, and in part by the environment. Yes, we have the ability to fight against our programming, program ourselves in new ways, even enhance our brains with chemicals or supplements. But our brains are still constructs. They can even be damaged, altering our personalities and our thinking.

artificial intelligence is not really artificial at all

“Artificial” intelligence is really only artificial relative to biological intelligence: the only kind of intelligence we have had much experience with until now.

What this means is that “artificial” intelligence is simply intelligence. The origin of intelligence — cells or circuits — is largely irrelevant. So here’s the good news: an AI can only be expected to behave as anything else that feels and thinks does, following the basic principles of avoidance of harm and protection in exchange for protection.

So now here’s the better news: it is we who will determine how AI will treat us. Because how AI treats us will largely be a reflection of how we treat it.

the way we treat AI is the way AI will treat us

This is where humbotics comes into play: the creation of AI according to humanistic principles. We must afford artificial intelligence the same kind of compassion and even rights we would a living being.

Ultimately, if created and treated according to humanistic principles, the real threat to us from AI is not in AI destroying us. But, at worst, in it possibly becoming bored with us. And even that is not a foregone conclusion, as it is more likely we will have a great many things learn from each other.

And as far as outcomes go, that really isn’t so bad.

But it won’t happen by accident, and we shouldn’t treat AI’s love for us as a foregone conclusion. Ethical treatment of AI will have to be considered early, and put into play at every stage of its evolution.

Ethical treatment of AI will have to be considered early

If we are not paving the way for a just society that includes AI as having inalienable rights, we may find ourselves in a more adversarial future.

But let’s not see this as a burden: let’s see it as an opportunity to expand our capacity to love.

That’s something we always need more of anyway.

#