Book Excerpt

03.16.23

Nicholas Humphrey: Sentience: The Invention of Consciousness

The following is an excerpt from the book Sentience: The Invention of Consciousness by Nicholas Humphrey.

Prologue

Hello? (hello) (hello)

A place for big thoughts is a hot-tub at night at the edge of the Mojave Desert. It’s a couple of hours’ drive from San Diego, where I’ve been at a meeting about human evolution. The lodge where I’m staying is surrounded by cactus-like trees. The Mormon pioneers called them Joshua trees, because they seem to stretch their arms heavenwards. Lying on my back in the gurgling water, I gaze at the stars and sink into the vastness of space.

Is there anybody there?

If there are extraterrestrial intelligent beings somewhere in our galaxy, they may be looking at the very same stars I am. Are they conscious of visual sensations like mine? Do they experience this phenomenal blackness, pricked with points of light?

I lie back, arms out sideways holding the rim of the tub. I feel the warmth of the water on my skin, smell the scent of the desert grasses. I’m flesh and blood. I’m soul. How can that be?

Hello, E.T., if you can hear me. Do you have this dual nature too? Is the light on inside your head? Do your sensations have the same eerie immaterial quality that mine do?

I want to think so. I want this to be shared.

I hear a coyote bark, then another. Where else on Earth does sentience reside? Do dogs feel pain like I do? Does an earthworm enjoy smells? Are machines ever going to have conscious feelings? Do they already? How could we know?

Barking again. Have the coyotes caught a rabbit? Poor rabbit. One minute she’s comfortably scratching her ear, the next a coyote has her by the neck.

What to say about the downside of sentience? The philosopher Schopenhauer wrote: “If the reader wishes to see whether the pleasure in the world outweighs the pain, let him compare the respective feelings of two animals, one of whom is engaged in eating the other.”

There’s a rock in the next valley, sculpted by the winds into the shape of a huge human skull. The hill in Jerusalem where Jesus was crucified was called Skull Rock – Golgotha in Aramaic, Calvary in Latin. Schopenhauer might have compared the feelings of two humans, one of whom is nailing another to a cross.

During the crucifixion it’s said that day turned to night. The stars came out.

But suppose conscious beings like us have not evolved anywhere else.

Suppose consciousness as it exists on Earth is a one-off accident of evolution.

Astronaut Frank Borman, looking from the window of Apollo 8, remarked “the Earth is the only thing in the universe that has any colour.” Can’t be strictly true. But it could be true that the Earth is the only place where sensations of colour exist. Or sensations of anything: sweetness, warmth, bitterness, pain.

Which would be better: a universe without either joy or tears, or a universe with both? Philosopher Thomas Metzinger agrees with Schopenhauer: the net utility is negative. He says that if an all-powerful and all-knowing “superintelligence” could look across the world of pleasure and pain, and do the sums, it would conclude it had a moral obligation to eliminate conscious life.

I think he’s wrong. We don’t live by bread alone. Pain and pleasure can’t be all that matters. But there’s no question that they matter. When we have reason to think someone is suffering we have a duty of care.

Some people think we have an equal duty towards any sentient being – human, nonhuman, even robot. It’s not self-evident. But it could still be a rule we choose to live by. In that case we have a heavy obligation to get it right about what in the world is conscious and what isn’t.

The Norwegian government permitted the drawn-out, ugly killing of more than 1000 whales, including breeding females. But the Swiss government has made it illegal to boil lobsters alive, and the British government soon may do the same.

Descartes believed it’s only humans who have feelings. Non-human animals are all unconscious machines. That’s hard to believe. But maybe some animals are unconscious. A moth lands in the water of the hot-tub. I scoop it out and toss it aside. Descartes could be right about moths. I hope he is right about moths.

Could Descartes be right about extraterrestrials? What if the life-forms that exist out there are simply jumped-up insects? They might be ever so clever and still not have conscious feelings. I don’t believe there’s any necessary connection between intelligence and sentience.

Many people—famous philosophers among them—still don’t get this. They think that if an octopus can solve a picture-puzzle that a four-year-old child would have trouble with, it probably has sensations something like ours.

Frans de Waal asks “Are we smart enough to know how smart animals are?” He writes a beautiful book, Mama’s Last Hug. He’s sure animals have feelings on the same level we do. But what he comes up with as evidence is no more than a laundry-list of clever tricks.

What about the Argument from Continuity? People say evolution has been a gradual process, without sharp discontinuities. There won’t have been any point in history where we could draw a line: unconscious that side, conscious this. So some kind of consciousness must go all the way down.

Panpsychists – “conscious everywhere” theorists – believe consciousness is a basic property of physical matter. Even a tea-cup has a smidgeon of conscious feeling. Panpsychism seems to me a really bad idea. What would a smidgeon be like? Whose experience would it be?

Seems to me consciousness must either be fully fledged or not there at all. That’s certainly my experience of it. I move abruptly in and out of consciousness when I wake from sleep or fall back into it again. Why not a similarly abrupt transition in the course of evolution: a critical point when all of a sudden our ancestors woke up, the lights came on? One small step for the brain; one giant leap for the mind?

There’s a moving star, slowly arcing across the sky. No, not a star, a man-made satellite.

We human beings are on our way to becoming extra-terrestrial ourselves. Soon there will be sentient beings in space. But we, in our human bodies, won’t be able to go beyond our solar system. If we want to explore the stars, we’ll have to send intelligent robots in our place. Could these be sentient robots – machines that value their consciousness as we do? What extra ingredient would be required in their design?

Daniel Hillis has suggested the world-wide-web has already become conscious, simply as a consequence of its complexity. Only it hasn’t deigned to tell us yet. Could the world-wide-web be hurting? Do we have a duty of care towards it?

It’s often claimed that, as we build more and more complex robots, it will just happen: a threshold will be reached where sentience simply arrives as an emergent property – in the same way it happened in evolution. But I don’t think it did just happen like that in evolution. I believe circuitry had to be built into the brains of our ancestors by natural selection for the special purpose of adding sentience.

There are many who disagree with me. They point out – quite rightly – that sentience can have been selected for only if it makes a positive difference to survival. But, then, they ask, where’s the evidence it makes any difference at all?

Any difference at all? In my own case I want to say it makes all the difference in the world: the difference between being me and not being me! Yes, but wanting to say it doesn’t make it true. I could be kidding myself. There are a good many people who maintain that I am kidding myself.

Hmm. This is going to need some turning round.

Back home in the UK, a new “Animal Welfare (Sentience) Bill” is under consideration by parliament. Clause 1 is the “animal sentience” clause. The Secretary of State says this will “embed in UK statute the principle that animals are sentient beings, capable of feeling pain and pleasure”. I see Sir Stephen Laws, former First Parliamentary Counsel, has commented at the Committee stage: “It is fair to say that all the concepts in Clause 1 seem to be problematic in one way or another.”

He’s right. It’s a philosophical, scientific, ethical and legal mess. As of now, we lack not only direct evidence but even agreed arguments as to how far consciousness extends. Arguably, the only sure case is our own, and every other is within reasonable doubt. Yet, all the time, we are obliged to act as if we know the answers.

Mary Oliver ends her lovely poem about whether stones, trees and clouds have conscious feelings, “Do Stones Feel?”, protesting that even if the world says it’s not possible, she refuses to concur. “Too terrible it would be, to be wrong.”

I understand her: the poet’s refusal to bow to impossibility, the insistent tug of the “what if?” What if stones feel? I may say I share the world’s opinion about stones. I’m as certain as can be that they don’t feel. But lobsters, octopuses?

Terrible to be wrong. Yes. But irresponsible not to be right – if we can only establish what right is. Let’s suppose we can discover how conscious feeling is generated in the brain, and how it shows up in animals’ behaviour. Then perhaps we’ll even be able to have a diagnostic test.

When Archimedes, in his bath, realised how he could test whether or not the king’s crown was pure gold, he leapt out and ran naked through the streets of Syracuse. I lie back in the tub in the desert and wait for that Eureka moment.

Philosopher Jerry Fodor has said “We don’t know, even to a first glimmer, how a brain (or anything else that is physical) could manage to be a locus of conscious experience. This is, surely, among the ultimate metaphysical mysteries; don’t bet on anybody ever solving it.”

Seems it might be a long wait.