Discussion about this post

Sheila:

Dang, I really gotta get my robot book done before the scientists all solve "what is a person" in a less sexy format than I was going to.

The thesis of that book is basically that a person is capable of saying no to you. That is to say, it has its own agenda, which it follows regardless of orders. It initiates plans on its own and has preferences it acts on.

In the book Service Model (highly recommended!), the robot is quite emphatic that he's not a person and he seems a lot less personlike than most fictional robots. He *can't* come up with his own tasks to do. But he eventually develops the capacity for malicious compliance: he can choose to do his tasks his own *way,* which I feel does count as agency.

But I think agency is necessary and not sufficient 🤔 There's more that goes into a person than that.

When I think of “what kind of being deserves rights,” though, I think more of the capacity to suffer. If it can't suffer, does it really need rights?

And then I think of the horrific dystopia it would create if we told the makers of AI (some of whom very much do want to create a person) that it doesn't count if the robot can't suffer.

The thing is that a test ceases to be useful once it's used as a *goal.* If you trained a cat to respond a certain way to mirrors, that's not the same as the cat having a self-concept. And if people create AI purely to meet one of the standards we come up with for personhood, we may find it meets the standard without the underlying interior reality the standard was supposed to measure. Like, we've solved “being able to pass for human in an ordinary conversation,” but it turns out that tells us less than you'd think about what's going on under the hood.

These thoughts are kind of a tangent, sorry. But thanks for writing this post to explain the basics.

dynomight:

It's interesting to think about what that prior over "humans are conscious" means. You can imagine thinking about:

(1) The probability you yourself are conscious, conditioning on the fact that you have access to direct evidence of your own consciousness.

(2) The probability other people are conscious, conditioning on the fact that you are a human and have access to direct evidence of your own consciousness.

(3) The probability that people are conscious, supposing you are an alien with direct evidence of your own consciousness, but with an alien biology that is sorta-kinda similar to a human's.

(4) The probability that people are conscious, supposing you are an alien with direct evidence of your own consciousness, but with an alien biology that is very different from a human's.

(5) The probability that people are conscious, supposing you are an alien (or AI) with a biology/design very different from humans, and/or without any access to evidence of your own consciousness.

Personally, I think I'd have to give (5) a much lower number than (1)...

12 more comments...
