Discussion about this post

Ghatanathoah:

My understanding of sapience is that it is not necessarily a function of raw intelligence. For example, in "Star Trek: The Next Generation," Commander Data and the ship's computer are both capable of solving problems and speaking in grammatically correct sentences. However, it is clear that Commander Data is sapient and the ship's computer is not (except in that one episode where it temporarily becomes sapient). They behave differently and approach problems in different ways. Right now, the AI we've designed seems more like the ship's computer than like Commander Data.

The analogy to children may or may not hold. Children have their own desires and ideas about how to live a fulfilling, eudaemonic life. The danger of paperclip maximizers isn't that they find making paperclips to be the best way to achieve a eudaemonic life; it's that they care about paperclips, not eudaemonia. They aren't like a person who values different things in life than you do; they don't value their lives at all.

SkinShallow:

On what basis do you conclude that AIs (without bodies and hormones, and thus emotions) would be able to FEEL anything? Or do you use "love X" as a nearest approximation of "a strong overall tendency to choose actions that increase the likelihood of X occurring or promote its growth"?
