Discussion about this post

John Quiggin

I get the first two. I totally don't understand why effective altruists should regard AI (rather than catastrophic climate change or nuclear holocaust) as the biggest threat to humanity. This linkage has been a disaster, certainly in PR terms.

Anonymous Dude

This is quite good. I wouldn't quibble with 2 (Think with numbers) or 3 (We don't know what we're doing—but we can figure it out together). I would quibble with 4 (You can do hard things), but that's more a statement of personal inferiority than a real critique of the ideology. I would argue, though, that 1 (Don't care about some strangers more than other strangers because of arbitrary group membership) is probably one of the greatest weaknesses of EA as far as the real world goes.

Because dang it, people do, a *lot*, and it's so constant throughout history it's probably biologically based; the only thing that really seems to keep it in check historically is religion (and even then only universalizing ones like Christianity or Buddhism, and of course they do the ingroup thing too). A term I found useful is 'universe of obligation'--basically, the set of people you're supposed to care about. The EA theoretically wants to extend the universe of obligation to every human (and in some cases beyond), and most people aren't built that way. Acting like you can turn most people into EAs is much like ignoring the biological reality of sex (which you advocated against elsewhere: https://thingofthings.substack.com/p/notes-towards-a-sex-realist-feminism).

Thank you for laying out the whole thing so clearly.
