10 Comments

Honestly, coffee tables bother me far less than the actual, literal castles. And that isn't even getting into the increasing focus on AI and far-flung futures, which EA has given me no reason to believe the movement will make more or less likely.

EA's insight that we should care more (in material terms) about the people who are suffering most, and that we need to more rigorously measure the effectiveness of charities and causes, is admirably clear-minded. It's changed how I donate, and my goal is to donate more than I am currently. But given what I have seen of EA's apparent lived priorities, those donations will never be to an EA organization.


I find the social aspect of frugality the hardest.

When I joined the EA community 10 years ago, I hoped to find support for a frugal lifestyle. I hoped to find people who would teach me to cook delicious lentil soup out of 4 ingredients, who would be happy to meet at home or outdoors rather than in a restaurant, and who would host each other on their couches.

That did not happen.

In recent years I have neglected my frugality habits, due to both EA and non-EA social norms and an increased hourly wage. That might be a mistake.


I think your defenses of specific pieces of EA culture have merit, but taken together I think there's a troubling picture.

To me, the biggest claim made by the effective altruism movement is that we should care about people (and animals) who are far away from us and disconnected from us. The "EA community" intends to be a community organized to encourage the best ideas and most effective strategies for helping those socially distant people. What worries me is that this is a hard thing for communities to do. It's much easier, more comfortable, and more natural for communities to organize around helping the members of the community themselves.

So when I hear that important people in the "EA community" live together and date each other, I think it's natural, but probably makes it really hard for cultural outsiders to be fairly considered for jobs, and maybe makes it hard to keep orgs from getting bloated since it's awkward to fire your friends. But that all might be fine if the jobs are really demanding and not super desirable. But if the jobs pay well and have great perks, now you have really valuable positions being distributed on the basis of close personal contacts. It's increasingly the community funding itself.

Even that might be fine if you have rigorous, clear ways of evaluating how effective EA orgs actually are at helping the outsiders they're intended to help, but it's not clear to me that rigorous, clear evaluation is possible, or, if it is possible, that it's happening. Take GiveWell, the org I use to guide my donations. GiveWell has very clear and rigorous evaluation standards for its grantees, but I'm not aware of any clear evidence about how many lives GiveWell itself saves per dollar of GiveWell's spending, over and above what would be saved by directly sending the money to the Against Malaria Foundation and a few other well-regarded orgs and calling it a day. And how would you do such an evaluation? The best you could do would be to use GiveWell's own models to price this out, but of course GiveWell's models will provide an upwardly-biased estimate of GiveWell's effectiveness if they have any errors at all.
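The upward-bias point is just the winner's curse: if an evaluator ranks charities by its own noisy estimates and then scores itself using those same estimates, the selected charities will look better on paper than they really are. A toy Monte Carlo makes this concrete (all numbers here are made-up assumptions for illustration, not GiveWell data):

```python
import random

random.seed(0)

N_CHARITIES = 100   # candidate charities per evaluation round (assumed)
TOP_K = 5           # how many the evaluator recommends (assumed)
NOISE_SD = 0.5      # error in the evaluator's own model (assumed)
ROUNDS = 2000

est_gap = 0.0
for _ in range(ROUNDS):
    # True cost-effectiveness, unknown to the evaluator.
    true_eff = [random.gauss(1.0, 0.3) for _ in range(N_CHARITIES)]
    # The evaluator only sees noisy estimates of it.
    estimates = [t + random.gauss(0.0, NOISE_SD) for t in true_eff]
    # Recommend the charities with the highest *estimated* effectiveness.
    top = sorted(range(N_CHARITIES), key=lambda i: estimates[i])[-TOP_K:]
    avg_est = sum(estimates[i] for i in top) / TOP_K
    avg_true = sum(true_eff[i] for i in top) / TOP_K
    # Selecting on the estimate guarantees this gap is positive on average:
    # among the winners, the noise term skews high.
    est_gap += (avg_est - avg_true) / ROUNDS

print(f"average optimism in the evaluator's own numbers: {est_gap:.2f}")
```

Even with zero systematic error in the model, any random error at all makes the evaluator's self-assessment optimistic, which is why an evaluation of GiveWell that relies on GiveWell's own cost-effectiveness models can't settle the question.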

And this is the big difference between a nonprofit and a burrito company. You don't need to know or care *how* Amy's makes their burritos because you can taste the burritos. You know what you're getting, and you know if you like it. And the important question is whether the burritos please *you*.

But when you're giving money to an EA organization, you're giving them money to spend on someone else. You can see what they say they did to help that person, but it's hard to a) verify that what's happening on the ground matches what the organization says, and b) know whether what's happening on the ground is as good as you hope it is from the perspective of the intended beneficiaries.

Of course it's not as useful to look at how an organization spends its money as it is to look at what it produces, but it's a reasonable second-best option when the best option is difficult or impossible. And I think it should give us all pause if EA organizations have perks and pay in the same ballpark as Google while having the kind of clubby culture of a small nonprofit.


Having been a generous donor most of my adult life, I've been fascinated reading about EA lately. I've come to the conclusion that it's Veblenesque posturing by privileged, upper-middle-class techies who were lucky enough to be born into a boom that netted them 300-400K salaries just a few years out of coding school. Instead of conspicuous consumption, it's conspicuous giving, with everyone publicly proclaiming how much they give and what they forego in their personal lives to be so generous to people less fortunate than them. Virtue signaling is a way to proclaim your superiority to mere mortals. It also helps that polyamory appears to play a central role in the EA culture – I'm not only morally superior, but I also get to have sex with people who normally would not give me the time of day!

It's interesting that so much of the giving goes to organizations in faraway lands, where donors cannot see the results. Why would that be? Perhaps it's because donated money often does not have the intended effects. Those people you saved in Rwanda with mosquito netting could grow up to commit genocide. Other beneficiaries of controlling mosquito-borne diseases end up laboring in copper mines or starving in a famine. These unintended results are common and readily apparent if you donate your money to local causes, such as helping the thousands of homeless people in San Francisco. The benefits will be incremental; they don't solve the problems of mental illness, drug addiction, intellectual disability and other situations that result in someone living in a sidewalk tent.

Alas, all good things must come to an end. SBF is headed to prison after he overreached and allegedly committed fraud in a bid to become the king virtue signaler. Tech companies are now mature corporations, and successful start-ups are few and far between. Those stunning tech salaries are now inching toward 150K, with many coders finding themselves out of work altogether as they reach their 30s and 40s.

Living in a moldy house with other EAers is a dramatic, monkish statement when you are single and in your 20s (except for the polyamory part). When you are older and have a family, not so much. It's curious that people whose parents sent them to top universities are now living in crumbling houses and presumably sending their children to substandard local schools so they can continue their virtue signaling in the face of reality. That's not bettering the world; it's crappy parenting.

While my wife and I have given regularly, we always considered our two children to be our primary charity. Totaling up rough numbers, I estimate we spent 300K in 10 years educating them in a private high school and top universities. In retrospect, I see my earlier comment about unintended consequences of donations also applies here. They grew up feeling entitled and privileged, and thus have not always made the best life choices. I try not to have buyer's remorse and avoid thinking about other endeavors I could have engaged in with that money.


You don’t take ANY vacations?

Author · Jul 27, 2023 (edited)

I mean, I have days where I don't work? I write a Substack, no one is making me write if I'd rather go to the movies.


We don't need every single AI researcher to leave the Bay Area. If just Anthropic were to leave, that would be more than enough.


Another potential solution: large-scale entryism into municipal politics in a particularly pliant municipality.


Thank you, this helps.

I still don't understand why I shouldn't care about coffee tables, though; I'm a little slow on that part. I get that letting people have basic comfort is good, actually, but waiting to measure the outcomes way down the line, when I could be micromanaging apocryphal 10k coffee tables now, seems like lengthening the feedback loop without much gain (assuming it's not a big mental load for me personally)?


"But you should be honest with yourself about it."

Isn't this kind of argument just asking for the not-actually-spoken reply, "I am honest with myself—just not with you"?
