Apologies for replying to myself, I am fumbling in my ignorance here, and genuinely curious if anyone could share any other valuable/interesting things from this "movement." In all other cases of people calling themselves "rationalists," it has been a huge yellow flag for me, as a fallibilist. :~]
I guess Dwarkesh Patel is part of that community? Well, his interviews are quite interesting, at least in the sense of seeing into a world of AI researchers that I otherwise wouldn't see, and his questions are often quite good. Also, after interviewing many leading researchers and riding the hype train, he eventually did say a few months ago, roughly, "yeah, the 'fast take-off' is not upon us," after trying to use leading tools to make his own podcast. That's intellectual honesty that is greatly missing in this world. So, there is that? I am also a huge fan of his Sarah Paine pieces, at least her side of them.
Is there anything else intellectually honest and interesting coming out of that group?
I was mildly interested in the movement but found it weird as well. Some causes seem good (e.g. fighting malaria); others, like Super AIs, just seem like geeks doing mental gymnastics over sci-fi topics.
I have had a similar experience. I think one big problem is that EA often uses a very low discount rate, meaning they treat theoretical people who won't be born for a century as having nearly the same value as people who are alive today. In theory that's defensible, but in practice it means you can hand-wave at any large-scale issue and come up with massive numbers of lives saved.
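To see why the discount rate does so much work here, a back-of-the-envelope sketch (the rates and horizon are just illustrative numbers, not anything from an actual EA analysis):

```python
def present_value(discount_rate: float, years: int) -> float:
    """Present value of one unit of welfare realized `years` from now,
    discounted at `discount_rate` per year."""
    return 1.0 / (1.0 + discount_rate) ** years

# A life 100 years out, at a conventional ~3% rate vs. a near-zero rate:
conventional = present_value(0.03, 100)   # worth only ~5% of a life today
near_zero = present_value(0.001, 100)     # worth ~90% of a life today

print(f"at 3%:    {conventional:.3f}")
print(f"at 0.1%:  {near_zero:.3f}")
```

With a near-zero rate, trillions of hypothetical future people count almost at full weight, so any cause that might affect them dwarfs anything you could do for people alive now.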
My church has a shower ministry, where we open up our showers to people without homes so they can clean up. We also provide clothes and personal supplies. That's just about the opposite of what EA would say we should do, but we can count exactly how many showers we provide and supplies we distribute and how those numbers are trending. Shouting "AI and asteroids!" is more EA, but it eventually devolves into the behavior you describe.
And even if it's "small stuff", I do believe acts of kindness are contagious, and lead to other people doing good deeds.
If we want to rationalize this EA-style, we could say these small acts have an exponential effect: 1 person can inspire 2 to be more selfless. So it's better to start propagating this as soon as possible, to reach maximum selflessness ASAP :)
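Playing the game all the way through: if each person really inspired two others per "generation" of kindness, the spread doubles each step, and covering everyone takes surprisingly few steps (a toy model, obviously, assuming no overlap and perfect doubling):

```python
def generations_to_cover(population: int) -> int:
    """Number of doubling steps until 2**n people have been reached,
    starting from a single kind act."""
    reached, generations = 1, 0
    while reached < population:
        reached *= 2  # each inspired person inspires two more
        generations += 1
    return generations

# Roughly the world's population:
print(generations_to_cover(8_000_000_000))  # 33 doublings suffice
```

Which is the usual lesson of exponential toy models: the math works out beautifully, right up until you ask whether anyone actually inspires two people.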
From way out here, it really appears like maybe the formula is:
Effective Altruism = guilt * (contrarianism ^ online)
I have only been paying slight attention, but is there anything redeemable going on over there? Genuine question.
You mentioned "rationalist" - can anyone clue me in to any of this?
edit: oh, https://en.wikipedia.org/wiki/Rationalist_community. Wow, my formula intuition seems almost dead on?