True EA Alignment is Overrated

Epistemic Status: simple thought, basically one key insight, just broadcasting because I think people will find it useful.

Among the EA folks I talk to, there’s a fairly common recurring worry about whether or not they’re “truly aligned”. In other words, EAs tend to worry about whether they’re really motivated to do good in the world, or whether they’re secretly motivated by something else that produces EA-like behavior as a means to an end. The internal worry process generally looks something like this; see if this train of thought seems familiar:

“Yup, I’m an Effective Altruist. I care about doing as much good as possible.”

“Wait, is that really what I care about? What if I just care about my peers thinking I’m cool, or feeling devoted to a cause, or feeling like I’m a good person, or something?”

“How would I even know? In any of these cases, I would try to deceive myself to think I’m a True EA.”

“Oh, son of a biscuit, I’m definitely a fraud, aren’t I?”

– like a whole bunch of people

This is sort of a tough worry to address because it’s weird and meta and introspective. It seems worth an attempt, though, because this train of thought can feel demoralizing for no good reason. To try to fix it, I find it helpful to imagine what the hypothetical True EA would say.

Who cares if you’re Truly Aligned or not? The only time you need to worry about this is if you’re doing some kind of hypersensitive work that depends on your being exactly aligned. Given human psychology, semi-alignment is the much more likely outcome, and it’s almost exactly as useful. Three cheers for semi-aligned EAs!

– Peter Bostrom MacGreaves

I’m probably not a True EA. I’m not even sure True EAs exist. But if the True EA did exist, they wouldn’t see me as a follower gone sadly astray, or anything like that. They would see me as a useful minion who maybe can’t be entirely trusted, but who is still net positive and definitely shouldn’t be all demoralized about this funky introspective issue.

Three cheers for semi-aligned EAs!