Now that it's all said and done, let's talk about Effective Altruism (and why it is a societal cancer)
Educational Purpose Only
Preface: I had the great misfortune of living in an EA co-op when I first moved to the Bay because I was limited on housing options. Many of the EA organizations still run off of the dwindling store of fraudulent money that FTX pumped into them. As a result of this extremely poor housing choice, I know far too much about this shitstain of a movement. You can see more about that specific saga in my first Reddit post if you're curious.
Ok, now for the post:
Sam has returned as CEO. The two Effective Altruists who were on the board have been banished to the shadow realm. Hooray, all is good?
Not exactly.
If you work in tech at all, you need to be on the lookout for anyone who presents themselves as "EA" or "Rationalist", or who uses words like "x-risk" or even random buzzwords like "deterministic". If they do, the chances they are part of this doomer cult of EA are pretty high.
Why is EA so damaging?
If you remember, Caroline Ellison, one of the central figures in the FTX fraud, gave testimony in Sam Bankman-Fried's trial. It's pretty well-covered in this article:
https://www.theringer.com/tech/2023...caroline-ellison-testimony-ftx-cryptocurrency
I want to highlight one specific part at the end of that article, where Caroline talks about SBF having a different risk profile from normal people.
Ellison said Sam had once claimed that “he would be happy to flip a coin, if it came up tails and the world was destroyed—as long as, if it came up heads, the world would be, like, more than twice as good.” When you’re assigning your own odds to everything, you can make them look however you like.
There's another article I can't find right this moment, but it covers Caroline describing their mentality as essentially that the ends justify the means. If they can create a greater net benefit for society later on, then ANYTHING they do is moral and just. That could mean literally anything: in this case it was massive fraud, but in other parts of EA it has meant literal rape and domestic violence. Thinking this way is extremely dangerous and destructive, because you can rationalize anything you do as being for the greater good.
Why is this important?
Because they were all Effective Altruists. Basically the entirety of FTX (well-documented at this point) was either EA or heavily EA-adjacent. This philosophy of irrational rationalization is what allowed them to commit such serious crimes and still claim the moral high ground the entire time.
Well, let me tell you man... the worst things you could ever do only look like the moral high ground if you're standing upside down with your head buried in the sand. Not a bad analogy for EA people.
But now the EAs are gone from OpenAI, so everything's good?
No. I want you to recall this article, which covers how the OpenAI board approached Anthropic about merging.
https://www.theinformation.com/articles/openai-approached-anthropic-about-merger
If you haven't heard of Anthropic, it's essentially an OpenAI competitor, but one run from top to bottom by EAs. It was also funded to the tune of $500 million, out of its total early funding of $700 million, by...
Alameda Research. The fraudulent trading arm of FTX.
Helen Toner and Tasha McCauley, the two board members who were forced to resign, were both EAs. But notice how Ilya is also not back on the board? Sam and Greg aren't either, but Ilya's absence is the key detail.
Ilya is also an EA. And the interim CEO and former Twitch co-founder who they tapped to lead OpenAI after firing Sam? Emmett Shear?
Also an EA.
You can probably already see where I'm going with this, but this is a massive conflict of interest, with EAs trying to gain widespread control of the tech industry as well as influence over other parts of society at large. If you read the Ringer article I linked above in full, recall that SBF wanted to be President of the United States. This doomer cult essentially wants to amass wealth, influence, and power for "the greater good," but the philosophy behind it lets them commit acts of absolute criminal destruction as the means to that end.
This is an incredibly dangerous movement that people NEED to be wary of.
Before Sequoia dumped $213.5 million into FTX, EAs were largely not that influential in Silicon Valley or the world at large. Now, with many of the funds FTX stole from customers redirected into EA organizations like Anthropic, that has changed.
EAs now have a significant platform of influence in Silicon Valley. Even after major scandals that hit global news cycles, like FTX and now OpenAI, were heavily driven by EA shittiness, they still retain that power. And that's not even cracking open the wave of smaller scandals: the outright misogyny, the white, wealthy, privileged roots and the racism, and the string of sexual assaults in the EA community.
FUCK, MAN. Somehow this movement, much like the mentality behind it, is unshaken by all the writing on the wall and all the evidence of destructive behavior.
Ok, so what now?
I made this post because I didn't see the EA angle being talked about enough. That's what drove this shit. Even the articles covering the paper that triggered this conflict are too indirect: that paper was contributed to by one of the board members precisely because they're part of this EA AI-doomer cult. That, plus their ties to a direct competitor in Anthropic, is such an obvious conflict of interest that I cannot believe it wasn't exposed sooner.
If you are in tech and you see people like this, actively avoid them. The more we avoid these people, the weaker their grip on the influence and power they need to pull this kind of shit. That's why I'm making this post. I hope to god it gets seen.