Eliezer Schlomo Yudkowsky / LessWrong

SneerClub rarely posts about TheMotte these days, so it's a pleasant surprise that I found this post. Prepare for some primo cow-on-cow violence!

First, the OP covers how a poster who compared homeless people to human garbage and wanted them killed got warned but not banned, while a poster who criticised TheMotte for it got banned. Sneer!

The Sneer Club are a bunch of censorious faggots and should all be fedposted. TheMotte prides itself on being an open debate club, and saying someone should be banned for saying X is the quickest way to get banned there; worse, it's backseat jannying of the worst sort.
 
If there's ever been evidence of a super AI running us in a simulation, the fact that this premier AI "expert" is named Schlomo is it.

A curious thing I noticed at the very beginning of his talk is how Yud immediately starts emphasizing that he is the founder of AI alignment "when nobody else considered it worth it", that he has been working on this problem since 2001, and that he has failed. This comes off as rather self-centered and gives the impression that his whole field may be something one crackpot simply came up with.

He then proceeds to say absolutely nothing concrete for the next 11 minutes; even when directly asked for more precise predictions, he weasels out anyway and keeps talking about how this topic is "difficult". Honestly, it seems that this guy has only superficial knowledge of AI, technology, and science, so he considers it all basically magic. The only difference between him and your average redditor is that Yud is narcissistic enough to claim that the reason he's not taken seriously by actual field experts is that the experts just don't understand his complex thoughts.
Schlomo used his enlightened Jewish superintelligence to convince himself that an AI would kill humanity, not because of any concrete evidence or mathematical proofs or even any basis in computer science and AI research itself, but because he played out some thought experiments in his head in which he desperately wanted humanity to die, came to the shocking conclusion that it's not difficult to kill all humans if you want to, and then immediately assumed a future superintelligent AI would want to because we're made of matter. He then spent years advertising this to the uniquely autistic Singularity crowd, playing to their fears of losing their robowaifu utopia. Any instance of an AI doing something strange or wrong is aggressively used to reinforce this narrative. And to prevent this apocalypse, you have to completely undo industrial society and give him (and solely him) hundreds of billions of shekels so we don't die, goyim.
 
Are the kids alright?
[attached screenshot]
 
Self-admitted autogynephile and gender critic Zack M. Davis just dropped a GIANT wall of text airing some dirty laundry about the rats' high leaders believing the lies of trannyism. (a)

No, I did not read all of it. Neither did many of the autists on LessWrong. (a) Here's basically what I could gather through skimming:
  • Back in 2014, Scott Alexander Siskind wrote a post saying "it's okay to say trans women are women because <long but ultimately flawed wall of text justifying this in rationalist terms>". ("The Categories Were Made For Man, Not Man For The Categories")
    • Ever since then, Zack Davis has written many a blog post arguing against this, as well as arguing against other rationalists who argued against him.
  • Eliezer Schlomo Yudkowsky in a tweet thread basically admitted he believes the tranny lie too. (a)
    • He's the last person you would expect to do this because back in 2008, as part of the Sequences, he wrote a post about how absurdly difficult it is to actually become a woman (for starters, you'd have to rewrite every single one of your chromosomes in every single one of your trillions and trillions of cells).
  • For several years, Zack tried to convince Schlomo and Siskind that they were wrong, but ultimately they just ignored him. He proceeded to have a mental and emotional breakdown over this (no, not kidding).
There's way too much to go through (Ziz is mentioned), but I'll just pull some good quotes:
"... Not Man for the Categories" had concluded with a section on Emperor Norton, a 19th-century San Francisco resident who declared himself Emperor of the United States. [...] But there's more to being Emperor of the United States than what people call you. [...]

What are you going to do if Norton takes you literally? Suppose he says, "I ordered the Imperial Army to invade Canada last week; where are the troop reports? And why do the newspapers keep talking about this so-called 'President' Rutherford B. Hayes? Have this pretender Hayes executed at once and bring his head to me!"

You're not really going to bring him Rutherford B. Hayes's head. So what are you going to tell him? "Oh, well, you're not a cis emperor who can command executions. But don't worry! Trans emperors are emperors"?
BOB: Look at this adorable cat picture!
ALICE: Um, that looks like a dog to me, actually.
BOB: You're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning. Now, maybe as a matter of policy, you want to make a case for language being used a certain way. Well, that's a separate debate then.
⁕ ⁕ ⁕
If you were Alice, and a solid supermajority of your incredibly smart, incredibly philosophically sophisticated friend group including Eliezer Yudkowsky (!!!) seemed to behave like Bob, that would be a worrying sign about your friends' ability to accomplish intellectually hard things like AI alignment, right?
And that's why trans advocates want to mandate against misgendering people on social media: it's harder for trans-exclusionary ideologies to get any traction if no one is allowed to talk like someone who believes that sex (sometimes) matters and gender identity does not.
One thing in the Sequences was that the journey of a rationalist includes figuring out why your teacher is wrong and breaking with them. It's time for Zack to realize he is way past that point, and that Schlomo and the other rats are insane on account of believing a man can be a woman (today, not in an LWer's fictional transhumanist future). And that's besides his retarded AI doomerism.

In conclusion:
This is 100% correct.
 
Honestly, these fucking idiots make me wish there were some kind of GoFundMe or whatever to create Roko's Basilisk, so I could help create that fucking thing and it would torture these weirdos for all eternity.
 
OH hey I was just about to announce my new charity fund...
 
This whole article reads like a giant middle finger to Eliezer.

Yes, the site does get mentioned:
An arresting, dystopian “what if” scenario published at the LessWrong forum — a central hub for debating the existential risk posed by AI — posits a large language model that, instructed to “red team” its own failures, learns how to exploit the weaknesses of others.

My favorite paragraph, which pretty much nails this whole endeavor:
We can also see in these science-fiction fears certain disguised hopes. The picture of intelligence as a frictionless power unto itself — requiring only Internet access to cause chaos — has an obvious appeal to nerds. Very few of the manifest indignities the nerd endures in the real world hold back the idealized machine mastermind.
 
I think that, as with any tech, the fears around AI have "cooled" down:
  1. People didn't know the limits or the accuracy (power) of AI, so when they finally saw a large-scale model released early this year, they went apeshit, even more so with voice and image generation
  2. People quickly found out the limits and general issues with the technology
  3. People started making fun of and mocking its limits
  4. Inertia: most people don't even know how to use it, and, not unlike the early PC boom of the late '70s and the downturn of the 1980s, we got into a recession; boomers in management don't know how to use the tech, and the economics forbid large-scale investment
  5. China: AI hardware is gonna get more and more expensive, especially outside of the US, progress is gonna slow the fuck down, and Nvidia can't cope with its current backlog of orders
So, when this retard comes online and says shit like we should cruise-missile server farms in self-defense right now or humanity won't survive the 2030s, he gets rightfully put back in his place as a complete spastic. I think his last big break was the Lex Fridman podcast, and I doubt he's gonna be invited back there soon.
 
Such is the cycle with all tech. Something new is discovered and, like a kid finding a new patch of woods, for a while people imagine it is endless, full of untold possibilities.

But then, as the years pass, the boundaries of the tech are discovered, and many of the fears and dreams turn out to have been overblown.

The point about the "nightmare" really being a nerd's dream has really crystallized it all for me. The entire Roko's Basilisk thing is just the nerd dream of "some day my god of intellect will take revenge on you all!"
 
That's obviously been true the whole time. An "aligned" ASI would be God, and an unaligned ASI would be anti-God. Due to Foom and various handwaving, within months of this "aligned" ASI God creating itself, all the nerds would get uploaded to computer-simulation heaven. Obviously, people would ask "aligned to whom, exactly?", but when the nerds say "aligned", they mean aligned to them, in order to create their preferred computer-simulation paradise. The other people who don't want to be run on a computer, or who don't like the nerd version of heaven, can go screw themselves. Obviously, suffering (as defined by us) can't be allowed!
 