Eliezer Schlomo Yudkowsky / LessWrong

He was presenting arguments for babies not being conscious, and Destiny's even worse arguments somehow got him onto the topic of causal arrows, and how you would not be able to get any information about a world where causal arrows only entered but never left, and then he leaned toward the camera and said "isekai" in a stage whisper. He followed up on it a bit later, but I think he's advised on a couple of fan fiction and pseudo-isekai projects, and has given general advice to people writing fiction, so I assume that's where he got it from.
Somehow, that's exactly how I thought that particular debate would look between these two, and it checks out with what I've seen of him on Ross' video.
You would think that with how much of a hot topic AI is these days, he would try to use more down-to-earth scenarios, like how quickly AI can learn to produce not just text but also audio and visual content, to reach the common man whose job might be at stake. He would have a much bigger reach, even if he does believe in his retarded doomsday conspiracy, and it's not a stretch for most people to imagine how much more advanced AI could get in the coming years and how disastrous giving it control over nukes or wartime forces could be (the Ukraine war is another relevant topic).
But no, isekai. Surprised he didn't mention his Harry Potter fanfic while he was at it.
 
Has he claimed to have a GF before?
1696316949217.png

 
IIRC he's married and he has kids maybe?
His "primary partner" has kids; a few pages back he was posting about entertaining one of them with a truly Islamic Elsa/Spiderman mashup. Which, naturally, he used AI to create. I know that when I'm existentially terrified of a piece of technology, I definitely teach my kids that it's a fun toy and associate it with cartoon characters. It really is a death cult.
 
Nah, you're being retarded. Believing that homo sapiens are going to take over the world doesn't mean you're terrified of orangutans.
 
That's too sophisticated a line of thought for Eliezer "AI will literally take over factories and start producing uranium" Yudkowsky.
Uranium for what, nukes? Those haven't been on his radar for a long time. I don't think you've been paying attention if that's the example you use instead of "diamondoid bacteria", which he's repeated a hundred times in the past decade.
 
Making a synthetic virus that spreads and kills everyone at the same time, like he's theorizing, is even harder than building nuclear weapons.
He's retarded.
It's time we stop giving clowns like him attention and money, before they either poison the well of arguments against automation and AI like Alex Jones did, or some retard decides to car bomb a server farm.
 
Yud is perhaps oversharing details about his sex life

Archive
Oversharing is par for the course for rationalists; they see no downsides to sharing personal intimate details with the entire world. Of course Schlomo has to do it here to defend his fellow rationalist and lolcow Aella.

Bonus: Anyone who doesn't like this just needs to "visit Earth in person"!
visitearth.png
source
 