Eliezer Schlomo Yudkowsky / LessWrong

okay this was actually funny:

1683271695358.png


how do we get him to be this way all the time, and not a reddit moderator personality with occasional self-awareness

or in other words: how do we do yudkowsky-alignment
 
I didn't know Roko was even more retarded than Yuddo.

(to mods: I made a different comment for this because it's a different topic + different mood)

Roko is a narcissist for real, not just a guy who is extremely full of himself. I don't think Eliezer is a clinical narcissist, for example; I think he started as a homeschooled kid and was gassed up by his internet community so much that he now can't NOT imagine himself as one of the most important people in the world. Eliezer absolutely molded LessWrong into an insular clique, but the cult-like elements of LessWrong truly started to happen, IMO, with Michael Vassar.

(Michael Vassar's influence is an INSANE story, detailed at https://archive.is/8A8QO; set aside half an hour if you read it, because the rabbit hole is huge.)

Roko, though, actually has grandiose beliefs about himself and an ego that gets in the way of achieving his goals. I archived two posts by him, because I think he probably (though we'd have to get a clinician to actually confirm) meets the criteria for narcissistic personality disorder. There are two threads where this got ridiculous:

1. His attempt at rationalizing failed weight loss once he hit a plateau: https://archive.is/gRVPk

2. This is the much more insane thing: his rationalization that NASA rejected his space elevator project because ... NASA did not understand it. https://archive.is/DOt4A This is a direct quote: "The fact that their rejection doesn't make sense is an indication that they didn't bother to understand it."
 
2. This is the much more insane thing: his rationalization that NASA rejected his space elevator project because ... NASA did not understand it. https://archive.is/DOt4A This is a direct quote: "The fact that their rejection doesn't make sense is an indication that they didn't bother to understand it."
Do you have a link to his space elevator proposal? I need more comedy in my life.
 
So it's not even his proposal, but a proposal by someone else. He clearly doesn't understand why it's not viable with our current technology.
A persistent problem with these loons seems to be that they can't tell the difference between science fiction and reality. They live up their own asses in the world of Fucking Magic where the complexities and limitations of Actual Machines don't apply (because they didn't think of them). If something on Star Trek seems like it could exist, that means it can exist and will exist; we just haven't invented it yet.
 
I finally remembered the place where I stumbled upon comments from someone who allegedly met Yudkowsky.
cant program and is proud of it.png
cross fondling.png
cryogenic alert pendant.png
motorcycles are irrational.png
pay doctor to cut your head off.png
sjg needs prep study.png
tells people how important he is.png
too valuable to die.png
@Alfred MacDonald does he really walk around with a cryo alert pendant?
source
archive, not very useful 'cause it's in 40+ pages of comments
 
Last edited:
Like most "experts" cited by the fake news,

Oh, I'm aware. Usually those experts try to keep some level of detachment to keep from being exposed as obvious half-wits and look like Serious People, not show their ass on Twitter all day. I said it in another thread, but I see a lot of similarities between these guys and the heyday of the military reform 'defense theorist' movement: a cult of personality built around 'rogue experts' who have little to no actual experience in the field, warning of an imminent disaster if the entire world doesn't do what they say, right now, with a messiah complex thrown in; a closed ecosystem of the same half dozen people writing think pieces quoting each other; the publicity seeking. I'm not familiar enough with the technical aspects of the field to say, but I'll hazard a guess they also use ill-defined technical terms they gatekeep to avoid debate (e.g. 'Ah, yes, to the uninformed I may look like a complete retard, but that's because you don't understand Breysian Triple Reverse IPv6 350SB logic') and misconstrue actual experts and the long dead to give themselves legitimacy.

So, ten-year predictions on these guys, assuming AI hasn't killed us all for our precious carbon by then, if they follow that path. The weird religious overtone thing they've got going may skew it, though.
-They will retrofit their initial predictions to reality, claim their model was right and everyone used it, and use this to say we should listen to them now because the actual for real apocalypse is on the way. The building legitimacy arc. ('Over twenty years experience in the AI field')
-Find something that works pretty good and claim actually it's terrible and needs to be replaced, starting the contrarian arc.
-Another outsider with an idea will emerge and be embraced. Except this is a good idea and will be adopted by the industry. The outsiders will then turn on them with the fury of a thousand suns and label them the root cause of the hellworld we live in (everything will be fine)
-A prominent figure will develop schizophrenia and start finding hidden messages about the CIA somewhere and abandon AI. Not being flippant, it'll go down exactly like that.
-They will cope and sneed into the sunset, getting book deals and occasionally making an appearance on RT or something about how the next LLM model will kill us.
-One will become a political extremist who doesn't understand basic concepts of his own ideology.
 
Last edited:
So it's not even his proposal, but a proposal by someone else. He clearly doesn't understand why it's not viable with our current technology.
I thought the main issue and why this idea is science fiction rather than science at present is we don't have materials compatible with doing it, and while it's possible it will become practical to manufacture such materials in the future, we aren't there yet. You would also still, I think, have to put in as much energy as for a rocket, especially if you were using something like a cable because then you'd have to be lifting all that cable as well.

I suppose you could do something like use space junk as a counterweight by lowering it to Earth while raising whatever you were trying to get into space, but it just isn't practical right now, and we already have perfectly good ways of getting things into space at less cost than ever before. But yeah I'm sure NASA is just dumb and doesn't understand the genius of a guy who is terrified of an imaginary cyberdemon that might come into existence in the future if like a million completely unlikely events occur.
Fellow tribesman and professional midwit-panderer Lex Fridman, who gets most of his views from sci-fi movies, echoes Yudkowsky's delusions about AI gaining sentience (a position frequently mocked by actual neuroscientists) on his Twitter:
I think it may eventually happen, although I believe this midwit meant "sapience" and not "sentience," which are two very different things. Even a crayfish is sentient to some extent. Humans (and possibly elephants, parrots, and cetaceans) are the only sapients.
 
Last edited:
You would also still, I think, have to put in as much energy as for a rocket
And just like that you've already put more thought into the idea than Roko himself did. Conservation of energy didn't go anywhere; the extra energy to accelerate the rocket by the required ~7 km/s has to come from the runway's engines.

Taking the rocket to be a Falcon 9, with a mass of ~10 t, you'd need to supply 2.5×10^11 joules of extra energy. He states that the runway will need to be at least 500 times heavier than the approaching rocket, so this would correspond to the runway being decelerated by ~300 m/s. Say we'd like to recover this lost velocity in the time it takes for the rocket to clear the runway: using his 20 km long version and assuming a velocity differential of ~7 km/s, we get ~3 s to get everything back on track, requiring our engines to supply 5×10^8 newtons of thrust. Roko claims ion engines would be sufficient because of their "high efficiency". He completely neglects to mention that the thrust one unit provides is less than 1 N, requiring >500 million engines for this monstrosity to work.

The above took about 10 minutes to work out and required no more than high school physics knowledge and access to Wikipedia. So his "proposal" could've probably been torn to shreds by a clever high-schooler or your average KSP autist.
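Since the whole point is that a high-schooler could check this, here's the arithmetic as a quick Python script. The ~10 t rocket mass, 500× runway mass ratio, 20 km length, and ~7 km/s differential are the post's own assumptions, not real hardware specs:

```python
# Back-of-the-envelope check of the orbital-runway numbers.
# All figures are the post's stated assumptions, not real Falcon 9 specs.
m_rocket = 10_000.0          # kg, rocket mass (~10 t)
dv_rocket = 7_000.0          # m/s, velocity the rocket must gain
m_runway = 500 * m_rocket    # kg, runway "at least 500x heavier"

# Kinetic energy the rocket has to be given
E = 0.5 * m_rocket * dv_rocket**2          # ~2.5e11 J

# Runway slowdown if that energy comes out of 0.5 * M * dv^2
dv_runway = (2 * E / m_runway) ** 0.5      # ~310 m/s

# Time for the rocket to clear a 20 km runway at ~7 km/s
t = 20_000.0 / dv_rocket                   # ~2.9 s

# Thrust needed to restore the runway's velocity in that time
F = m_runway * dv_runway / t               # ~5e8 N

# Ion thrusters deliver well under 1 N each
engines_needed = F / 1.0                   # >500 million engines

print(f"E = {E:.2e} J, runway slowdown = {dv_runway:.0f} m/s")
print(f"thrust needed = {F:.2e} N, ion engines = {engines_needed:.2e}")
```

Ten lines of arithmetic, and the ">500 million ion engines" conclusion falls straight out.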
 
Roko claims ion engines would be sufficient, because of their "high efficiency". He completely neglects to mention that the thrust one unit provides is less than 1N, requiring >500 million engines for this monstrosity to work.
That's kind of why they're only used in SPAAAAAACE. And not for hopping out of gravity wells. Almost every part of this proposal only makes sense if you imagine like every bit of space technology that could possibly exist already exists. Like this would probably work if you had Sinclair monofilament from Ringworld.
 
Taking the rocket to be a Falcon 9, with a mass of ~10 t, you'd need to supply 2.5×10^11 joules of extra energy.
The energy isn't the issue, it's the expensive way we have to deliver it. The kinetic energy of the rocket is less than 1/14th of the energy in the fuel of an Airbus (≈0.074 × the maximum fuel energy of an Airbus A330-300, 97,530 liters of Jet A-1 ≈ 3.4×10^12 J, per Wolfram). Not an inconceivable amount; it's just that rockets are really fucking bad at delivering the energy. Everything else is spot on though.
 
I think it may eventually happen, although I believe this midwit meant "sapience" and not "sentience," which are two very different things. Even a crayfish is sentient to some extent. Humans (and possibly elephants, parrots, and cetaceans) are the only sapients.
AI will never develop sentience any more than your dishwasher or computer will, not in a million years. It's a machine and will always stay a machine, no matter how complex and elaborate it becomes. I don't think most people are aware just how ridiculous the idea of AI consciousness is - AI doesn't have a mind; it can't turn light rays into color, vibrations in the air into sound, or electrical signals into thoughts and emotions...

He also believes humans will colonize Mars (-63°C | practically no atmosphere) in the foreseeable future:
Screenshot_20230514-101540.jpg
Screenshot_20230514-101647.jpg
Screenshot_20230514-102506.jpg
Screenshot_20230505-235728.jpg
 
AI will never develop sentience any more than your dishwasher or computer will, not in a million years. It's a machine and will always stay a machine, no matter how complex and elaborate it becomes.
There's no way humans will ever develop sapience, because all they have is 3 pounds of meat in their skulls and most of that's cholesterol.
 
The gay furry who produces the podcast Blocked and Reported (best known here for doing one of the only decent summaries of the Kiwi Farms/Keffals feud) said that he thought about doing an episode about Ziz but decided not to because it was just too crazy:

"...I was enthusiastic about diving into the madness, but ultimately there were a few sticking points that left us uneasy about covering it in podcast form. I did a fair bit of prep that I may have to put to use properly at some point, though. Still keeping a vague eye on it."

 