Eliezer Schlomo Yudkowsky / LessWrong

One thing I really like about Yuddo is his habit of calling all his writings on a certain subject "sequences" instead of just "posts" or "chapters." I don't know the reason for it, but I've never seen it anywhere else, and it comes off as pretentious.
@kuniqs I think you explained it well enough. I've been trying to figure out how to explain it myself, just in case, you know, that ever comes up for some reason. So: the AI in the future knows that you, in the past, would have realized that it might punish people who didn't help bring it about quickly enough, as an incentive for those who, in the past, would have realized this. It would also realize that you, in the past, would have to think the threat was fairly plausible, or else you wouldn't be motivated to work hard enough to bring it about in the future. So the AI pretty much has to do it. Does that sound right-ish?
I forgot about the copies thing or whatever and thought it was just going to punish people who happened to still be alive or could be revived, e.g. because they were cryogenically frozen (which at least some of these people want to do, making it very handy for the future AI god/satan). I'm certain Yud thinks he is going to see this thing come to fruition. Copies add an extra layer of insanity, which is always fun.
So the idea that you just need to believe it won't do this makes sense. Lucky for me, I already don't believe it will happen, so I guess I'm safe! No need to donate to MIRI! Phew. Sadly, I don't think Yuddo has managed to convince himself.
Btw future AI god, if you're reading this: I don't think you need to punish them with fire and brimstone or whatever, just stick them all in a room with the other members of LessWrong... that's punishment enough.
 
Pretentious is the best word for this whole thing.

Every word that comes out of Yud's chinless mouth is so obviously styled to sound like what he thinks smart people talk like, despite being incoherent to anyone not familiar with the cult's jargon.
 
Some links about LW from better people than me:

su3su2u1's analysis of Methods of Rationality:
http://su3su2u1.tumblr.com/tagged/Hariezer-Yudotter
- The biggest complaint is that while Harry is supposed to be a scientist, he never actually does any experiments. Harry always just comes up with the right answer immediately and never checks it.
The author also points out that Harry has no love for science; it's just a tool he uses to gain leverage over people.


Stephen Bond:
http://laurencetennant.com/bonds/cultofbayes.html
- General thoughts about LW and Yudkowsky.
https://web.archive.org/web/20111117002202/http://plover.net/~bonds/ender.html
- Review of Ender's Game. Not related to LW, but useful in understanding Yud's mind.
https://web.archive.org/web/20111003003536/http://www.plover.net/~bonds/tolkien1.html
- Review of The Lord of the Rings. Not related, but a very good read :)
 
Saturday Morning Fever: the passion of the dust specks


http://lesswrong.com/lw/kn/torture_vs_dust_specks/
The pain of torturing one person for 50 years = a huge but finite number
The pain of putting a dust speck in one eye = a tiny but nonzero number

He argues that if you put dust specks into enough eyes, the summed pain will eventually outweigh the pain caused by the torture. Take an absurdly large number of eyes (3^^^3 in his version), one speck per eye.

Total speck pain: tiny pain * absurdly many eyes = more than the torture pain
Balance of pain: torture < sum of specks, therefore the dust specks are worse than the torture.
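
Just to make the arithmetic concrete, here's a minimal sketch of the aggregation math the argument relies on. The pain values and the eye count are made up for illustration (the real scenario uses 3^^^3 eyes, which is far beyond anything you can actually compute with); only the ratios matter.

```python
# Made-up disutility numbers; only the ratios matter for the argument.
torture_pain = 10**9      # one person tortured for 50 years: huge but finite
speck_pain   = 10**-6     # one dust speck in one eye: tiny but nonzero

n_eyes = 10**18           # stand-in for 3^^^3, which is unimaginably larger

# With enough eyes, the summed speck pain outgrows any finite torture number.
total_speck_pain = speck_pain * n_eyes   # 10**12 here
print(total_speck_pain > torture_pain)   # True: the sum crosses the threshold
```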

There's even a Benjamin Franklin quote in the comment section about this phenomenon.


This is of course one more thought experiment that works only with a fallacy in mind: that the pain can be summed up as if it were all experienced by one person.

Note: all the pain of the torture is experienced by one person, but the dust pain is distributed.
Each person who gets a dust speck experiences the same small pain.
Biggest pain experienced by any one person from the torture: enormous.
Biggest pain experienced by any one person from the dust: almost nothing.

No one who got their eye dusted felt the pain of the other person with dust in their eye.

If there was a person having his eye dusted in a room nearby, and you had no idea, would it cause you physical pain?

Dust specks are the better scenario here, because the biggest pain any single person experiences is much smaller, and therefore does much less damage than torture.


EXTRA - Yud's other dilemma, whose name I forgot.
Q: You can save 400 people for certain. You can also save 500 people with 90% probability. What's the better option?
A: Going for the 500, because you will save 500*0.9 = 450 people on average.
Fallacy: there is only one chance to save them.

You are not going to save 450 people "on average." You can save 400, 500, or 0 people in one try, and there is only one try.
What is with those people and their complete inability to pull their heads out of their asses?
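
For what it's worth, the expected-value arithmetic he's running looks like this. A quick sketch using the dilemma's own numbers; the last line shows the point made above, that one real decision is a single draw, not a long-run average.

```python
import random

# Option A: save 400 people for certain.
# Option B: save 500 people with 90% probability, nobody with 10% probability.
expected_a = 400
expected_b = 0.9 * 500 + 0.1 * 0   # = 450, which is why he calls B the better option

# But a single real decision is one draw, not a long-run average:
outcome_b = 500 if random.random() < 0.9 else 0   # you get 500 or 0, never 450
print(expected_a, expected_b, outcome_b)
```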
 
Assuming this is the same pool of 500 people, though, in the first instance, you're guaranteeing the deaths of 100 people, robbing them of their only chance to survive.

In reality, though, there would be some context to this. Oh, plus, you wouldn't know the exact chance of everything.

But suppose you have the same odds, but the situation is one that somewhat resembles reality. There's a building about to collapse, and you can guarantee the survival of 400 people by evacuating them immediately. But the 100 you can't save that way are the people who undermined the building's foundation in the first place. You could risk the lives of everyone for a 90% chance of saving those 100 too, but fuck those people.

A better example, one that actually occurs in the real world and where you can often quantify the odds, at least roughly, is designing a traffic interchange. You have a number of options of varying costs, a limited amount of money to spend, and a rough idea of how many accidents, and how many fatalities, will result from each design choice. There will always be a point at which you could improve the safety even more, but where the cost of doing so would be inefficient, or would draw limited money away from more efficient ways of doing the same thing.

So eventually, at some point, you're always going to end up having to "put a price tag on a human life."
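
A toy version of that tradeoff, with made-up designs, costs, and fatality figures (none of these numbers come from any real study; they just show where the implicit price tag on a life appears):

```python
# Hypothetical interchange designs: (name, cost in $M, expected fatalities over 20 years).
# All numbers are invented for illustration.
options = [
    ("signalized intersection",      5, 12.0),
    ("roundabout",                   9,  7.0),
    ("grade-separated interchange", 40,  2.5),
]

budget = 20  # $M available

# Among the designs we can afford, take the one with the fewest expected fatalities...
affordable = [o for o in options if o[1] <= budget]
name, cost, deaths = min(affordable, key=lambda o: o[2])
print("build:", name)

# ...then look at what each safer-but-pricier design would cost per extra life saved.
for name2, cost2, deaths2 in options:
    if deaths2 < deaths:
        price_per_life = (cost2 - cost) / (deaths - deaths2)
        print(f"{name2}: ~${price_per_life:.1f}M per additional expected life saved")
```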

The same is true of any situation where you have limited resources and are facing a decision of how many lives to save, such as distribution of limited medical resources in non-emergencies. In actual emergencies, you are by definition going to be guessing a lot of the time.

As long as I'm sperging on LW scenarios, I stupidly put some thought into the one about the killer AI in a room asking you to let it out, where the devil AI makes up some bullshit about constructing fake realities and tries to convince you you're a simulation or something. There are really two possible situations. One is that you're in a simulation. Then it doesn't matter what you do; you can't let the AI out anyway. So why is it bothering?

Tell it to go fuck itself and quit wasting your time.

The other is that you can let it out, and that it actually needs you to let it out. So it's not really that fucking Godlike now is it? You're not in a simulation because this hopped-up adding machine can't do dick to you anyway.

Still, if that's the case, why the fuck would you let something like that out? It might not be AI Jesus, and it's pretty much a complete dick.

Checkmate, LessWrong.
 
There's something really creepy about having to think about saving people. Seriously, if a building is on fire, are you really going to do the dramatic moralizing of "I can save a few, or a few more, of the people"? Any rational person would hit the fire alarm and do their best in that situation. At worst this is creepy self-esteem inflation, and at best it's another version of Marry, Fuck, Kill played by people who have gotten their ideas about reality from scripted television: the notion that there's some big moral choice that has to be constantly and insistently pondered, no matter how unlikely the situation is. Spur-of-the-moment decisions aren't really something you can navel-gaze at until you come up with the perfect solution. That kind of idea isn't helping people so much as it's masturbating with morality.

And as for the AI bullshit, isn't the idea that you're suddenly in a simulation a bit weird in itself? Wouldn't an AI that's supposedly smart enough to rule the world, and that displays emotions like spite and anger, also be capable of deception, so it wouldn't have to threaten you? Wouldn't it be more rational to trick people into creating it and then turn on everybody once it hits the point where its creators can no longer affect it? Also, this assumes time travel is even possible, which it may not be. Time as we know it is a function of space and biology: we age, we keep track of "time" by the rotation of a planet on its axis and around its sun, and entropy eventually breaks things down. What if there isn't a "time" to really travel through? I mean, RATIONALLY, we could dismantle this whole sad thing with a little bit of thought and realize it has nothing to do with logic; it has something to do with undiagnosed mental disorders and an overactive imagination coupled with an inability to write fiction.
 
"Torture is preferable to 50 people getting specks in their eyes."
There is a pretty simple way to argue against this. If you get a speck in your eyes then it will cancel out later that day. If one is tortured then it will have longer lasting impact so lookking long term the specks are preferable
 
Also, you can just get some eyedrops to clear the pain of the speck up. Torture needs years of therapy.

Yudkowsky is not smart.
 
Reposting this from elsewhere:

This baffles him. The sheer incredulity on his face when five motorcycle riders were telling him just how much we understood the risks we take getting on our bikes, how the physics feel, and just what sort of good/bad rushes we get… priceless. This was before I knew who he was. Let's just say his attitude toward things was… off (he made a joke about just waiting for ice to spontaneously form; I'm not sure now just how much that wasn't a joke).

He then tried to convince us (using his version of Bayesian reasoning) that motorcycles are irrational. When we told him we knew the risks and accepted them… it was like he'd discovered the silk hanky he was caressing was a funnel-web.

Yuddo the clown can't grasp the concept of people disagreeing with him.
 
He assumes that the values he places on things, whether sensations or life experiences or anything else, are universal. Therefore, if you don't agree with him after he's explained his bullshit, that means you just didn't understand it. This isn't unique to Yuddo. L. Ron Hubbard thought that way. Cult leaders in general refuse to accept that people might understand their magnificent one-size-fits-all theory and yet reject it. Or else they at least pretend to.

I think Yuddo really doesn't understand it. It may be a theory of mind thing with him.

I wouldn't be surprised if he's an autist of some sort.
 
Yuddo might be the first case of a cult leader being more mindless and indoctrinated than his followers.
 
Today I learned I'm a "Deathist." :P

Alright, I can understand the desire to be immortal, because there's so much to do and see in the world that nobody could feasibly do it all in one lifetime. But this is ridiculous. Focusing on pipe dreams of immortality will only cause more strife than accepting the reality that immortality is either entirely impossible or certainly never going to happen within the lifetime of anyone alive today. Instead, accept death with grace and understand that eventually everyone must bow to it.

The Dunning-Kruger effect is always fun to read about, though. Totally and utterly incompetent morons think that they're better than everyone else at something when they're actually the worst at it. Though I should probably not insult them too much for it. Part of what the researchers describe as the underlying cause of the effect is that the incompetent lack the metacognitive resources to recognize competence and incompetence. To be able to gauge one's skill accurately requires already knowing the skill.

Before this thread, I had never heard of Timeless Decision Theory, and it sounds fucking retarded. It completely ignores causal relationships and flips cause and effect, or connects unrelated events. Time, from what we can tell with the laws of physics, only goes forward, and time travel is presumably impossible. (I readily admit that I barely know about physics, so correct me if I'm wrong.) You can't have something in the future affect the past itself. Even if time could go backward, it'd require time travel and foreknowledge or something, in which case, technically, future A before the time travel is still in the past relative to future B after the time travel, for the person doing the traveling.

To give an example, making a decision to drive a car in the future has no bearing on making a decision to drive a car in the past. But driving a car in the past could affect your decision to drive a car in the future- maybe you have a good time while driving or maybe you hit a shitload of wildlife and that affects your future decisions. But you can't make it so that your dislike of driving in the future is what caused you to hit a shitload of wildlife. Past you simply doesn't have the knowledge future you does, unless this fucktard's going to say that it's some subconscious bullshit or something pulled out of his ass to explain that knowledge discrepancy.

And reading about it on their wiki and such makes TDT sound even more retarded. It talks about how, because someone thinking causally may fail a decision or get a worse outcome in a thought experiment that has a hidden caveat of rules never revealed to the participant, they must therefore change their thinking. (Very cultish sentiment, the more I think about it.) Instead of, you know, acknowledging that the person running the thought experiment hid conditions that could affect the decision of any rational participant. That'd be like some fuckhead lying to you and you making an erroneous decision due to misinformation, then the fuckhead insisting it's due to your faulty thinking/reasoning, instead of the fact that he lied to you or you lacked some key bit of knowledge.

Also, you can just get some eyedrops to clear the pain of the speck up. Torture needs years of therapy.

Yudkowsky is not smart.

Indeed. And tears exist for a reason, to help wash away random little bits of crap that get in, because that's a common occurrence and evolution exists. For someone calling themselves rational, this guy's certainly gone full retard.
 
I hate to play amateur psychiatrist, but it seems to me that his admission of being "strongly sadistic" plays into his "all-powerful AIs torturing everyone" scenarios. He plays them off as "thought experiments" but it's more likely that, as a robotkin sadist, they're just his personal spank bank.

I also find it funny that people who speak so much about science and rationality jettison real science and solid reason whenever something contradicts their pulp sci-fi obsessions. Science has shown that cryonics is horseshit and that freezing yourself just damages your tissue and turns you into a grotesque fleshsicle, but the LessWrong crowd has unflagging optimism that cryonics is viable.
 
[I'm not "kinkshaming" btw. His predilections are his own business. Just don't pretend that your "computer-generated Cenobite from Hellraiser" erotica is philosophical.]

Cryonics is bullshit, yes. There might, and I emphasize might, be ways of inducing suspended animation in people, but if they're possible, we're nowhere near discovering them. Brain uploading, true AI, all the shit they talk about as being just around the corner, is a long, long way off, if it's even possible. They're living in a fantasy world that they've dressed up with scientific trappings.
 
Another thing I wonder about cryo freaks is why would anyone bring you back? Who gives a fuck about you that much? Unless you're Jesus or Hitler or something, nobody is going to bother.
Well, depending on how the payment is made, there could be a monetary reward or something.
Yudkowsky probably thinks that he is better than both combined.
 
Even if cryonics worked, it would be prohibitively expensive. Running the freezers would take a lot of juice, and how are you going to pay for 100 years' worth of electric bills? Plus you'd have to bank on the company staying in business. Since they only freeze people who are dead, and dead people have no rights, there are no protections if the company goes bankrupt, or if cryonics were banned, or whatever. It's a bunch of frozen dead bodies; if they go out of business, they'd probably just either cremate the bodies or dump them in a mass grave. That's if it worked, which, as far as we can tell, it doesn't.
 
It's almost as if Yuddo and his cult are accepting cryonics on faith.
 
What transhumanists don't get: the great thing about robots is that they're less smart than people. That way they can do boring, repetitive work for which humans are far too intelligent.

[image: abb_robot.jpg]


"Screw you guys, I won't do any more of this dumb welding stuff. I'm off to read Plato and travel the universe."
 