Eliezer Schlomo Yudkowsky / LessWrong

Let's try to investigate this properly using the LessWrong methods:

Let :tomgirl: be the event of a girl wanting to fuck Schlomo.
And 🤮 the event of a girl seeing this photo.

Using Bayesian methods we can express the probability of :tomgirl: occurring given 🤮 is true (i.e. the likelihood of a girl wanting to fuck Schlomo given that she has seen his photo) like this:
P(:tomgirl:|🤮) = P(🤮|:tomgirl:)*P(:tomgirl:) / P(🤮), P(🤮) ≠ 0

We can guess that P(🤮|:tomgirl:) (i.e. the likelihood of a girl having seen his photo given that she has fucked Schlomo) will be pretty high, maybe 90%

P(:tomgirl:) is going to be very low but he does have some clout in his niche community so there's going to be some crazies[1], maybe 0.1%

P(🤮) will be fairly high since he's posting it publicly on his Twitter account with an okay amount of followers, so maybe 30%

So P(:tomgirl:|🤮) = 0.9*0.001 / 0.3 = 0.003 or 0.3%

His chances tripled by posting this photo!
It must be true because math was involved.
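For anyone who wants to check the napkin math, here's the calculation as a few lines of Python. The 0.9 / 0.001 / 0.3 inputs are the post's made-up guesses, not real data:

```python
# Bayes' theorem: P(want | seen) = P(seen | want) * P(want) / P(seen)
# All three inputs are the guesses from the post above, not real data.
p_seen_given_want = 0.9    # a girl who wants him has probably seen the photo
p_want = 0.001             # base rate of wanting to fuck Schlomo
p_seen = 0.3               # base rate of having seen the photo at all

p_want_given_seen = p_seen_given_want * p_want / p_seen
print(round(p_want_given_seen, 4))  # 0.003, i.e. 0.3%
```

So the arithmetic checks out; it's only the inputs that are pulled out of thin air.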

---
[1] see Adrienne Blaire, Horse Bride, et al.
 
It's like he reflexively discounts all conventional angles of attack because he's built part of his identity around being a Very Special Case for which no conventional method could ever possibly work.
Interestingly, since that was posted, the inventor of the fad diet that Yud found didn't work (the "Shangri-La diet": two spoons of olive oil before breakfast) died at the age of 60 from a heart problem. The fact that the inventor of the Atkins diet also died from a heart problem (at 72) is certainly an odd coincidence.
 
Fad diets and people who are obsessed with intelligence go hand in hand.

I'm keeping tabs on someone who is very much like this.

They are on some kind of keto diet and have cut out everything except meat, dairy, eggs and leafy greens, and eat shitloads of vitamins and supplements. Now they are admitting that they are getting fat again and are thinking about transitioning to an all-meat carnivore diet.
 
Let's try to investigate this properly using the LessWrong methods:
Nice try, but according to your formulation P(🤮|:tomgirl:) would be the probability that a girl has seen the photo given that she wants to fuck Schlomo, which I'd say can be safely put to 100% since such a lass is more than likely insane and is stalking his Twitter. You can also do a bit better with P(🤮) by using the Extended Bayes Formula (about 50% more Bayesian), where you use the fact that P(🤮) = P(🤮|:tomgirl:)P(:tomgirl:) + P(🤮|not :tomgirl:)P(not :tomgirl:).

We can safely assume that P(🤮|not :tomgirl:) = 0 by inverting the previous argumentation, thus your new probability is:
P(:tomgirl:|🤮) = 1*0.001/(1*0.001 + 0*0.999) = 100%!
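The same sanity check works in Python for this version too. The 1.0 and 0.0 conditionals below are this reply's deliberately absurd assumptions, which is exactly why the answer comes out as certainty:

```python
# Law of total probability for the denominator:
# P(seen) = P(seen | want) * P(want) + P(seen | not want) * P(not want)
p_seen_given_want = 1.0      # assumption: a girl who wants him is definitely stalking his Twitter
p_seen_given_not_want = 0.0  # absurd assumption: nobody who doesn't want him ever sees the photo
p_want = 0.001

p_seen = p_seen_given_want * p_want + p_seen_given_not_want * (1 - p_want)
p_want_given_seen = p_seen_given_want * p_want / p_seen
print(p_want_given_seen)  # 1.0 -- garbage assumptions in, certainty out
```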

The implications for this are rather astounding. Not only did Schlomo increase his odds of getting laid to a near certainty, but the above argumentation is completely general and applies to any person on the internet. So, my fellow Kiwis, it is time to do your best Borat impersonation and post it online for the world to see. WAGMI!

Conclusion: Unironically using Bayesian (or any other) statistical methods on real-world situations which you can't define with any mathematical rigor is fucking retarded, and so is anyone who's even considering it.

Also, I knew Yud was a turboautist from reading HPMOR, but before reading the new posts in this thread I never suspected him and his orbiters to be the euphoric neckbeard stereotype incarnate. The notion that some people who actually think like this are embedded in the structures of governments/large corpos is as fascinating as it is terrifying.
 
Fad diets and people who are obsessed with intelligence go hand in hand.
Lol CICO

The naive moron in me would love to think that "rationalists" would appreciate conservation of energy and approach dieting in a common-sense way, but I guess mentally jacking off about efficiency and appearing cleverer than everyone else is more important than basic stuff like "TDEE" and "not snacking" and "replacing a few of the week's high-cal meals with a bowl of lentil soup."
 
autistic.PNG
>sane but not autistic
Yeah, sure.

In the replies he reveals the true reason for his unparalleled intellectual might - Sci-Fi novels (as everyone suspected) and... Talmudic Jew Powers (wtf).
JewPowers.PNG

I love how he directly implies that a significant fraction of people who use "numbers in their thinking" are autistic. Have you recently calculated the change you should've received in your head? I've got bad news for you.
 
Fad diets and people who are obsessed with intelligence go hand in hand.
Narcissism, to be precise the "desire to be special" aspect of it. Dieting is for normies: simply eating less and working out. Narcissists are special and require special concepts. Plus they often lack the willpower to stick with something long term, and body transformations can take years of thankless hard work.
 
In the replies he reveals the true reason for his unparalleled intellectual might - Sci-Fi novels (as everyone suspected) and... Talmudic Jew Powers (wtf).
JewPowers.PNG
Is this tweet real?

This literally reads like a fake tweet one of us faggots on this site would write as a joke.
 
Is this tweet real?

This literally reads like a fake tweet one of us faggots on this site would write as a joke.
People like Yud and fellow-traveler Scott Alexander aren't shy about genetic (Ashkenazic) Jewish supremacy. See section III:

 
How is AI alignment any different from human alignment, that is, getting people to do "the right thing"? I've never seen this adequately explained anywhere.
 
How is AI alignment any different from human alignment, that is, getting people to do "the right thing"? I've never seen this adequately explained anywhere.
1. Humans at least share evolution
2. The most clever human that ever lived only had a 3-digit IQ.
3. That human was unable to rapidly expand their own IQ by building more hardware.

That's the argument, anyway.
 
I guess it's stupid to defend a cow in his thread, but I'm going to do it anyway. First, a bit about what you would learn in a class about AI in the last 30-ish years. There would usually be something about neural nets, but they would generally not be presented as the state of the art. Instead, you would probably learn about statistical and algorithmic methods that someone would program. The systems might be automatically or semi-automatically tuned, but at their heart they were programmed by people, not trained on a massive dataset like current methods. Bayesian statistics could be used, and these algorithms were pretty unambiguously better at things like image recognition until AlexNet in 2012.

In the direction of programming the statistical or algorithmic AIs, people would try to create programming languages that would help a massive team program in everything a theoretical AGI would need to "think"/operate at a human level. Obviously nothing like that ever got very far, but quite a lot of AI researchers worked on making their own programming language, not just Eliezer. The idea was that anything the language could do to make the task of manually creating a human-level AI easier or faster would multiply out over the millions of hours required to do so.
 
I guess it's stupid to defend a cow in his thread, but I'm going to do it anyway.
I don't think people are shitting on him for making a programming language; at most they're mocking the language for being a lot of hot air.
 
Here's some discussion of the Ziz (Jack Lasota) incident on r/theschism, which is a spinoff of r/themotte, which is a spinoff of the r/slatestarcodex culture war threads, which was a spinoff of culture war discussion in the comments of SSC.
theschism_ziz.png
source (a)
There's not much new here except that screenshot of a Discord convo which I wanted to archive here:
Rat-adjacent-nazis.png
Everything else is more or less archived on archive.today. There is way too much lore to dig into here if anyone feels like doing that.
 