Eliezer Schlomo Yudkowsky / LessWrong

Hilariously, the Big Yud was criticized for writing this because he wasn't raging along with the mob. At least all of these are actionable, and many are reasonable ideas, better than any hooting about abolishing the police.
 
Yud's the same guy who says "shut up and multiply", i.e. be a utilitarian and look at the raw numbers of casualties instead of going with your gut feeling. So it's a bit disappointing that he doesn't realize the number of people killed by police (whether the force was justified or not) is at most in the double digits yearly, while the number of people killed by crime and murder dwarfs that figure. Guess this is just the engineer's fallacy at work.
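To spell out the "shut up and multiply" arithmetic being gestured at, here's a minimal Python sketch of that naive raw-casualty tally. Every number in it is a made-up placeholder for illustration, not a real statistic, and the assumed rise in crime deaths is purely a made-up input:

```python
# Naive utilitarian "shut up and multiply" tally: compare raw expected
# death counts under two policies. ALL numbers below are placeholder
# assumptions for illustration, not real statistics.

deaths_from_police_per_year = 50        # placeholder assumption
deaths_from_crime_per_year = 20_000     # placeholder assumption
crime_increase_if_defunded = 0.05       # assumed 5% rise in crime deaths

# Status quo: both sources of deaths exist.
status_quo = deaths_from_police_per_year + deaths_from_crime_per_year

# Defunded: police killings go to zero, crime deaths rise by the
# assumed fraction.
defunded = deaths_from_crime_per_year * (1 + crime_increase_if_defunded)

print(f"status quo: {status_quo:,} expected deaths/year")
print(f"defunded:   {defunded:,.0f} expected deaths/year")
# Under these made-up inputs the raw-numbers tally favors the status
# quo, which is the point of the multiply-don't-emote framing.
```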

The closest rat-adjacent person who gives sane takes on the police is Graham Factor; he's been cited a couple of times on the SSC Substack.
 
Graham Factor is a big fat cop apologist who should suck-start his service weapon; it might get the taste of boot leather out of his mouth. The bastard defended the Uvalde cops, wants policing more akin to the European model of getting the shit kicked out of you to force confessions, and thinks of the Fourth Amendment as an impediment to closing cases.
 
Yeah, he swings and misses a lot, like all the other rat boys do. That Uvalde article was a disgrace. And obviously I would much rather live in a country with human rights than whatever horror show is going on in Europe. However, something still needs to be done about the cultural diversity killing this country, and part of it comes from the people propping up BLM, getting police defunded, and worsening crime.
 
su3su2u1's analysis of Methods of Rationality:
http://su3su2u1.tumblr.com/tagged/Hariezer-Yudotter
Backed up version - https://danluu.com/su3su2u1/hpmor/ (Archive)

su3su2u1 apparently got himself involved in some drama and took down his tumblr in 2016, after it was discovered that he used sockpuppets to sperg under rationalist blog posts, frequently lying about his expertise. Seems he was comparably autistic to the LessWrong spergs. Summary of the events (archive).

It would be interesting to find his original posts made in opposition to Yud's spergery, but all of this is ancient history by this point. His analysis of Methods of Rationality is largely on point: the key issues plaguing the story (long, boring text dumps filled with incomprehensible jargon and a disturbing lack of actual science) are identified correctly.
 
Scott Alexander's followers at r/themotte have decided to break away and form their own site, based on the rdrama.net codebase.

1.png
2.png
 
I wish them all the luck in the world. r/themotte is one of the few places where you can debate issues without it devolving into bannings for wrongthink or slurs all around; it's a special community.
So... Kiwi Farms. And the next target if we go down, then?
 
Oh shit. I always knew in the back of my mind these dweebs had a thread somewhere, but I never bothered to read any of it before. And things are a little different now that AI is actually taking over. I discovered Schlomo in 2000 when I asked the old-school, cutesy-fun Google what the meaning of life is. As if that fucking AI knew what it was doing even back then, it answered with one of this kid's essays (he was 19 or 20, I think). (Google: "The meaning of life is to donate to MIRI so I get booorn. Do eeet.") As a young nerd who'd never read such a take on AI, and had never heard of transhumanism, I ate it up. In those days all the action was on the SL4 mailing list, owned and operated by young Schlomo. As funny as the guy is, and as much as everything he ever ring-led made MENSA look like a humility seminar, I hate to say it, but everyone underestimated him because he didn't produce what was (and still is) considered "AI research" by people who think they're less arrogant than he is, but aren't.

Yud himself stated that he made LessWrong because he realized the biggest impact he could make would be to form a cult around his brand of "rationality" and train up a whole generation of little Schlomos. Which is just what he did, and though there was a long lull after LessWrong 1.0 diaspora'd, LessWrong 2.0 pulled a State of Israel 1948 and is going strong again. And all them little Schlomos are working wherever the hell they want (Google, Facebook, dedicated AI companies, NSA, etc.) because you can be insufferable and still get those jobs if you're that type of person. Yud had a side gig advising hedge funds. Those types of people don't give a fuck about your education if you can be right enough to make them money. And big tech doesn't give a fuck either. Oh, they do for mids. But if you actually have a 150+ IQ and know your shit cold, they do not care about your credentials. How I know this is powerlevel.

Yud always attracted under-the-radar but effective people. SL4 was mainly him arguing with Ben Goertzel all day, every day. They ended up working together sometimes. The list was also full of Extropians (life extension, cyborgs, cryonics -- mark my words, the trans movement might as well have been one of their projects, if it wasn't actually), and cypherpunks / pre-Bitcoin people like Wei Dai (b-money, a precursor to Bitcoin) and Hal Finney (Satoshi handed off the Bitcoin project to him, and it's widely speculated Hal worked on it too). Dai and Finney were plenty active on LessWrong 1.0 too.

All those LWers may have been arrogant Spock impersonators with all sorts of weird beliefs they had to hold, because the rationality made them. But they weren't stupid. Most of them were college students at top schools studying mathematics, computer science, and machine learning. They spent tons of time in that forum hashing out their ideas, with Eliezer always in the wings, hands-off but quietly refereeing. Where do you think all those kids went when they graduated? Shane Legg cofounded DeepMind, the company that made AlphaGo and AlphaFold. Google bought them for a song: half a billion. Microsoft paid four times as much for Minecraft. Maybe ESY isn't a "real AI researcher," whatever the fuck that means. But a shitload of his brain-children, who learned most or all of what they know about AGI from him, whether they think they did or not, are "real AI researchers" with access to infinite resources. And whereas everyone was laughing at EY and the idea of AGI 20 years ago, everyone who matters is working on it and hiring for it now. And a lot of people still think he's a crank about the friendliness ("alignment") stuff. At least he's trying not to wreck civilization. Most of the rest are just going to go ahead and wreck civilization. Because again, arrogance. For as arrogant as he is, he has a spot reserved for humility. From everything I saw on LW, the rest of them just don't. They were born without it.

I guarantee you a disproportionate number of them are trans, too.
 
Lol dude, those projects you're talking about are basically entirely Silicon Valley bullshit and hype. Google has heaps of money to piss away on dumb ideas, for the 2-3% that actually manage to turn a profit at some point.

Like, allegedly DeepMind finally turned a profit a year or so ago, except they're extremely quiet about how they're doing it. Almost certainly the few competent programmers they have on staff are being used to optimize Google's advertising department or something like that. That's the profit they're showing.

The business model for this nonsense is to buy up a lot of companies with smart-sounding sperglords, then realize basically none of them know what they're talking about and have just fucked around with TensorFlow and watched too many sci-fi movies about AI, and then carve up the companies, sell off 90% of the intellectual property to smaller VC firms (read: suckers), and retain the talent they can actually use for something practical.
 
So in May, our favourite rationalist intellectual escort tried to learn to code. It didn't exactly go well:
View attachment 3743460
(archive)
It looks like she sustained lasting damage:
View attachment 3743464
View attachment 3743466
Well, she's an incredibly insecure person who relies on simps for validation. She thought she was a genius because "muh self-taught data science," when her data science is just writing surveys about people's sex lives.
The fact that she can't use the command line may have been a reminder that she's not as exceptional as she thought (and therefore still subject to the rules and wisdom that apply to normal humans - see the female sexual marketplace: aging). Anything along these lines triggers a sperg-out for her.
 
I dug through a bit of this escort's Twitter history, and here are some interesting bits:

No need to plan for the future:

1.png
(archive)

We will be able to just change our bodies instead:

2.png
(archive)

A transcendent AI god will just make her into weird fetishes:

3.png
(archive)

fuck u tradcaths:

4.png
(archive)

Only wants to focus on one man at a time; does not like men who want to focus on many women at once. just_poly_things.jpeg

5.png
(archive)
(smv = sexual market value)

Having trouble finding a man she actually wants.
6.png
(archive)

She wants people to pay her to be a dating coach.
7.png
(archive)

OnlyFans

8.png
(archive)

Poly as a solution for cheating.
12.png
(archive)

Dolezal positivity

9.png
(archive)

10.1.png
10.2.png
(archive)

p1.png
(link | archive)

p2.png

(link | archive)

p3.png

(link | archive)

11.png
(archive)

I've also attached an mp4 of her surgery results. Not sure how to inline it in the post, though. The original tweet is here: (link | archive)

"Conventional human morality"
13.png
(archive)
 
Eliezer wrote a rambling post about how it's okay to keep the money that Sam Bankman-Fried stole and gave to Effective Altruism groups. One would assume that improving the lives of the million-plus people who lost their savings would be an act of effective altruism, but nope.

IMPCO, don't injure yourself by returning FTXFF money for services you already provided

by Eliezer Yudkowsky 10 min read 11th Nov 2022

In my possibly contrarian opinion, and speaking as somebody who I don't think actually got any money directly from FTX Future Fund that I can recall; also speaking for myself and hastily, having not run this post past any other major leaders in EA:

You are not obligated to return funding that got to you ultimately by way of FTX; especially if it's been given for a service you already rendered, any more than the electrical utility ought to return FTX's money that's already been spent on electricity; especially if that would put you to hardship. This is not a for-the-greater-good argument; I don't think you're obligated to that much personal martyrdom in the first place, just like the utility company isn't so obligated.

It's fine to be somebody who sells utilons for money, just like utilities sell electricity for money. People who work in the philanthropic sector, and don't capture all of the gain they create, do not thereby relinquish the solidity of their claim to the part of that gain they do capture, to below the levels of an electrical utility's claim to similar money. The money you hold could maybe possibly compensate some FTX users - if it doesn't just get immediately captured by users selling NFTs to Bahamian accounts, or the equivalent in later bankruptcy proceedings - but that's equally true of the electrical utility's money, or, heck, money held by any number of people richer than you. Plumbers who worked on the FTX building should likewise not anguish about that and give the money back; yes, even though plumbers are probably well above average in income for the Bahamas. You are not more deeply implicit in FTX's sins, by way of the FTX FF connection, than the plumber who worked directly on their building.

I don't like the way that some people think about charity, like anybody who works in the philanthropic sector gives up the right to really own anything. You can try to do a little good in the world, or even sell a little good into the world, without signing up to be the martyr who gets all the blame for not being better when something goes wrong.

You probably forwent some of your possible gains to work in the charity sector at all, and took a bit of a generally riskier job. (If you didn't know that, my condolences.) You may suddenly need to look for a new job. You did not sign away your legal or moral right to keep title to money that was already given you, if you've already performed the corresponding service, or you're still going to perform that service. If you can't perform the service anymore, then maybe return some of that money once it's clear that it'll actually make its way to FTX customers; but keep what covers the cost of a month to re-search for a job, or the month you already spent searching for that job.

It's fine to call it your charity and your contribution that you undercharge for those utilons and don't capture as much value as you create - if you're nice enough to do that, which you don't have to be, you can be Lawful Neutral instead of Lawful Good and I won't think you're as cool but I'll still happily trade with you.

(I apologize for resorting to the D&D alignment chart, at this point, but I really am not sure how to compactly express these ideas without those concepts, or concepts that I could define more elaborately that would mean the same thing.)

That you're trying to be some degree of Good, by undercharging for the utilons you provide, doesn't mean you can't still hold that money that you got in Lawful Neutral exchange for the services you sold. Just like any ordinary plumber is entitled to do, if it turned out they unwittingly fixed the toilet of a bad bad person. Not fixing any more toilets for them is one thing; giving back the previous payment is another.

Trying to be a better person than average does not make it any more your responsibility to fix the broken pieces of the world elsewhere, than it is the electrical utility's responsibility to make FTX's customers whole for the money that FTX spent on electricity. The fact that you're trying to be a Good person doesn't make that responsibility land on you. It means you tried to be nicer than most people are, one day of your life, and nobody - definitely nobody who is not you - gets to demand that you now be nice on another day too. You do not, be it clear, get to say, "I was nice one day, so now I get to be nasty this other day" - we don't trust your accounting of that - but other people do not get to tell you, "Now you must be even nicer on this other occasion!"

FTX's misdeeds are not something that their electrical utility could've reasonably known in advance, and neither could you, because we're all distracted and competence in general in this civilization is in very short supply. If Sequoia Capital can get fooled - presumably after more due diligence and apparent access to books than you could possibly have gotten while dealing with the charitable arm of FTX FF that was itself almost certainly in the dark - then there is no reasonable way you could have known. We do not live in the world of Hieronym's To the Stars, or the locally better-known world of dath ilan partially inspired by it; we don't have AIs reporting to every seller on every buyer's moral probity before they sell anything, and frankly it's not clear we'd want to live in a world where that was true.

Miami sold FTX naming rights on a sports arena. Miami has now terminated that agreement; maybe they'll give some of the money back; I sure bet they won't give all of it back. If you have any feeling deep inside like philanthropy is not a real trade - well, how real are the naming rights on a sports arena? I'd think selling a crypto company a $135M right to name a stadium, is more the sort of thing where all that money ought to be given back to defrauded customers, if you say some transactions aren't real enough. The reason some people don't immediately think that way, I'd say, is because they think people in the charitable sector all ought to be martyrs; and that is a way of thinking that I dislike. Miami would laugh at you, if you told them they should give all the money back to make FTX's customers whole. People go after the philanthropic service providers because such Good and self-sacrificing people are easier targets who might take that criticism seriously - not because it makes more sense on a moral level that sellers of utilons should come in line for harsh personal sacrifice, ahead of the sellers of stadium naming rights losing a chunk of revenue.

Does it potentially make EA look slightly better, if you visibly and nobly sacrifice everything and undergo great hardship? Possibly! But that is not necessarily your individual responsibility, to optimize EA's image at personal cost; unless you can afford to make it so, and choose to make it so. It's not even necessarily the cheapest way to buy that benefit. It's fine to have sold utilons for money, and say that you already traded those utilons for that money, and now EA needs to take care of its general image separately from that.


There's only one really strong argument I've heard for why Good people who traded indirectly with FTX should hurt themselves more, sacrifice more, in repairing this - and it's that FTX claimed to be trying to accumulate money in order to do good, and so now Lawful Good itself should try to deny them that benefit of ill-gotten gains, to discourage the next person who considers betraying Lawfulness in the name of Goodness, from the hope that they can purchase greater-goods that way.

And that's... actually a completely valid consideration, in my own view! Logical decision theory isn't actually something that works among humans, but you should almost always end up doing what LDT would say to do anyways.

But if you've got not a whole lot of money and it would cost you a lot of hardship, maybe leave that consideration to somebody else. Maybe the next EA billionaire can figure out exactly how much FTX Future Fund spent, add interest, and pay that much back to FTX's victims. It's not something you have to do alone, or at all, if you're not best-placed to do it. Be kind to yourself, ask what a friend would tell you, when you judge how much hardship it would be to repay; because I'm worried I know too many people who are going to be way too harsh about that. If you're thinking that somebody on another continent is poorer than you - well, they are, and they're also poorer than a lot of FTX's customers, and you can donate to GiveDirectly if you think that's best for Earth; mostly I'd say that you should maybe just be a little more Lawful Neutral about the whole thing. You're a utility company, is all, and you sold FTXFF some utility.

I'm actually a bit skeptical that this will have been done in the name of Good, in the end. It didn't actually work out for the Good, and I really think a lot of us would have called that in advance.

My current guess is more that it'll turn out Alameda made a Big Mistake in mid-2022. And instead of letting Alameda go under and keeping the valuable FTX exchange as a source of philanthropic money, there was an impulse to pride, to not lose the appearance of invincibility, to not be so embarrassed and humbled, to not swallow the loss, to proceed within the less painful illusion of normality and hope the reckoning could be put off forever. It's a thing that happens a whole lot in finance! And not with utilitarians either! So we don't need to attribute causality here to a cold utilitarian calculation that it was better to gamble literally everything, because to keep growing Alameda+FTX was so much more valuable to Earth than keeping FTX at a slower growth rate. It seems to me to appear on the surface of things, if the best-current-guess stories I'm hearing are true, that FTX blew up in a way classically associated with financial orgs being emotional and selfish, not with doing first-order naive utilitarian calculations.

And if that's what was going on there, even if somebody at some point claimed that the coverup was being done in the name of Good, I don't really think it's all that much of Good's fault there - Good would really not have told you that would turn out well for Good in even naive first-order thinking, if Pride was not making the real decision there. Nor need Lawful Good reimburse it, unless Lawful Good can afford to be really incredibly scrupulous about that.

But what actually happened with FTX, we don't know yet, and I'm not a prediction market that you should take my speculations seriously.


I say all this - well, first of all, because I think it's valid, and because I worry that people are going to be saying the opposite, that everyone must give all money back right now, as would make EA seem least vulnerable to imagined criticism (and some real criticism, to be fair, though the real criticism cares much less about what you actually do and much more about what they just wildly-guess / satisfyingly-imagine you're doing). I worry that people who hold the contrary viewpoint, that it's fine to just Lawful Neutrally trade utilons for money and undercharge to the extent you're Good, will feel inhibited and afraid from presenting this viewpoint if they hold it; and when that happens, in EA, I often do suspect that nobody else will dare to speak the contrary viewpoint, if not me.

But also I'm saying this, because I suspect / wildly guess that the pressure to sacrifice everything, to not reserve yourself for yourself, to always give and never trade, is possibly entangled with a kind of falling too far into utilitarianism; that in some but not all people there is a common mental motion between giving up yourself and your own needs and hurting yourself, and giving up the rules that should have held you back from hurting others. I worry that in the Peter Singer brand of altruism - that met with the Lesswrong Sequences which had a lot more Nozick libertarianism in it, and gave birth to effective altruism somewhere in the middle - there is too much Good and not enough Law; that it falls out of second-order rule utilitarianism and into first-order subjective utilitarianism; and rule utilitarianism is what you ought to practice unless you are a god. I worry there's a common mental motion (in some-not-all people) between hearing that they ought to ruin their suit to save a drowning child, instead of taking an extra 15 seconds to pull off the suit first; and people thinking that maybe they ought to go rob a bank, to save some children. But that part, about the psychology of some people who are not me, I'm a lot less sure of.

I do think that EA needs more of the spirit of people who say: "Yeah, you pay me, and I'll get the job done that I said I'll do, and then it's my money and I'll use it to make myself happy; and that I'm a good person shows in how I'm undercharging for this and maybe went and accumulated professional expertise in a field where I'd be underpaid, but I still own myself and what I'm paid is still mine." I wish that I had more of that myself, and aspire to it, even, so I feel a bit of a hypocrite in saying it, but still.

But that more abstract issue can maybe be debated later; it is not as urgent. For now, I wanted to express a possibly-contrarian opinion about what people, some of whom may be in some hardship about it, are (not) morally obligated to do.
Source (Archive)
 
fuck u tradcaths:
Literally diseased whore talking ethics because imaginary AI daddies will make it all better.

Also I would totally vote for Roko's Basilisk for President. Because the penalties if you didn't would be bad.
 