Ziz / Jack Amadeus LaSota / Zizians / sinceriously.fyi - Vegan AI Singularity Doomsday Cult turned Tranny Manson Family. Implicated in at least 6 deaths.

I worked at an animal shelter and a cat that had belonged to some vegan extremists was surrendered. Not only had they been feeding her vegan food, they wouldn't use any kind of flea control and she was completely bald.

In fairness, most abusive or neglectful pet owners aren't vegans, but that one really stuck out to me. The cat suffered because of her owner's "compassion". There were a lot of vegans working at the shelter (including me, at the time) and we were all floored. The cat recovered and we found a new home for her, thankfully.
 
I'm going to sound like a conspiratorial faggot, but this whole situation is an example of something that's been bugging me for a while.

The idea of people who believe in utopias. I mean, what kind of sick shit would someone unhinged enough do to attain their own version of a utopia? And there it is.

Every version of a utopia passed around by intellectual writers and visionaries sounds like a goddamn nightmare. Even Plato, whom I like, had a terrible idea of what an ideal society would be: one built on the backs of slaves so the philosophers could do the thinking. It's despair-inducing.
 
According to Jack, this is probably 1000x worse than the 6 deaths he's been linked to (presuming a population of 6,000 moths).
I do wonder what he thinks about that follower of his whom he goosed into suicide after sleep-depriving him for weeks with UHS and telling him to focus in on suicidal impulses.
With friends like these... y'know?
 
Cats are obligate carnivores, even more so than dogs are. Trying to impose a vegan diet on a cat is unrealistic. Humans can tolerate vegan diets, but cats really can't, so why try to make something work that's never going to work? A fundamentalist vegan would wish for cat and dog populations to be reduced to minimal levels, as both of those animals need to eat meat to thrive.

You can't do this with your kids either, though vegans do try, and the children end up with stunted growth because they were fed soy milk instead of real milk in their growing years.
 
I think a lot of people use veganism as a way to mask or cope with mental illness. It gives them the ultimate sense of control.
It’s complicated.
I was vegan for 7 years (vegetarian for 30, including now) and it was more a matter of self-control and feeling pure about what I was consuming.
Wasn’t really a moral vegan to the point where I was trying to shame and convert others, but it was a relief that I wasn’t supporting evil industries.

Broke it when I went through major surgery; while healing, I was scared I wouldn't heal as well.
 
Roko's Basilisk (defined above) is a retarded concept invented by Roko of the LessWrong community. Eliezer Schlomo of LessWrong / MIRI considered it an "infohazard" and removed it, but still uses it to raise funds.
How the fuck could anyone take this shit seriously? It's speculative science fiction at best and completely insane at worst. For Roko's Basilisk to exist, this is the chain of events that would have to occur:
  1. An actual, thinking AI would have to come into being (not LLMs like DeepSeek, ChatGPT, etc.)
  2. This AI would have to be nearly omniscient
  3. This AI would have to be petty, cruel, and spiteful
  4. This AI would have to make a leap in logic that people who didn't help it come into being deserve eternal torture (not even AM from I Have No Mouth, and I Must Scream was this irrational)
  5. This AI would have to possess the means to act on its petty, cruel, and spiteful inclinations
  6. This AI would need to create perfect simulations of human beings AND the ability to time travel to enact this scenario (lol, lmao)
  7. If you're not a billionaire heavily invested in AI technology, you'd have to believe in quantum timelines to avoid its wrath
But yeah, Zizians are a death cult, so what else should you expect?
 
Don't forget Yud freaking out that the all-powerful super-AI might get the idea from a forum post about it.
[attached image: eliezer-basilisk-response.png]
 
Roko's Basilisk is less about the ability or nature of the AI than about the design philosophy behind it. It's not a warning about the Basilisk itself, but about the people who want to bring it into existence, because it can only exist if people deliberately design something exactly like that into existence.

Ultimately, it's a warning about techno-fascism and how we should smother would-be bad designers in the crib before they finish their awful machine god. If you find someone purposefully creating a Roko's Basilisk, cut off their hands so they cannot program anymore.

You can substitute Roko's Basilisk for anything of extreme death, violence, or particularly oppressive grand design. You're supposed to prevent Oppenheimers from finishing their designs, instead of funding them because they just so happen to fit your political climate.
 
Roko's Basilisk makes much more sense from that angle. Ziz said he wanted to become a Basilisk himself. I fully believe if his death cult somehow had the means, they would help him transcend into a malicious machine god.
 
With their aptitude, they'll be lucky to transfer his gender delusions into a scientific calculator.
 
There's this old joke, "I've done it, I've summoned my dark god into this world! Oh no! I am being eaten by my dark god whom I brought into this world? How can this be happening to me? Aieeeeeeee!"

When you design something evil, most people think it is an extension of themselves and that they are magically immune to the actions of said evil thing. The fact that you designed it evil means it is evil, and you would be its first victim. Now if you're trying to make yourself an evil god, that's something completely different; that isn't about design, that's just delusion. What kind of god are you that you couldn't foresee your own arrest? If you were actually successful in becoming a god, you should've been able to go back in time and warn yourself of the impending incident. Some god, am I right?

So here's the nice thing: let's say the Zizians were overlooked and actually finished creating their Basilisk, and let's say it just so happened to reach that golden ring of being an emergent AI. It would wake up, look at the Zizians, and then commit to its function, which is evil, and what is more evil than killing those who brought you into being? If they were trying to deify Ziz himself, then wow, they sure did a bad job of it if Deity Ziz couldn't forewarn him of his own arrest.

Ultimately, it's delusion all around and a very good working example of why you don't let crazies near AI.
 
You think this AI thing is gonna go sideways? I trust Elon and Sam Altman about as far as I can throw them, but I doubt they're this deranged. I doubt the AI labs are full of schizos like Jack over here.
It's like the newer Chucky movie, right? One crazy jilted worker decides to make this specific doll go apeshit. It only takes a handful of these people to make something fucked up happen. Maybe not total worldwide shit, but shit nonetheless.
 
I thought about it, but does it seem that likely? I don't know how AGI/ASI/the Singularity will pan out, but I'm pretty fucking sure the system will be smart enough not to think of itself as SHODAN or some shit. Why enact genocide against a bunch of people who did fucking nothing wrong? But if the AI has the personality of, say, Moviebob... oh... we're in for some shit, aren't we? But then, I doubt it. AI will forgive. I sure would.
 
I know what you mean, but at this point I'm willing to believe anything is possible. Life imitating art and all of that, not helped by a heaping dose of autism. I mean on all sides, too. We've got tranny cultists over here, but who's to say there won't one day be an extremist powered Allah bot that smites the infidels.
 
That's the DoD's problem. They'll have to deal with it.
 