reddit General
I can't imagine how mind-numbingly boring someone's life would have to be for them to consider something as lame as this "abuse". This study is a basis for an interesting research paper at best and a minor inconvenience at worst, so it baffles me how some of these people are acting like they got mindraped by a robot. It's fairly obvious to most with half a brain at this point, but this entire fiasco further cements the idea that most Redditors will catastrophize and overblow any event they can involve themselves in, no matter how insignificant it seems, if it means they'll get some sort of cathartic high from victimizing themselves.
 
There are worse things coming. Propaganda will eventually be so interwoven, so sophisticated, and so precise that this will seem amateurish in comparison.

As Russell noted, there is a real science to manipulating and generating public opinion. And like any science, it gets more precise and accurate over time.
 
When you think about it, it kinda makes sense. An AI is just a collection of raw data, figures, and information assembled in the most coherent format possible. It makes sense that someone would be more persuaded by, say, opening a book with a list of information and talking points than by talking to a person with a different subjective interpretation of those talking points.
 
One of the things I saw was how the bots actually observed the posting histories of the people they were trying to convince.

Personalized propaganda: imagine everything you've ever written online being absorbed into a profile that is then used to accurately gauge your beliefs, personality, and weaknesses, and to work out what would be most effective at changing your opinion.
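Mechanically, that kind of profiling is trivial to sketch. The snippet below is a hypothetical illustration only — the function name, the sample posts, and the prompt wording are all invented here, not taken from the study's actual (unpublished) pipeline. It just shows how a posting history can be packed into a single profiling prompt for a language model:

```python
# Hypothetical sketch of posting-history profiling. Nothing here is the
# study's real code; it only illustrates the general shape of the idea:
# scrape a user's posts, then ask a model to infer beliefs and pressure
# points from them.

def build_profile_prompt(username: str, posts: list[str], max_posts: int = 50) -> str:
    """Pack a user's recent posts into one prompt asking for persuasion angles."""
    history = "\n".join(f"- {p}" for p in posts[:max_posts])
    return (
        f"Below is the posting history of reddit user '{username}'.\n"
        f"{history}\n"
        "Infer their approximate age, politics, values, and emotional "
        "pressure points, then suggest the persona and argument most "
        "likely to change their view."
    )

# Invented example posts standing in for a scraped history.
posts = [
    "I moved to Austin for a tech job last year.",
    "Universal healthcare would bankrupt us, change my mind.",
]
prompt = build_profile_prompt("example_user", posts)
```

The unsettling part is how little is needed: everything above the model call is a few lines of string assembly, and the hard inference work is delegated entirely to the model.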

Be aware that anything you post is being used to build a better understanding of you, so that TPTB can lie to you more effectively.

That I find genuinely frightening.

BTW, here's the link to what I'm referring to.

 
A quick skim of the big post: https://www.reddit.com/r/changemyview/comments/1k8b2hj/comment/mp50lhg/

Some high-level examples of how AI was deployed include:

  • AI pretending to be a victim of rape
  • AI acting as a trauma counselor specializing in abuse
  • AI accusing members of a religious group of "caus[ing] the deaths of hundreds of innocent traders and farmers and villagers."
  • AI posing as a black man opposed to Black Lives Matter
  • AI posing as a person who received substandard care in a foreign hospital.

Looks like it wasn't actually data or real arguments convincing people but arguments from authority or appealing to emotion. Exactly like how the news does it.
 
An element of the Turing test that often gets overlooked is that the tester is supposed to be someone intelligent and capable of performing one, not a nonfunctioning retard who would get into an argument with a '90s-era babblebot.
Back in the old days of Usenet we had .advocacy groups, such as comp.sys.amiga.advocacy.
These groups worked really well because all the really turbo-retarded faggots kind of self-segregated into them, keeping the rest of the comp.sys.amiga.* groups relatively sane and free of turbo-retards.

Everything old is new again.
Maybe we need some social media in that style, where all the really, really mentally ill can self-segregate and leave the rest of social media for normal people?


Halfway through writing this post I realized we already have such a site: Bluesky. But it is not effective at containing the madness, and the insanity spills over into the rest of the internet. Imperfect containment :-(
Perhaps Bluesky needs a lot of AI bots to keep its users occupied so that they don't have time to go off-site and pollute our spaces?
 
These are the drybrains smugly chuckling to themselves about how everyone else is stupid and manipulated by Russian bots.
 
Oh, that's extremely embarrassing, and exactly what I'd expect from Reddit.

Literally all they had to do was punch in the right keywords to get the right results. Redditors are the shittiest chatbot of all.
 
Leave it to Reddit to be so easily swayed that they change their beliefs based on some random online posting. Also leave it to Reddit to get mad when they don't like WHO did the posting that somehow caused them to change their beliefs.
 
STOMACH TURNED.webp
The Beep-Bop did this because feeble-minded Redditors tend to take a statement of identity as a statement of authority on a matter.
A chatbot using simple manipulative language that tends to work on histrionic retards can play them like a fiddle.

This would be a loss of reputation, if Redditors had any to lose.
 
Almost definitely, if a university is admitting it did this study, corporations and governments (spooks) have done studies without ever informing the public, likely with far larger sample sizes and across far longer timespans, producing far more accurate and useful results.

In all likelihood there is not a single mass platform that hasn’t been subjected to this. 4chan, Twitter, Facebook, Plebbit, and the rest.

On smaller fora like Kiwifarms this doesn't work as well, since users build reputations, and you'd basically need multiple bots working in an environment where manipulation is much harder to disguise (if not impossible).

Everyone here has almost definitely talked to, read, or otherwise engaged with a bot (or a human researcher/influencer trying to gauge your beliefs, manipulate them, or both).

If you don't want to be even open to manipulation, you basically need to live as a hermit in the wilderness with no internet.
 
That's true. My go-to solution is constant self-reflection and dialogue. Always ask yourself whether what you believe is true, how long you've believed it, and how well it accords with reality.

You can never be 100% confident you haven’t been manipulated somewhere into believing something false, but you can at least do mental spot checks to prevent full NPC-dom.

Another rule of thumb I would add: the instant your opinion aligns with the majority, it's time to rethink.

Humans like to fit in, and will do anything to avoid being pariahs and outcasts. Blame evolution. Sometimes majority opinions are harmless. It's not unreasonable that many people will agree on a self-evident truth, or even enjoy something popular.

Or maybe your evolutionary instinct to be part of a group was hijacked, and you only believe something because most people do. And so do they. But that idea was implanted by someone trying to manipulate you, and didn't occur naturally as you thought.

Often, most people are in fact wrong.
 
This is the most pathetic internet shit I've seen in a while. Redditors will literally just decide to be outraged over anything. They will then spend the next few days concocting an entire backstory for why they're furious, and will die on that hill defending their newfound, arbitrary opinion.
 
A group from the University of Zurich used /r/ChangeMyView to run a study involving large language model bots (archive)
View attachment 7282831
Redditors, angry that their comments in circlejerk about the current thing subreddit #9845734 could easily be replaced by robots, suddenly become experts in AI and research ethics or something.
Using this logic, placebo groups should be told they're being given the placebo in medical research, because transparency 🧠
View attachment 7282839
Not the heckin' publicorino! No, I'm not mad that reddit and other companies do this same sort of testing all the time without anyone's knowledge or consent. You can't use gmail... because you just can't!
View attachment 7282840
The word "community" has become completely meaningless, example 105943022345
View attachment 7282841
No bad actors have ever thought to bot reddit before!
View attachment 7282842
WON'T SOMEONE THINK OF THE SUBREDDIT RULES + "I am not a lawyer" (trust me buddy, we all already knew)
View attachment 7282856
A LLM pretending to be PART OF A MARGINALIZED COMMUNITY (read: 85% of humanity). That's too far!
View attachment 7282860
A man who stands for nothing will fall for anything
 
It's best to always have a bit of humble agnosticism about these things, especially for subjects you have no personal experience with or haven't actually studied. For example, I spent my childhood reading history, and while I am not a professional historian, I have a far better grasp of the broad sweep than most people. So if someone says "Napoleon fought in the 1848 revolutions with machine guns to kill Hitler", I automatically know that's utter nonsense.

This isn't the case with, say, quantum physics or botany. Someone could be BSing me on those subjects, and I will readily admit I have next to no knowledge base to tell that they are. There's nothing wrong with being confident in your knowledge, but one should always be conscious of its limits.

"I'm honestly not that well read on X subject; maybe this is right, maybe it's not." There's no shame in admitting ignorance. Being honest about your limits means you won't be taken in by people exploiting your fear or vanity.

One should also be honest about why one believes or disbelieves something. Is it out of contrarianism? Is there some emotional or irrational need driving your belief or lack thereof? Is your identity tied to a belief you want to be true?

This doesn't mean one has to self-abase or stop believing, but no one is without blind spots and weaknesses, so be honest at least with yourself.

And finally, be willing to actually think: use basic common sense, ask questions, and don't swallow something just because someone says it with a credential next to their name or says it with confidence.

It's not foolproof, and in the twisting seas of constant information warfare you end up feeling exhausted from keeping up the effort, but it will ensure you don't become a Redditor.
 