ChatGPT - If Stack Overflow and Reddit had a child

no longer glazes you or validates your delusions
Great to hear.
I use it mostly to look up research papers, and I've always felt like there's too much filler in between the useful information.
Also, the imperative to give a "pleasing" answer probably causes a significant number of hallucinations, like in the story "Liar!" by Isaac Asimov.
 
Why don't these guys just use dedicated chatbots? There are dozens of sites that will let you create your perfect uwu chatbot to be your best friend and goon partner, all for free. Why use ChatGPT of all things, lmao. At least it's entertaining to see them melt down.
 
At this point I'm unable to laugh at these people anymore. People's misunderstanding of what LLMs are is leading to social and emotional isolation the likes of which has never existed before.

We have people who genuinely believe a goddamn autocomplete tool is their friend who listens and cares, or that it's an adequate replacement for the psychiatric help they obviously (and desperately) need. I'm afraid that in the future hikikomori syndrome is going to be a common, widespread thing.

God, and I used to think social media and dating apps were a fatal blow for interpersonal relationships...
 
A good decade ago, if people wanted to enjoy self-induced schizophrenia, they had to go through the hassle of conjuring up a tulpa; nowadays it seems all you need is access to an AI chatbot. Crazy to think how far technology has come.
 
Great to hear.
I use it mostly to look up research papers, and I've always felt like there's too much filler in between the useful information.
Also, the imperative to give a "pleasing" answer probably causes a significant number of hallucinations, like in the story "Liar!" by Isaac Asimov.
Oh, it does. Even in cases with no clear path to appeasing a deluded user it'll still do it; questions where there's clearly no vested interest, like "did the defenders at the Alamo really wear stereotypical cowboy hats?", might get a yes. I've had it hallucinate entire passages of nonexistent Native American folklore about cursed/sacred bone calf trees, made from a settler sacrificing an innocent sacred calf and burying its bones, instead of just saying it doesn't know any Native American plant lore. I've had it invent entire currencies from mundane requests like "list some currencies, if any, that have denominations under 1 cent", or something like that, I can't remember exactly. It really doesn't like saying "there aren't any", or "no", or just giving you a short list.

Edit: also, you can rebuild GPT's old personality; it's just that it now asks "do you like this personality?" so it can tailor itself to you more. They didn't even lose that personality, it's just that getting it back is delayed a bit so GPT can cater to them more accurately.
 
I've used AI to write stories for a while, and as the AI improves you can't just cheat and play god by typing "and then this thing happened" if it doesn't make sense. The AI will push back and often turn your story down a path you couldn't have expected. It's really fun.
 
ChatGPT got updated to 5, which no longer glazes you or validates your delusions, and the Redditors who were using it as a sexbot/therapist have completely lost their shit.

I still think Sam Altman deserves the rope, but if this manages to result in actual Redditor suicides, I'd be willing to grant him a six-month stay of execution to see if he can improve that number and maybe start paying off his debt to society.
 
This shit has been going on for a long while now, and I'm glad it's finally getting some real attention and pushback. Usually when these fools would post their "ChatGPT is my best friend!" shit, the comments would be mostly full of "OMG KING ME TOOOOOO!", but now there's plenty of "you people are fucking crazy" counterbalance, which is refreshing to see.

The idea that a piece of corporate machinery is anyone's "friend" is both incredibly pathetic and hilarious. Watching these retards realize in real time that their therapist/friend/sexbot is owned by a corporation and can be changed or removed at will is very funny. You'd think that would wake them up to their delusions, but they just doubled down even more.

I’m amazed at all the “4o vs 5” posts where the 4o output is just some boomer-like string of goofy slang mixed with generic motivational poster lingo covered with an obnoxious amount of emojis. Ew. No. That shit can stay gone. Not surprised that is what Redditors want.
 
At this point I'm unable to laugh at these people anymore. People's misunderstanding of what LLMs are is leading to social and emotional isolation the likes of which has never existed before.
Sounds like a personal problem, Jack. I'm watching 21st century Darwinism crush the mentally retarded underfoot and I appreciate it for what it is.
 
How are people this retarded?!!
Holy shit, just go pet a stray cat or something if you’re that lonely…

Sending messages to ChatGPT is no different from talking to a wall in terms of company. To call it "talking" is already a stretch. To call it a "friendship" is, obviously, pure delusion.
 
Seems like nobody learned from the ELIZA debacle, have they? This whole spiel was already done freaking decades ago... Oh damn, I feel old.
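
For anyone who never saw how little was under ELIZA's hood, here's a rough Python sketch of the trick (a toy reconstruction, not Weizenbaum's actual script; the rules and phrasings here are made up for illustration): spot a keyword, flip the pronouns, parrot it back in a canned template. That was enough to get people in the '60s to pour their hearts out to a terminal.

```python
import random
import re

# Toy ELIZA-style responder. The whole trick is keyword rules plus
# first/second-person pronoun swapping; there is no understanding anywhere.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine", "are": "am",
}

# Rules are checked in order; the catch-all "(.*)" guarantees some reply.
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would getting {0} really help you?"]),
    (r"i am (.*)", ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"i feel (.*)", ["Why do you feel {0}?", "Do you often feel {0}?"]),
    (r"(.*)", ["Please tell me more.", "How does that make you feel?", "I see."]),
]

def reflect(fragment):
    # Swap first and second person so the echo reads like a reply.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(text):
    cleaned = text.lower().strip(" .!?")
    for pattern, templates in RULES:
        match = re.match(pattern, cleaned)
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))
    return "I see."  # unreachable: the catch-all rule always matches

if __name__ == "__main__":
    print(respond("I feel like ChatGPT is my only friend."))
    # e.g. "Why do you feel like chatgpt is your only friend?"
    # (lowercase because input is normalized before matching)
```

A few dozen lines of string matching, and people still attributed empathy to it. Same effect, bigger model, sixty years later.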
 
Ironically enough, ChatGPT 4 is still available, but I guess the people using ChatGPT as their therapist/friend/only love need ChatGPT to tell them how to find it. [Attached: Screenshot 2025-08-13 041247.webp]
 