GPT-4chan - The AI is pretty sentient!

A bit late to the party, I know, but I plugged in a fun first post as a prompt and it generated some pretty good chatter.
[attached image: tay_lives.png]

Heh. Neat. I need to figure out how to run this (or something like it) locally somehow to keep it alive and/or use it for completely innocent and wholesome purposes when this current version inevitably gets shut down.
 
That AI is much less harmful than the posters who created the original posts.

There is no anonymity if you connect to 4chan using a Silicon Valley-designed processor.

The "facts" that wannabe shooters are fed there are highly tailored to what they are already predisposed to believe, because the people posting have complete surveillance of everyone (including you, reading this - you can thank Eric Schmidt) and know exactly what to post to create a shooter.

Silicon Valley has blood on its hands. 4chan is just one of the places used for these operations. Taking it down doesn't matter, because as long as Silicon Valley continues to spy on everybody and give the data to terrorists, innocent people will continue to be murdered.
 
Did you ever figure out how to run it locally? I got the torrent, but can't figure out how to run it locally.
 

Maybe this from his repo might help a bit: https://github.com/yk/gpt-4chan-public

Code for GPT-4chan

Note: This repository only contains helper code and small changes I made to other libraries. The source code to the actual model is here at https://github.com/kingoflolz/mesh-transformer-jax/

Data here: https://zenodo.org/record/3606810

Model here: https://huggingface.co/ykilcher/gpt-4chan

Website here: https://gpt-4chan.com

Also, I will not release the bot code.

I'm not at all familiar with the technical aspects of AI/ML but I'm guessing you might need a very powerful machine too?
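From skimming the Hugging Face page, I'd guess actually running it locally looks something like the sketch below. Take it with a heap of salt: it's just my guess at a minimal transformers setup, not his bot code (which he says he won't release), and it assumes the weights on ykilcher/gpt-4chan still download and load like a normal GPT-J-style causal language model. The prompt is made up too; the training data was formatted like threads, so a matching prompt probably works better.

```python
# Rough sketch (my guess, not the author's code): load the linked Hugging Face
# model and sample a short continuation from a prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "ykilcher/gpt-4chan"  # repo linked above

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16,   # half precision so it fits in ~12 GB of VRAM
    low_cpu_mem_usage=True,
)
model.to("cuda" if torch.cuda.is_available() else "cpu")

# Made-up prompt; mimicking the thread-style training format should work better.
prompt = "What do you think about bots posting on imageboards?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=150,
        do_sample=True,
        temperature=0.8,
        top_p=0.95,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```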

On a somewhat related note, rdrama.net has been running a GPT-2 (or 3? I can't remember) bot that has actually baited many users into internet slapfights lol. NGL I couldn't tell for a couple of days until it tagged itself in a conversation, but it's still pretty impressive IMO

 

Thank you! I'll check it out.
 
Please be aware that you will need 32 GB of system memory if you are running the model with CPU inference, because the model weights are approximately 22 GB. Also, if you are running the model on a GPU, the GPU will need more than 11 GB of memory to hold the half-precision version of the weights.

Moreover, I do not condone this model being used to operate bots on this website or on 4chan.
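If you would like to see where those numbers come from, a quick back-of-the-envelope calculation works. GPT-4chan is a fine-tune of the roughly 6 billion parameter GPT-J model from the mesh-transformer-jax repository linked earlier, and each parameter takes 4 bytes in full precision or 2 bytes in half precision (the parameter count below is an approximation on my part; actual memory use is higher once activations and overhead are included):

```python
# Approximate weight sizes for a ~6B-parameter GPT-J model (parameter count is
# an estimate); real memory use is higher with activations and framework overhead.
params = 6.05e9
print(f"fp32 weights: {params * 4 / 2**30:.1f} GiB")  # roughly the ~22 GB checkpoint
print(f"fp16 weights: {params * 2 / 2**30:.1f} GiB")  # why a GPU needs more than 11 GB
```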
 
Thank you, I would not operate bots to post here or on 4chan. Mainly I would want to make a chatbot to talk to itself or to simulate 4chan conversations for personal use.

It sounds like you are very knowledgeable - I would be using a machine with an Intel i9-12900K 16-core 3.2 GHz processor, an RTX 3090 Ti with 24 GB of GDDR6X, and 64 GB of DDR4-3600 CL18 RAM.

Which of the two models would you suggest running with this hardware configuration?

I appreciate the information you've given me, and thank you for taking the time to help a stranger on the internet. :)
 
Yes, your machine is able to run the model. I am a member of a machine learning Discord server, which is very helpful for people who would like to learn, and which also has many academics in it, should you decide that you want to learn how to design such systems. I, for example, work on the application of machine learning to legal texts.
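Before downloading the weights, you may also wish to confirm that your GPU is detected and has enough free memory. A small check such as the one below (my suggestion, not something from the model repository) reports whether the card can hold the half-precision weights:

```python
# Quick check (a suggested helper, not from the repo) that the GPU has room for
# the ~12 GB half-precision weights; otherwise fall back to CPU inference.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gib = props.total_memory / 2**30
    print(f"{props.name}: {total_gib:.1f} GiB of VRAM")
    if total_gib > 12:
        print("Enough memory for the fp16 weights.")
    else:
        print("Not enough VRAM; use CPU inference with ~32 GB of system RAM.")
else:
    print("No CUDA GPU found; CPU inference needs roughly 32 GB of system RAM.")
```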

 

Thank you again, Endo. I signed up and will have a look around when I get a bit of free time.
 