I really appreciate this. It sounds like you guys/gals know your stuff. I'm pretty hopeless when it comes to modern tech. Would these programs be available to download on The Pirate Bay and run on a less-than-average PC? I don't mind the wait, my first computer as a young-un was a Commodore and the tape loading never bothered me. Neither did the 30-minute downloads for music files back in the LimeWire + 56kbps internet days.
If I can chuck a question/query into the (search?) box and let it do its thing while I do other tasks, that's fine for me.
The good thing about most AI models is that they're hosted openly, so there's no need to download torrents from TPB or the like. A quick search on HuggingFace for the model you want is enough. You do have to download some hefty files, though; depending on the model size, they can reach several gigs.
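For example, grabbing a model file from HuggingFace usually looks something like this from a terminal. The repo and file names below are placeholders, not a real recommendation; swap in whatever model you actually found:

```shell
# Install the HuggingFace command-line tool (needs Python on your machine)
pip install -U "huggingface_hub[cli]"

# Download a single GGUF file from a model repo into a local folder.
# "SomeUser/SomeModel-7B-GGUF" and the filename are placeholders.
huggingface-cli download SomeUser/SomeModel-7B-GGUF \
    some-model-7b.Q4_K_M.gguf --local-dir ./models
```

You can also just click the download button on the model page in a browser; the CLI is only handier for big files because it can resume interrupted downloads.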
However, self-hosting AI models is not like installing most software. You need two things to use one: a backend and a frontend.
A backend is like a virtual server that reads the LLM file and boots the AI model on your machine. A frontend is the GUI where you interact with the model; think of it like opening Grok's website, but running on your own computer.
A good, beginner-friendly backend is KoboldCpp, which comes prebuilt as a Windows executable, ready to boot up. From there, you can tweak the running model as much as you want.
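Once the backend is up, it also exposes a local API, so you can even script against it instead of using a frontend at all. Here's a rough Python sketch; I'm assuming KoboldCpp's default port (5001) and its standard /api/v1/generate endpoint, so adjust if your setup differs:

```python
import json
import urllib.request

def build_request(prompt, max_length=80, temperature=0.7):
    # Minimal payload for the Kobold generate API; there are many more
    # optional sampler settings, these are just the basics.
    return {
        "prompt": prompt,
        "max_length": max_length,
        "temperature": temperature,
    }

def generate(prompt, url="http://localhost:5001/api/v1/generate"):
    # Send the prompt to the locally running backend and return its text.
    data = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"][0]["text"]

if __name__ == "__main__":
    # Only works while KoboldCpp is running with a model loaded.
    print(generate("Once upon a time"))
```

That's exactly what frontends like SillyTavern do under the hood: they just point at that local URL and dress it up with a chat interface.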
Oobabooga's GUI (yes, that's the name of the interface's creator) works as a frontend for your image and text generators and is a good choice, but it has more options to fiddle with than a space shuttle. SillyTavern's frontend is better suited for beginners, and there are a lot of tutorials floating around that dig deeper into setting everything up.
I was in the same position, but it turns out my shitty laptop with integrated AMD graphics and 8GB of RAM can't handle anything without crashing. You need 16GB of RAM and at least 1GB of VRAM to run LLMs locally at all, so be aware of that.
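If you want a rough idea of what your machine can handle before downloading anything, here's my own back-of-envelope math (not an official formula from any backend): a quantized model takes roughly its parameter count times the bits per weight, divided by 8, in gigabytes, plus some overhead for the context.

```python
def model_memory_gb(params_billions: float, bits_per_weight: int = 4,
                    overhead_gb: float = 1.0) -> float:
    # Rough personal estimate of total memory (RAM + VRAM combined) an
    # LLM needs: 1 billion parameters at 8 bits per weight is about 1 GB,
    # plus a fudge factor for the context/KV cache.
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb

# A 7B model at 4-bit quantization: tight but doable on an 8 GB machine
print(model_memory_gb(7, 4))  # 4.5
# The same model at 8-bit already maxes out an 8 GB laptop
print(model_memory_gb(7, 8))  # 8.0
```

So on a weaker PC, stick to smaller models (7B or under) at 4-bit quantization, and expect it to be slow if it's running mostly on CPU.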