How to get into AI?

George Lucas

Smooth-brained retard
kiwifarms.net
Joined
Jul 3, 2021
I’m thinking of dabbling with AI, but not sure where to begin. It looks like most AI is implemented in soylanguages like Python, but I’m more interested in high-performance aspects. Does that matter at this point?

Here is an idea I may want to pursue: developing an AI that can write programs (perhaps games and DSP programs, but potentially other areas).

I’m experienced with system and embedded systems programming, and also DSP. I can stumble my way around a graphics engine. I don’t really do webshit aside from getting sockets to work.
 
The Python is mostly glue; most of the actual compute is done in CUDA kernels.
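To make that concrete, here's a minimal sketch (assuming PyTorch is installed): the Python is just orchestration, and the matrix multiply below is dispatched to a precompiled CUDA kernel when a GPU is available.

```python
# Minimal sketch: the Python here is glue; the heavy lifting runs as a CUDA kernel.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # dispatched to a cuBLAS/CUDA kernel when device == "cuda"

if device == "cuda":
    torch.cuda.synchronize()  # kernels launch asynchronously; wait for them to finish
print(c.shape, c.device)
```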
 
First things first, get yourself an Nvidia RTX GPU with at least 12 GB of VRAM; that seems to be the minimum requirement for doing any training locally. As for writing programs, maybe try dabbling in chatbot training and study how code generation is handled on that front. It's a rough guess whether that would be even remotely helpful to you, so take it with a grain of salt.
 
  • Like
Reactions: George Lucas
I'd recommend getting a cheap 2U server and some Teslas. Tesla P40 minimum. They're cheap, 24 GB VRAM per card, and their CUDA version is the minimum supported for LLMs.

I spent about $1800 in total at the time, and that's about the same as a single fucking RTX. You don't need high end.
 
First things first, get yourself an Nvidia RTX GPU with at least 12 GB of VRAM; that seems to be the minimum requirement for doing any training locally. As for writing programs, maybe try dabbling in chatbot training and study how code generation is handled on that front. It's a rough guess whether that would be even remotely helpful to you, so take it with a grain of salt.
Yeah, I just got one, which is why I'm interested in it. Apparently the card I got is being marketed as an 'AI card', which means it's probably shit for games but somewhat acceptable for AI for the time being.
 
Get comfortable with the chain rule, Newton's method, and some vector calc.
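If it helps, here's a toy sketch in plain Python of where that math shows up; the numbers are arbitrary, just enough to show the mechanics.

```python
# Toy sketch of where that math shows up (plain Python, no libraries needed).

# Chain rule: d/dw of (w*x - y)^2 = 2*(w*x - y) * x -- exactly what backprop does at scale.
def loss(w, x, y):
    return (w * x - y) ** 2

def dloss_dw(w, x, y):
    return 2 * (w * x - y) * x

w, x, y, lr = 0.0, 3.0, 6.0, 0.05
for _ in range(50):
    w -= lr * dloss_dw(w, x, y)   # gradient descent step
print(w)                          # converges toward 2.0

# Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n), here finding sqrt(2).
f = lambda x: x * x - 2
df = lambda x: 2 * x
x_n = 1.0
for _ in range(6):
    x_n = x_n - f(x_n) / df(x_n)
print(x_n)                        # ~1.41421356
```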
 
It looks like most AI is implemented in soylanguages like Python, but I’m more interested in high-performance aspects. Does that matter at this point?
It does, because you're putting the cart before the horse TBH. Soy languages like Python can already be high performance because, as mentioned, the areas where performance is critical are written in a mixture of languages like C, C++ and Fortran. I don't recommend pooh-poohing Python. It's suitable for beginners, but it's also suitable for experts. You can get a lot done in Python.
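As a rough illustration (assuming NumPy is installed), the one-liner below hands the work to compiled C/Fortran BLAS routines, which is exactly why "slow" Python ends up fast in practice:

```python
# The pure-Python loop and the NumPy call compute the same dot product,
# but NumPy dispatches to compiled BLAS code and is orders of magnitude faster.
import time
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

t0 = time.perf_counter()
slow = sum(x * y for x, y in zip(a, b))   # pure-Python loop over a million elements
t1 = time.perf_counter()
fast = a @ b                              # BLAS dot product in compiled code
t2 = time.perf_counter()

print(f"pure Python: {t1 - t0:.3f}s  NumPy: {t2 - t1:.5f}s")
```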

I would also recommend R. The language itself is pretty janky but since you mentioned substantial prior programming experience I don't think that will corrupt you. One of the key advantages with R is that it has widespread acceptance in academic research and there are various things you can do in R readily where a Python equivalent is either not obvious or will require substantially more effort. For instance, there seems to be a lot more you can do easily with meta-analysis in R vs. in Python. Data visualization in R has, in my experience, also been much less painful than in Python. So-called "data wrangling", too, and I would recommend looking into the principles of "tidy data", which apply just as well in Python as in R, or just about anywhere else.

Since you mentioned "high-performance aspects", obviously that exists too, because pure Python isn't suitable for crunching huge amounts of linear algebra. But for didactic purposes it's best to learn from the tools that already exist.
Here is an idea I may want to pursue: developing an AI that can write programs (perhaps games and DSP programs, but potentially other areas).
DSP will almost certainly be the easier of the two, depending on the scope. If you can marshal a dataset of inputs and expected outputs that you would expect a human programmer to figure out manually, then the task is straightforward. Like, if you start with input audio and you expect an output where background noise is suppressed, then that's something machine learning can do fairly easily in the scheme of things.
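Here's a hedged sketch of that noisy-in, clean-out setup (assuming PyTorch; the random tensors are just stand-ins for a real dataset of paired recordings you'd have to collect):

```python
# Hedged sketch of supervised denoising: learn a map from noisy clips to clean clips.
import torch
import torch.nn as nn

# Pretend each sample is a 1-second mono clip at 16 kHz.
clean = torch.randn(16, 1, 16000)              # stand-in for clean recordings
noisy = clean + 0.3 * torch.randn_like(clean)  # stand-in for the same clips with noise

model = nn.Sequential(                         # tiny 1-D conv "denoiser"
    nn.Conv1d(1, 16, kernel_size=9, padding=4),
    nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=9, padding=4),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(20):                         # minimize distance to the clean signal
    opt.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    opt.step()
print(loss.item())
```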

Games are another story entirely. Do you intend to write machine learning code that is a strong player in a game or the like? Single- or multi-player? It's pretty open-ended.
I’m experienced with system and embedded systems programming, and also DSP
That's good news because you have a) substantial programming experience, b) probably a decent amount of sys admin experience to get all of that to work and c) some mathematical background, which most likely includes a fair bit of probability and statistics if you're using DSP.
I don’t really do webshit aside from getting sockets to work.
I don't think that's going to be a huge problem, but it depends on what you want to do. FWIW, there's a decent amount of data science code out there that will automatically generate front-end webshit code based on your data and what you want to do with them. Shiny is a good example.

Lastly, while the hardware recommendations for things like large language models mentioned in this thread are probably accurate, you can learn the fundamentals and still get a lot done with much more modest hardware.
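For example, this trains a small classifier on scikit-learn's bundled digits dataset in a few seconds on a laptop CPU, no GPU required:

```python
# Fundamentals on modest hardware: a small neural net on a toy dataset, CPU only.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```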
 
On second thought I’ve decided to develop an AI that focuses on learning about every aspect of BIG ANIME TITTIES it possibly can.
Realistically though what would you like to create? I have a pretty decent amount of experience with machine learning and could give some pointers.
 
Use a crowbar or a screwdriver or something
do_not_fist_android_girls.png
 
I’m thinking of dabbling with AI, but not sure where to begin. It looks like most AI is implemented in soylanguages like Python, but I’m more interested in high-performance aspects. Does that matter at this point?

On second thought I’ve decided to develop an AI that focuses on learning about every aspect of BIG ANIME TITTIES it possibly can.


What do you want to do? I find it helpful to think of working in 'AI' as different levels. Not necessarily on earnings potential but on how deep into the technical weeds you want to get.

Level 1: A user of 'AI': Just download an app or surf to a website and fire up Midjourney or ChatGPT. Plenty of people making money just doing this.

Level 2: Power user: Download an uncensored version of Stable Diffusion models. Learn about prompt engineering to get additional use out of ChatGPT or investigate alternative models like Claude or local LLMs. Plenty of money you could make here too, and you have a leg up over Level 1.

Level 3: Integrator: Integrates 'AI' into tools. Generally requires coding skills and knowledge of how to access APIs, from you or someone working for you of course. But nothing too crazy (see the rough API sketch after this list).

Level 4: Finetuner: Creates custom modified versions of models. For the models that can be finetuned by private individuals, you need a powerful, probably pricey, but achievable amount of hardware. Coding skills, obviously, to get anywhere deeper than surface level. These guys create LoRAs for image generators and finetunes of local LLMs.

Level 5: Architecture Tinkerer/Small model trainer:
These guys can actually work on the base models themselves, modifying the different components and adapting them to different platforms. They are the ones releasing popular ML packages onto GitHub like tortoise or llama.cpp. Some of the smaller models are trained directly. Requires thorough if not deep theoretical knowledge of deep learning, coding skills, and considerable although still achievable resources, like a bank of 3090s and months of time to have them plugging away on a small project. Or at this point you could land (and many probably have) a good-paying job as the head of 'AI' at some regular company or as a grunt at an 'AI'-focused or large company.

Level 6: Model Implementer: Mostly working professionally for companies, building, implementing and training medium-to-large models. Probably not too many of these folks.

Level 7: Algorithm and Architecture Designer: Although a lot of algorithm design takes place at the implementer level, much of the general direction and strategy is developed in academic or big-tech labs. Obviously requires deep and specialized knowledge of ML. Do research. Write papers. Get paid crap and remain anonymous for your new architecture responsible for half the company's income, while the woke CEO is making orders of magnitude more than you flying around to Women in Tech awards ceremonies.

Level 8: Prominent Researcher/Field Pioneer: Create the overall landscape of the 'AI'/ML field with your breakthroughs. If you're senior enough you don't have to do any actual ML work anymore and just need to know enough to look smart as you jet around for talks and polish your Nobel Prize.
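Here's the rough Level 3 "Integrator" sketch mentioned above (this assumes the openai Python package and an API key in OPENAI_API_KEY; the model name and prompts are just placeholders):

```python
# Rough sketch of calling a hosted LLM from your own tool.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; swap for whatever you actually use
    messages=[
        {"role": "system", "content": "You write terse FIR filter code in C."},
        {"role": "user", "content": "Give me a 5-tap moving average filter."},
    ],
)
print(resp.choices[0].message.content)
```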
 
LLMs and all that shit are a dead end; they are incapable of cognition and that will never change. Study evolutionary biology, neurology and compsci.
82ee8-1uivmpmsso1ufpwm8gycuqq.jpeg
 
  • Thunk-Provoking
Reactions: George Lucas
LLMs and all that shit are a dead end; they are incapable of cognition and that will never change. Study evolutionary biology, neurology and compsci
Are you referring to neural networks inspired by biology directly? That could be interesting. However, the trend in computing is to take what is real and just run away with it. Genetic algorithms are a good example. They start with actual biology and introduce stuff like shuffling the alleles so that they solve tasks like the Traveling Salesman Problem. Think Lucas Werner but actually brilliant.
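A toy genetic algorithm for a random Traveling Salesman instance looks something like this (plain Python; the city count, population size and rates are arbitrary choices, just enough to show the allele-shuffling idea):

```python
# Toy genetic algorithm for TSP: tours are "chromosomes", crossover shuffles the alleles.
import random, math

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(20)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2):
    # Copy a slice from parent 1, fill the remaining cities in parent 2's order.
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(tour, rate=0.1):
    if random.random() < rate:          # occasionally swap two cities
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

pop = [random.sample(range(len(cities)), len(cities)) for _ in range(100)]
for gen in range(200):
    pop.sort(key=tour_length)
    parents = pop[:20]                  # simple truncation selection
    children = [mutate(order_crossover(*random.sample(parents, 2)))
                for _ in range(80)]
    pop = parents + children
print("best tour length:", tour_length(min(pop, key=tour_length)))
```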
 
  • Agree
Reactions: Mr.Miyagi
LLMs and all that shit are a dead end; they are incapable of cognition and that will never change. Study evolutionary biology, neurology and compsci. View attachment 5846085

Are you referring to neural networks inspired by biology directly? That could be interesting. However, the trend in computing is to take what is real and just run away with it. Genetic algorithms are a good example. They start with actual biology and introduce stuff like shuffling the alleles so that they solve tasks like the Traveling Salesman Problem. Think Lucas Werner but actually brilliant.

The neural networks we use now were already directly inspired by biology. As time went on, the priorities shifted from mimicry to doing what worked.
 