Programming thread

I don't wanna bother catching every quote I saw in the last few pages, but: x86 is perfectly fine to learn from; it's odd that people speak against it at all. I took C++ 1 and HTML/CSS my first semester, and x86 along with C++ 2 & DSA my second. Since then, I've read Game Boy code with no issue, and I can read Cheat Engine's generated asm output for executables (granted, that one is still x86) with little to no trouble.

hell, I don't know why people make such a big deal about learning one OS's functionality, and fetishize abstraction so much.
Oh wow, I only know how to page memory in Windows thanks to VirtualAlloc... oh wait, I can instantly search for the mmap equivalent on Linux (yes, even before LLMs forced their way into search engines). And at what cost do we have these abstractions in high-level languages such as C/C++? The more I read through Operating Systems: Three Easy Pieces, and continue to learn the Win32 API, the more I see that C's suite of language features was the beginning of the end for what people complain about in Java/Rust/Python/etc. now. Yeah, x86 could do with some convenience features to cut down on how verbose you have to be, but people literally state that you can 'return' multiple variables via a struct (which is itself just the compiler shuffling registers and stack for you) when x86 LITERALLY lets you """return""" as many values as you can through as many available registers you have at your disposal. No one seems to know what programming even is anymore. I honestly find Python devs who make impressive stuff in their language more respectable than C/C++ devs (and I primarily code in C++). Hell, I remember seeing early C code from 20+ years ago that looked more like assembly than what exists now.

/schizo (somewhat) TL;DR: Your learning path should start with either C++ or Java due to how much info there is out there for starter to intermediate projects, and because you should get used to writing modular code (not Clean Code™, modular). Then learn x86, since it's pretty straightforward. ANY language after that is a complete cakewalk (anyone who hypes COBOL/FORTRAN as uber-hard languages was never gonna make it). I have never looked at NAND2TETRIS (I plan to eventually), but learn x86 and you will easily understand any other asm. You genuinely shouldn't get confused by any implementation of asm code once you've learnt at least ONE (and the same goes for compiled languages if you learn C++/Java).

Also, I love vim/neovim hotkeys/modes and whatnot, but stick to Visual Studio (NOT CODE, 2022). Anyone serious needs to learn to properly debug programs - PRINT DEBUGGING IS A CRUTCH - and until Ryan Fleury is done with his debugger (and assuming it's any good), Visual Studio has the most visually intuitive debugger out there. Also, learn how to properly load libraries from a .bat file; or better yet, from the LoadLibrary function, or dlopen if you're on Linux.
 
x86 LITERALLY lets you """return""" as many values as you can through as many available registers you have at your disposal
"And brother, let me tell you about the STACK POINTER..."

I disagree with you about debugging, though. Print debugging is almost entirely adequate, and the few times it isn't, gdb is as effective as anything MS released, provided you spent the time to learn it. These are, naturally, my opinions, but given that you grasp the nature of programming well: you can program your own solution to whatever your debugging-output need is. The canonical example of "debugging via programming" is developing tests. I haven't written a bug that trivial testing wouldn't have prevented in years. You like your MS tools; I'm not disrespecting them, but I don't use them. I don't like being beholden to software I can't just hop into and resolve whatever aggravation I've encountered. The thing about programming is that you can program your own tools, which can make debugging a shallow problem. On the other hand, you can't use MS tooling to debug the Linux kernel, which is to say that any debug tooling has its limits.
 
"And brother, let me tell you about the STACK POINTER..."

I disagree with you about debugging, though. Print debugging is almost entirely adequate...
I won't rag on you since you're being overly nice to me (I wouldn't call myself anything other than intermediate at the student level when it comes to programming), but I think that debugging is absolutely pivotal to programming education (which I think is what I was responding to?). Getting used to the fact that you are working with memory at every point in time is absolutely necessary to free yourself from the shackles that ARE modern programming abstraction. Hell, whether it's new or malloc, I think we have gone too abstract when it comes to programming education. Sure, std::vector is kinda convenient, but tuples, dictionaries etc literally are abstractions that patch over features that already EXIST -- struct is already a multidimensional array, why do you need a dict at all?

Plus, the more I get into math-y stuff like graphics programming, with random crashes when I move my first-person camera to a certain angle, the more I appreciate being able to learn in under 10 seconds at exactly what intervals the tangent function has borked my program, especially thanks to the absurdly easy-to-use stack tracing (which is a godsend).

Also, I wouldn't say I love my MS software, just that msvc (#pragma comment(lib, ...), for instance, which I'm pretty sure gcc doesn't support) and the VS debugger provide great entry-level tools. And while I prefer the VS debugger in this case, I don't fault anyone for using a command-line debugger; it's just that a new student will find that needlessly daunting (interacting directly with memory is also daunting, but necessary). I vastly prefer vim/neovim as a text editor, and I love feeling like le ebic haxxor editing code in the console, but I just need my debugger.
 
I'm talking the big boy LLMs that would be like ChatGPT if it wasn't hypercensored and could be modified, provided you have a rig with sufficient GPU power
Get yourself 4 3080s (from China; you can get 20 GB models for about $400 apiece) and some decent DDR (96 GB minimum) with a higher-core-count Ryzen. With that you could run GPT-OSS 120b and most of the open-source models. You could cobble something together for less than $3k… although, uh, hopefully power is cheap.

I’m not sure if there’s an uncensored version of GPT-OSS but give it time and someone will release a model.
 
Get yourself 4 3080s (from China; you can get 20 GB models for about $400 apiece) and some decent DDR (96 GB minimum) with a higher-core-count Ryzen. With that you could run GPT-OSS 120b and most of the open-source models. You could cobble something together for less than $3k… although, uh, hopefully power is cheap.
One problem is the lack of PCIe lanes on consumer Ryzen. Obviously you can go Threadripper or Epyc.
 
One problem is the lack of PCIe lanes on consumer Ryzen. Obviously you can go Threadripper or Epyc.
Ah damnit you’re right. I forgot about that in the calculation. With bifurcation this could work?

  • AMD Ryzen 9 5950X CPU – $580
  • ASUS Pro WS X570‑Ace motherboard – $350
  • NVIDIA GeForce RTX 3080 20 GB GPU × 4 – $2,200 (4 × $550)
  • 128 GB DDR4‑3600 RAM (4 × 32 GB kits) – $320
  • 1500 W 80 PLUS Gold power supply – cheapest ~$400
If you already had some old hardware lying around (like I do) you could get a cheap jury-rigged box for ~$4k.

With this build, one of the GPUs would be bottlenecked with half the lanes.
 
>just drop 5k to jury-rig an AI box

For what purpose?
I don’t know what Jotch would use it for but I have a very specific use case (personally the only thing I believe LLMs are almost necessary for).
Think 1600s OCR’d Latin books -> multiple rounds of varying literal/thought-for-thought translation -> English.

I’ve done this to great success with one book (600 pg.); it's very readable, and upon asking experts it’s pretty close. The only issue is that the models I have to use must fit within 20 GB of VRAM, and that can mean a loss of context, which leads to a lot of iterations.

In summary, machine translation on crack.
 
Good hardware suggestion! My biggest concern is the availability of high quality LLM models, but maybe GPT-OSS is good and open.
Also I'm concerned with the availability of software that can run the thing.

Have Tensor cores found their way into the markets? And does any software make use of them?

What software do you suggest using to run any given model on local hardware on linux?

>just drop 5k to jury-rig an AI box

For what purpose?
to sneed
 
Have Tensor cores found their way into the markets? And does any software make use of them?
I believe PyTorch explicitly makes use of them but take that with a grain of salt. Any GPUs I’ve seen that boast anything TPU related have seemed ridiculously expensive (new car or GPU type shit) but LLMs don’t require them if you are ok with waiting a little bit longer for tokens to gen.

What software do you suggest using to run any given model on local hardware on linux?
Ollama with a bigger context window is all I use; I like to keep it simple. I run it on my Arch (btw) box when doing translation passes.

EDIT: forgot to respond to the first query.
> asking about software availability and is it any good

I would recommend installing Ollama on any device with a dedicated GPU and download the “entry-level” models to see just how usable they are. There is a project called “fabric” which allows you to pipe from CLI to the LLM, it’s very useful once you get the hang of it.
With a 3080 Ti and 64 GB of DDR you can run qwen3:30b, and that model is great imo.

Just get your feet wet and don’t fret if it sometimes does a stupid thing; usually it’s because you exceeded its context window and it got lost (if you give it large files it will puke).
 
struct is already a multidimensional array, why do you need a dict at all?
Why do you need a runtime value-to-value association container when there is one that does symbol-to-value at compile time?
Also what do you mean by struct being a multidimensional array?

I guess tuples in C++ suck, since they are just structs with unnamed members, and structs support structured bindings anyway.
In other languages they are convenient sometimes and have their uses.
 
/schizo (somewhat) TL;DR: Your learning path should start with either C++ or Java due to how much info there is out there for starter to intermediate projects,
Java in particular seems to be a lingua franca for a lot of texts on data structures and algorithms, as well as software architecture.
Also, I love vim/neovim hotkeys/modes and whatnot, but stick to Visual Studio (NOT CODE, 2022).
Sorry, but I'm going to keep using Neovim.
I disagree with you about debugging, though. Print debugging is almost entirely adequate, and the few times it isn't, gdb is as effective as anything MS released, provided you spent the time to learn it.
print() can be effective, and it's a habit of mine, but it's a bad habit. It's most effective when your programming language has the means to print most data structures legibly. Even then, there's a chance you'll have to debug code that isn't yours but was installed system-wide. Do you want to edit that code with root privileges (and then undo your edits later), or use the debugger? And when it comes to something like C, you pretty much have to use the debugger if you're not a masochist.

protip: Python has an advanced drop-in replacement debugger called pdbpp
 
Print debugging is almost entirely adequate,
For local debugging yes. When you need to diagnose a problem on a server you don't even have access to and have to ask some guy in another company to do the deployment on your behalf.... (I have sent my resignation just now, still an employee until the end of the year, no idea what, if anything, my boss replied to it with.)
 
When you need to diagnose a problem on a server you don't even have access to and have to ask some guy in another company to do the deployment on your behalf....
this is completely evil and no debugging technique will help here
except for the real man's debugger: tracing through the (paper) code listing in your head
if you can't see the program counter dancing around in your head, you're just not built for it

this technique synergizes beautifully with black-box debugging, where you use gut feelings, tea leaves, astrology, and dartboards to figure out the exact nature of the bug when the previous method stops being practical
 
abstraction
literally are abstractions that patch over features that already EXIST
Literally everything is arguably an abstraction of a sort. Hell, all of mathematics is an abstraction of natural properties n shit.

Abstracting, zooming out conceptually, composing smaller interoperable components/systems into larger, more complex components/systems, etc. is literally how everything around you works. This is what students need to really understand.
You can keep conceptually zooming out seemingly forever, and that's what scientific research does, in essence.

I'll end with saying something that would start a pretty fiery semantic debate in a room of CS profs: a Turing machine is an abstraction over lambda calculus (and the natural properties it describes/models).
 
I'll end with saying something that would start a pretty fiery semantic debate in a room of CS profs: a Turing machine is an abstraction over lambda calculus (and the natural properties it describes/models).
you can do lambda calculus as an abstraction over Turing machines, but whenever you do this it is called a "Scheme implementation" or an "ML compiler"
 
this is completely evil and no debugging technique will help here
except for the real man's debugger: tracing through the (paper) code listing in your head
if you can't see the program counter dancing around in your head, you're just not built for it

this technique synergizes beautifully with black-box debugging, where you use gut feelings, tea leaves, astrology, and dartboards to figure out the exact nature of the bug when the previous method stops being practical
It's a public company. They literally aren't allowed to give us any more access. And we only started doing deployments quite late into the project because we couldn't get any access earlier. So now the project is about half a year overdue already.

We should've started testing much earlier and in a much more aggressive manner. But noooooo, gotta work through the bureaucracy first.

I'll end with saying something that would start a pretty fiery semantic debate in a room of CS profs: a Turing machine is an abstraction over lambda calculus (and the natural properties it describes/models).
Neither is an abstraction of the other. They are equivalent in terms of what they can do, but neither is a representation of the other.
 
Neither is an abstraction of the other. They are equivalent in terms of what they can do, but neither is a representation of the other.
you can represent one using the other fairly easily, if you just treat this lump of lambdas as a number and this other lump of lambdas as some other number on the tape...
they are both actually abstractions of each other, and that's a Good Thing™ (but you don't have to treat them that way if you don't want to)
 
Foremost: lambda calculus is the calculus of function application (a "calculus" in the sense of a formal system of rewrite rules, not the how-shit-changes-over-time kind).

you can represent one using the other fairly easily, if you just treat this lump of lambdas as a number and this other lump of lambdas as some other number on the tape...
they are both actually abstractions of each other, and that's a Good Thing™ (but you don't have to treat them that way if you don't want to)
The way I see it, the Turing "tape" setup feels more like a description of how the mathematical/natural concepts can be applied in practice (like with machines and registers and shit). Church's lambda calculus feels closer to a direct model of whatever natural universe properties make all this shit a thing. I'm a lil sleepy rn, so please forgive my lack of eloquence here lol.
 