Programming thread

"C For Dummies" copyright 2004.

It's a huge book with almost 800 pages...but about 60% in you're learning to write a simple program to translate English into Pig Latin. As normalfags would say, big oof.
This is actually not a bad project for teaching the fundamentals of C. String handling is always a trial by fire, and Pig Latin requires you to do some basic parsing, which is also kind of clunky in C.
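A toy sketch of the kind of string fiddling it takes (my own throwaway version, nothing to do with the book's code):

#include <stdio.h>
#include <string.h>

/* Naive Pig Latin for a single lowercase word: move the leading
   consonant cluster to the end and append "ay"; vowel-initial
   words just get "way". */
static void pig_latin(const char *word, char *out, size_t outsz)
{
    size_t split = strcspn(word, "aeiou"); /* index of the first vowel */
    if (split == 0)
        snprintf(out, outsz, "%sway", word);
    else
        snprintf(out, outsz, "%s%.*say", word + split, (int)split, word);
}

int main(void)
{
    char buf[64];
    pig_latin("string", buf, sizeof buf);
    printf("%s\n", buf); /* prints "ingstray" */
    return 0;
}

And even this ignores capitalization, punctuation, and splitting a sentence into words, which is where the real C pain starts.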

Writing a bona fide big-ass program in C is hard, which is why almost no one does it; mostly you learn C to write very simple (but fast) libraries that higher-level languages use.

Also, copyright 2004 means it's actually a relatively modern book by C standards. The book most people learn C from (the second edition of K&R) was published in 1988.
 
This reminds me of an article I read that was primarily about cha bu duo, but they touch on the importance of face.

I haven't worked with many chinese thus far. What was that like?
Imagine working with someone so autistic and so mumbly that you can’t even hear their ridiculous accent and you can’t see their slanty eyes because they avoid eye contact. Also they’re incompetent and won’t shut up about machine learning.
 
Oh boy, our regularly scheduled "first programming language" post.

You shouldn't read "C for Dummies" because
1) C is an old and finicky language that doesn't make things easy for you
2) "C for Dummies" is (probably) terribly written and outdated

If you like video games, I assume you want to do something visual and interactive. JavaScript is used to program web pages, so you might find it more interesting than command-line programs (which are all you will learn to write from a C book).

Programming is not about learning a programming language: the best way I can describe it is breaking down your goal into individual components that work together, and breaking down those components into well-defined step-by-step processes.
I agree with your final paragraph. But I disagree with what you said leading up to that. C is the basis for pretty much all modern software including newer languages. You might not get a job programming in C. But understanding how C works will teach you a lot of low level details about how computers in general work. If someone is serious about starting a career in programming I think learning C89 is the right way to go.

The language has been around for decades for a reason.
 
This reminds me of an article I read that was primarily about cha bu duo, but they touch on the importance of face.

I haven't worked with many chinese thus far. What was that like?
Depends on the Chinese. I worked with a guy from there who was born in the 70s and he had a very western attitude towards work. The younger generation is very concerned about face and social status though, and most mainlander Chinese you meet in the US these days aren't hardy commies who got out to try and make something of themselves but the children of government or corporate officials deep in the grift who are looking for western jobs for prestige, espionage, and/or a way to get their family out of the country if another Han happening goes down.

The rest of Asia has this face culture too, although to a lesser extent. Taiwan, Japan, and Korea have spent enough time doing international business and having to compete with western industry that they've started trying to mitigate some of the worst excesses of face-saving culture. In my experience, the liberal democratic Asians will eventually admit that they're wrong and/or ignorant if you ask them a couple of times and make it clear that you're not out to cut off their balls for being uninformed.

Western-born Asians are pretty much just westerners with slanty eyes. A few of them do have the face-saving culture but it's incredibly rare and I think most of them lose it due to western schooling.
 
The language has been around for decades for a reason.
That reason is high cost of replacement and backwards compatibility. "Close to the hardware" is a meme honestly - the hardware that C is close to is a fucking PDP-11. C makes vectorization a pain in the ass, says nothing about pipelines and out-of-order execution, and until not so long ago didn't even consider parallelism. Meanwhile hardware manufacturers bend over backwards for C, because they know that a new architecture isn't worth anything if it can't deal with fifty megatons of legacy crap. Those fifty megatons are the real reason to learn C.

Edit: I forgot another fun one: Almost all extant CPUs automatically check if any arithmetic operation overflowed, but C doesn't expose that. In fact, C provides basically nothing for overflow checking at all even though this is super important for a language without bignums. At least that will be fixed in C2x.
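For anyone curious: GCC and Clang already expose the hardware check as a builtin, and C23 (né C2x) standardizes the same idea as ckd_add() and friends in <stdckdint.h>. A minimal sketch with the builtin:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    int a = INT_MAX, b = 1, sum;
    /* Returns true if a + b overflowed an int. */
    if (__builtin_add_overflow(a, b, &sum))
        puts("overflowed");
    else
        printf("%d\n", sum);
    return 0;
}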
 
That reason is high cost of replacement and backwards compatibility. "Close to the hardware" is a meme honestly - the hardware that C is close to is a fucking PDP-11. C makes vectorization a pain in the ass, says nothing about pipelines and out-of-order execution, and until not so long ago didn't even consider parallelism. Meanwhile hardware manufacturers bend over backwards for C, because they know that a new architecture isn't worth anything if it can't deal with fifty megatons of legacy crap. Those fifty megatons are the real reason to learn C.
Same thing with x86 assembly tbh. The actual way a modern x86 CPU runs has almost nothing to do with the instruction set. If we weren't cucked by proprietary shitware we could abandon backwards compat and actually embrace the glorious RISC revolution. But no, we need a million layers of obfuscation with unfortunate security implications so an i9 can emulate BCD instructions from the 70s in microcode.
 
I agree with your final paragraph. But I disagree with what you said leading up to that. C is the basis for pretty much all modern software including newer languages. You might not get a job programming in C. But understanding how C works will teach you a lot of low level details about how computers in general work. If someone is serious about starting a career in programming I think learning C89 is the right way to go.

The language has been around for decades for a reason.
I said I don't think C is a good first language (and also that "for Dummies" books suck). I have written in C for a long time and I enjoy it.

What "low level details" does C teach you? 99% of the C anyone will learn today will never touch paging, calling conventions, the difference between system calls and function calls, interrupt handling, registers, the integer value of pointers (the fuckery the GCC does with the value of pointers will make your head spin), instructing encoding, and memory layout. Sure, C isn't Haskell, but it's still fairly high-level.

I think learning C is good even if you don't use it regularly (the same goes for almost all programming languages) because it teaches you to think about what memory allocations you are making, and also how to think explicitly in references. (All programmers should have some grasp of influential programming languages like C, FORTRAN, Lisp, Standard ML, some dialect of Assembly, Smalltalk, and even Java.)
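Even something as dumb as copying a string makes the allocation and the ownership handoff explicit. A trivial sketch:

#include <stdlib.h>
#include <string.h>

/* The heap allocation is visible, and ownership is part of the
   contract: the caller gets a pointer it must eventually free(). */
char *copy_string(const char *s)
{
    size_t n = strlen(s) + 1;
    char *copy = malloc(n);
    if (copy != NULL)
        memcpy(copy, s, n);
    return copy;
}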
 
That reason is high cost of replacement and backwards compatibility. "Close to the hardware" is a meme honestly - the hardware that C is close to is a fucking PDP-11. C makes vectorization a pain in the ass, says nothing about pipelines and out-of-order execution, and until not so long ago didn't even consider parallelism. Meanwhile hardware manufacturers bend over backwards for C, because they know that a new architecture isn't worth anything if it can't deal with fifty megatons of legacy crap. Those fifty megatons are the real reason to learn C.

Edit: I forgot another fun one: Almost all extant CPUs automatically check if any arithmetic operation overflowed, but C doesn't expose that. In fact, C provides basically nothing for overflow checking at all even though this is super important for a language without bignums. At least that will be fixed in C2x.
I can't disagree with this statement enough. C is an incredibly simple language to learn and understand. Sure, some of its sharp edges, like double pointers, might be complicated for someone who started programming in Java. But the rules of the language are simple.

You're mocking the PDP-11, but in the grand scheme of things it's not that different from the hardware we still run today. x86 is still CISC-based and we still have to write compilers targeting x86. Speaking of compilers, C probably has more compilers than any other language out there. So while C might not be a perfect hardware abstraction, it clearly does a better job than the competition. The firmware of many CPUs and microcontrollers is written in C, so I find your claim absurd to the nth degree.

#include <limits.h>
It's not that complicated.

Also, the point about C being backward compatible with hardware and software from the PDP-11 onward basically proves my point that C is the go-to language for beginners learning computer science. I could have said C++, but I know that it would filter 80% of CS graduates today.

I said I don't think C is a good first language (and also that "for Dummies" books suck). I have written in C for a long time and I enjoy it.

What "low level details" does C teach you? 99% of the C anyone will learn today will never touch paging, calling conventions, the difference between system calls and function calls, interrupt handling, registers, the integer value of pointers (the fuckery the GCC does with the value of pointers will make your head spin), instructing encoding, and memory layout. Sure, C isn't Haskell, but it's still fairly high-level.

I think learning C is good even if you don't use it regularly (the same goes for almost all programming languages) because it teaches you to think about what memory allocations you are making, and also how to think explicitly in references. (All programmers should have some grasp of influential programming languages like C, FORTRAN, Lisp, Standard ML, some dialect of Assembly, Smalltalk, and even Java.)
No denying that the "For Dummies" books are terrible. I understand where people are coming from when they say C is not a good starting point, but honestly, if you can get to the point where you understand C really well, everything else becomes easy. The hardest part of the language is probably pointers, and even the basic concept is pretty simple; it's just the implications that using pointers brings.

Maybe you have a different first-language recommendation in mind, but all too often I hear people recommend an interpreted language like Python, a higher-level language like Java, or one of the nu-tech languages like Rust. Python, or any other interpreted language, is significantly more abstracted than systems languages. It's OK for teaching algorithms and whatnot, but as a language for real projects it has too many problems. It doesn't teach people about type systems (Python is sometimes called strongly but dynamically typed) or memory management, and the design of the language encourages people to organize their code in a way that makes it hard to start larger projects. And running into failures is one of the fastest ways to learn.

Java (and also C# and several other languages) is a better choice because it at least teaches people early on the consequences of not thinking through their project design before starting. But these languages are also lacking: they have a garbage collector, and they have a ton of vendor lock-in, which keeps people running code on the same systems and teaches them to write non-portable code because "it works great on Windows." Mixing and matching language and operating system is possible with C++ or C, of course. Syscalls are a thing, and I remember making my Visual Studio projects change the terminal text to green with a syscall when I was learning. But I got out of that habit because I learned it's not portable at all. In general, programming inside a sandbox like those languages can be useful in, say, the enterprise. But I don't think they are a great way of teaching CS theory, because "Java already has a UI library, so there's no reason to teach the theory, only the API." The fact that these sandboxes also attract pajeet coders is also a drawback in my experience.

I'm not even going to start on the nu-tech languages like Rust, Golang, or any of the other language abominations. As far as I can tell, real innovation in programming languages stopped in the late '90s/early 2000s, and now the "advancements" in this field just exist to enrich committees and offer a vector of attack for the woke commissars.
 
I think learning C is good even if you don't use it regularly (the same goes for almost all programming languages) because it teaches you to think about what memory allocations you are making, and also how to think explicitly in references.
This is the most important thing. I still see people get tripped up by the N+1 problem all the time (fetch a list with one query, then fire off one more query per row instead of doing a single join).
 
I can't disagree with this statement enough. C is an incredibly simple language to learn and understand. Sure, some of its sharp edges, like double pointers, might be complicated for someone who started programming in Java. But the rules of the language are simple.

You're mocking the PDP-11, but in the grand scheme of things it's not that different from the hardware we still run today. x86 is still CISC-based and we still have to write compilers targeting x86. Speaking of compilers, C probably has more compilers than any other language out there. So while C might not be a perfect hardware abstraction, it clearly does a better job than the competition. The firmware of many CPUs and microcontrollers is written in C, so I find your claim absurd to the nth degree.

#include <limits.h>
It's not that complicated.

Also, the point about C being backward compatible with hardware and software from the PDP-11 onward basically proves my point that C is the go-to language for beginners learning computer science. I could have said C++, but I know that it would filter 80% of CS graduates today.
I think you're confusing the "obvious" implementation of C that exists in your head (which is indeed simple) with the actual language standard and its actual implementations (which are anything but). I wouldn't call strict aliasing, (long*)((int*)ptr) being illegal, null pointer checks getting "optimized" out, or for(int i = 0; i < i+1; ++i); getting compiled into an infinite loop on GCC 5+ simple. Even if it was simple to define, that doesn't mean it's simple to use - look at any esolang.
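For the unconvinced, a minimal sketch of the null-check case: the dereference happens first, so the compiler is allowed to assume p is non-null and quietly delete the check.

int read_value(int *p)
{
    int v = *p;    /* UB if p is NULL, so the compiler assumes it isn't... */
    if (p == NULL) /* ...and GCC/Clang may remove this branch entirely */
        return -1;
    return v;
}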

If "CISC and needs compiler" is enough to make an architecture almost a PDP-11, the Lisp machines were almost PDP-11s, so I don't think that's a very useful point of view. The popularity of C doesn't say much about its merits either, or Javascript and PHP would have to be amazing languages. Like I said, C gets propped up because there is so much C around, not because C is so great.

limits.h is not too complicated, but like everything in C it is needlessly error-prone, and the compiler has to specifically recognize the check pattern to compile it efficiently down to the hardware facilities, despite C supposedly being close to the hardware. The portable overflow check for signed multiplication is also hellish, though the recent switch to mandatory two's complement may have simplified it somewhat. You'd have to explain why backwards compatibility makes a language good for beginners; I don't see how that follows at all.
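For reference, the pre-C23 "portable" pattern for signed addition (multiplication needs a pile of extra case analysis on top of this):

#include <limits.h>

/* Everything must be tested before the operation, because the
   overflowing addition itself would be undefined behavior. */
int add_checked(int a, int b, int *result)
{
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b))
        return 0; /* would overflow */
    *result = a + b;
    return 1;
}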
 
C doesn't have ideology. It's not some platonic form of the ideal computer or a perfected systems language or what have you. It exists because the guys at Bell Labs wanted to make Unix easy to port.

The entire schtick of the language is that it's sufficiently high-level that you don't want to blow your brains out when using it, but simplistic enough that bootstrapping a compiler for it from scratch on a new architecture is easy. There are no baroque calling conventions you need to implement (languages of the era typically supported call by value, call by reference, and fucking call by name, ffs), the built-in types are simple, the standard library is spartan, a lot of the harder problems of linkage and dependency management are foisted on the programmer rather than the implementation, and it generally avoids a lot of the stupid shit that every language designer was shoving into languages back then.

Back in the day when we had a million different ISAs competing with each other, this was a great strength. I'd argue that since we've moved to a world where we have maybe 5 different major ISAs tops, the advantages of C aren't really significant anymore. Now that doesn't mean C is going anywhere but it's indicative that we should not necessarily be favoring it as a first choice for new projects.

EDIT:
Speaking of C, rms is apparently working on an equivalent to one of those massive O'Reilly books for GNU C specifically (the kind of book where it's both a tutorial and language reference). Rejoice my fellow cniles:

[screenshot attachment]
 
There already may be a discussion about this but I'm not fixing to look through 143 pages worth to find it.

Are there any sources or ways to start learning RTOSs? I know routers and other proprietary devices run real-time operating systems and I'd like to dive into that.
 
Anyone got project ideas or know a place for doing projects?
Currently learning Python and just started with JS, but I wanna do something with them other than making clocks and calculators.
 
Anyone got project ideas or know a place for doing projects?
Currently learning Python and just started with JS, but I wanna do something with them other than making clocks and calculators.
Discord bots are a fun way to learn Node. If you don't have any ideas, consider making a music bot that streams audio to a voice channel using ytdl.
 
There already may be a discussion about this but I'm not fixing to look through 143 pages worth to find it.

Are there any sources or ways to start learning RTOSs? I know routers and other proprietary devices run real-time operating systems and I'd like to dive into that.
I'm an advocate of learning by doing, so I'd say just find a device that can run Contiki and start writing shit for it, using the wiki to guide you. Although if you don't have a background in operating systems/systems programming already, you should probably start there and work your way up to an RTOS.
 