Programming thread

I see those meme kids like that one little shit who "dreams in code", and that Indian kid who learned how to use metasploit and gives talks about security now, and it just makes me feel inadequate. If there are kids who are better than me, then why should I feel like I'm good enough to go write software in the real world, y'know?
While I understand the sentiment and often feel the same about my own inadequacy, I came to realize a few years ago that:
  • The programming field is extremely diverse (in the good sense of the word), so you're never going to be a master of all its subfields. By subfields I mean stuff like systems programming, embedded, game development (especially the graphics stuff), machine learning/deep learning, language design and compilers, algorithms. And that's perfectly OK; there's no need to feel ashamed of not knowing some obscure data structure referenced in the Cormen book. Just be honest with yourself and everyone else, yet never stop learning. If you find your job enjoyable, the last part will come naturally.
  • In re: talks and conferences: keep in mind that 90% of talks are utterly basic shit wrapped in a nice presentation. And yes, I'm speaking from personal experience, and from both sides of the stage. Sure, anecdotal, but I know a lot of people with the same outlook, and typically, the more "advanced" the person, the more they try to wiggle out of being sent to conferences because it's wasting their time. Conferences are glorified networking events.
  • Don't fall into the trap of comparing yourself to a bunch of spergs shitting out a Medium post about "blarghing in Python3 is easy" or "frobnicating the UI in the newest Xcode". 90% of the time this is either a bunch of superficial feel-good bullshit and entry-level stuff, or a PR campaign on the part of the author so that they're more googleable.
As a matter of fact, a friend of mine reminded me of a practice used by some job applicants: they spam pull requests to various open-source projects with superficial changes (update some tiny JavaScript package here, bump the copyright year there, fix a typo somewhere else, etc.). It's used as a bragging tool during the hiring process, especially during the HR phase, where quantity trumps quality.

If you're a good programmer, you can debug. Using a debugger, if necessary.
This
 
I see those meme kids like that one little shit who "dreams in code", and that Indian kid who learned how to use metasploit and gives talks about security now and it just makes me feel inadequate.
I mean, every profession has its precocious geniuses. But in a profession the size of software development, you only need to compare yourself to the median programmer, who probably doesn't care that much about programming, let alone dream in code, and just does a regular 9-5 job. Many programmers aren't solving hard problems. They're plumbers, tweaking and hacking features onto huge ball-of-mud codebases that were already like that when they started.

At first I thought you missed that this was an ncurses game, but then I looked on GitHub and someone has actually ported the ncurses library to JS for use on the web!
Oh, I saw. But you can always rewrite it in JS?
 
Just one more quick question for you; why do you consider the two project examples I gave to be of an employable standard? I see those meme kids like that one little shit who "dreams in code",
You mean Santiago Gonzalez?
The media made a big furor about the kid 8 years ago. "Prodigy!" they said. "The Next Steve Jobs!" they said.

Well! Fast-forward 8 years and the guy is, by all accounts, just another professional software developer working in machine learning and genetic algorithms research, studying for a PhD at UT. Sure, that's something for him to be proud of, don't get me wrong! But he's not the "ground-breaking paradigm shaker" that the media was touting him to be almost a decade ago. And as @cecograph hinted at, this "genius" probably counts among his peers an overwhelming majority of people who didn't have such a bombastic entrance to the field. Many of those professional software developers, doing similar things to what he's ended up doing, will have 'origin stories' more similar to yours. Don't count yourself out!
 
Jobs was a genius capitalist, not a genius coder. Of the two Steves that founded Apple, Wozniak was the far more brilliant tech guy.

He was a genius project manager too, particularly demonstrated in developing the NeXT cube and NeXTSTEP OS (later adopted as the default operating system by Apple), even if he wasn't personally capable of doing any of the ground level work.
 
The issue with being self taught is that I don't know where I stand, it's entirely possible that I already am employable but don't know it yet.

Most of the people involved in the hiring process think computers are sentient and that controlling them is some arcane wizardry. If you can demonstrate your wizardry in some way (showmanship over practicality, always), you are employable.
 
I never thought of it that way, I guess I overestimated just how 1337 your typical code monkey is. I obviously have an inferiority complex because I couldn't afford to go to real university, so maybe I actually am at a point where I should be trying to get work. If that's the case then I'm probably wasting my time with the study I am doing at the moment, I've done one and a half years coasting through this networking/hardware/sysadmin 'bachelors', just to get a piece of paper.

As a person who has some experience in hiring, I want to tell you that you would be very surprised how many people apply to programming jobs without any serious skill, to jobs that are way above their skill level, and with a GitHub that consists of 15-line "projects" where half of those lines are nothing but curly braces. I kid you not!

I am a bit frustrated these days, because the guy I was talking about some time ago has now spent two weeks on installing a library and running a code example. Finally, I decided to check for myself whether the library really is that hard to install. True, you need to troubleshoot a little, but getting it running took me about an hour... Of course, I didn't tell the boss about it; I am not a snitch.

I know it's normal that people pretend to work and so on, but it irks me. The funny thing is that a couple of months ago he was complaining about people at his previous company who just pretended to work...

Well, what can I do? I will just focus on my tasks, and hope he will not derail the project further. I am pretty sure I will have to take over his tasks at some point, as has already happened three times before...

I think he should be fired. But then, pandemic and all, I suspect he would not be able to find a new job fast, because before our company he had a two-year gap in his resume.
 
As I mentioned before, I made a quick game in SDL a while back. Just started working on another one in MonoGame (learning C# on the fly). One of my CS courses pretty much entirely consisted of writing programs in SFML, and so far MonoGame seems to be basically that except in C# and with way less boilerplate.
 
I had to learn some Python/C++ and Wolfram for numerical analysis and computational algebra back in the day. I'd really like to jump into all this coding thing, so I've been reading about OOP and data structures, but I find myself unable to really find a project I want to make. Besides: I see what you software/web developers are supposed to know and I feel overwhelmed. I now feel a lot of respect for software engineers.

Any advice? Is it normal to feel this way?
 
Besides: I see what you software/web developers are supposed to know and I feel overwhelmed. I now feel a lot of respect for software engineers.

Any advice? Is it normal to feel this way?

A sense of being overwhelmed via information overload is a guaranteed occurrence in just about any aspect of programming. This is common and expected; there's always an eternity of API jungles, documentation, system manuals, and new language features getting in the way of your next problem solution. Documentation aside, software design itself is a constant struggle to maintain a mental map of where you are and where you need to go (knowing when to expand and when to suppress information), especially as your software grows in size and complexity. These are the challenges that are core to the programming experience; you will get better at managing them with time, but they will always remain challenges.

As for learning, one of the surest ways to become overwhelmed is to skip the fundamentals and move straight into complex abstractions. I heavily recommend ignoring all design paradigms, and the abstractions that come with them, until you have a basic understanding of computing essentials. Programming and math have quite a few things in common, and one of them is this trait: while there are endless labyrinths of deep complexity, the set of core fundamental skills required to navigate any of those labyrinths is actually quite small and manageable. Once you have that set of skills, you can navigate any labyrinth you desire with a bit of effort.

The reason I don't recommend starting with OOP or data structures before learning computing fundamentals is the same reason I wouldn't recommend a book on tensor calculus as the starting point for learning calculus. If you don't learn about limits, derivatives, and antiderivatives (and learn them well), all advanced topics are pointless and will only lead to confusion and a sense of "magic." The same is true if you start with some advanced paradigm like functional programming or object oriented programming without first learning the common language all of these paradigms must internally share.

EDIT: As for project ideas, that one is much more difficult. Keep your initial focus on learning a language + fundamentals; do simple, throwaway projects that are purely about developing basic muscle memory and reinforcing pedagogical theory. At some point, though, you must start working on something that genuinely interests you or it will be very difficult to keep going. There is no easy solution there, your interests and motivations are only known fully to you; you must figure that out on your own.

Here's my recommendation for a programming learning path:

Gain a general understanding of the following principles: basic computer architecture (viz. CPU, main memory, system bus), basic software memory layout (stack / heap, endianness), basic bit representation of fundamental types (two's complement, floating point, ASCII / Unicode), and finally a basic understanding of assembly (learn basic x64 instructions as your base, because these are the most relevant).

Armed with this knowledge, see how you manipulate these via a systems language like C. You must use a systems language that actually exposes lower-level principles clearly, otherwise you will struggle immensely to see how the concepts overlap. I recommend C because it has a nice, sparse syntax and because every other language you use will end up having to interface with it anyway, so learning that root language will unlock all sorts of understanding that is black magic to many of your future peers.

Learn pointers and memory manipulation well; all other languages will try to obscure and complicate the simple, basic operations that C affords. Pointers are not difficult in the slightest to learn, but they will make your life hell if you don't understand them and then start using languages that don't dare utter the word. Malloc is stupidly simple, but inherently tedious and fraught with opportunity for error (a description that applies to the primitives of any discipline, which is why people like to spend so little time with them).

Finally, examine the assembly that your compiler outputs. Use tools like Compiler Explorer, but also learn the "hard" way via something like objdump/gdb so you know the alternative options.

This might seem like it takes "more" time than just learning OOP/FP and immediately getting "results" with a framework, but the amount it will save you over even a few years of programming is immense. The path of least resistance is actually the one I just described: learn the fundamentals well and you can have a lifetime of understanding. Learn to read/write or be forever condemned to memorizing static passages written by others (and continue to think that "objects" are something other than heterogeneous arrays). Besides, with the content I listed I can get you a list of about 3 books that will cover ALL of that, compared to the 200 books and 10 languages and 50 frameworks you'll start and then quit looking for a "better" way to start programming.

Once you've got the fundamentals learned, you are ready to navigate the troubled, deep waters of modern programming. You will be able to pursue whatever paradigms you want, rip apart any APIs at will, and you'll be able to figure out what the hell is going on when problems surface. You won't be beguiled by expensive abstractions wrapped in jargon; you'll be able to decompose any opaque abstraction into a set of manageable primitive operations (the same way you decompose a definite integral into a set of basic algebraic manipulations) when you need to (provided the language isn't obtusely obscured, in which case, find a new language). You'll also understand that even though there are seemingly limitless options in terms of frameworks, APIs, etc., they're all doing the same small set of operations in different, tiresome ways.

One word of warning: don't seek mastery on your first pass through the fundamentals. You'll never find it. Instead, seek general competency and a solid but basic understanding. E.g., don't spend forever memorizing CPU implementations; just know the general principles (CISC / RISC / ARM). If you ever need to actually learn a particular instruction set + model, you can, but don't bother with the information until you need it. What you need is a solid understanding of a small set of fundamentals; then you can specialize when required. You should revisit the fundamental sources whenever possible to refresh your mind about what is going on; seek deeper knowledge of assembly and memory whenever possible. Keep rotating high level and low level practice and your skill will grow immensely. The banal primitives will always remain the most important things to keep in your mind; the good news is that they're the simplest and easiest to remember. Like a musician playing scales, don't forget to warm up regularly.
 
The reason I don't recommend starting with OOP or data structures before learning computing fundamentals is the same reason I wouldn't recommend a book on tensor calculus as the starting point for learning calculus.

I'll take your advice.

This particular comment is on point; my background is in mathematics, so this analogy helps me understand what to expect. I'll go back and start from the beginning.
 
and finally a basic understanding of assembly (learn basic x64 instructions as your base because these are the most relevant). Armed with this knowledge, see how you manipulate these via a systems language like C.
I don't think assembly is worth bothering with for a beginner. You already listed "computer architecture" in the fundamentals which I agree with, so I would say just start with C but be aware that C is an abstraction, machines operate on machine code, and they really work the way your architecture works.
 
I hear you can do pretty well these days if you just learn how to glue stuff together with basic Python scripts, especially if you've got a maths background. You can probably get away without having to ever write a class.

And honestly, in the web companies I've worked at, the most code by far was being produced by the frontend coders. I don't think they needed to care much about anything as low-level as endianness, and I think they were probably having the most fun.

I don't think there's any royal road to programming, at least not in a way that will make you employable. C is ubiquitous, but it's not a timeless testament to what programming is about. It's a dumb bit of tech that unfortunately got ubiquitous on the back of the equally dumb tech that was UNIX. We love it, but only like a parent loves their retarded child. When pushed, you have to admit it's a tragic state of affairs.
 
C is ubiquitous, but (...) When pushed, you have to admit it's a tragic state of affairs.
Compared to what? What language would you recommend instead that's at a similar point on the "high level <-> low level" spectrum?

I think C is actually quite good, as long as you're not talking about the clown-shoes anything-goes early '80s C.
 
I don't think assembly is worth bothering with for a beginner. You already listed "computer architecture" in the fundamentals which I agree with, so I would say just start with C but be aware that C is an abstraction, machines operate on machine code, and they really work the way your architecture works.

I think it's vital to be exposed to assembly as soon as possible. It's especially important to understand early on that all programming languages are high level and that instructions do not map 1:1 to statements. I don't think the amateur / journeyman should have any particular assembly mastered, but they absolutely should be familiar with the fundamental concepts and common operations that occur across assembly languages (e.g. labels, memory reservations, common instructions like mov, lea, and so on). Knowledge of assembly helps you in many areas where you never even look at assembly; for example, understanding calling conventions is important for writing FFIs.

Furthermore, if you can't read assembly then you needlessly deny yourself really helpful tools like Compiler Explorer. Again, you don't have to have Intel x64 SIMD instructions memorized, but you certainly should be able to read enough to make a general assessment of the differences between -O1 and -O3 on a particular piece of source code.
 
Compared to what? What language would you recommend instead that's at a similar point on the "high level <-> low level" spectrum?
The other languages are long dead. You use C because it's near enough the only systems language left, and ubiquity has always been its own reward in tech. Merit comes a distant second.

But there was a time when C wasn't the only game in town. On the Lisp Machines, the only recommended language was Lisp, which bridged the high and low level spectrum far better than C ever has or will. C never crossed any spectrum even in its conception, which is why UNIX had to invent shell scripting languages when Lisp machines didn't.
 
So I was able to use BeautifulSoup and Selenium to finish a program I wanted to make. It's a little thing, but it checks a website every few minutes to see if products are in stock and then emails me if they appear. It's not much, but it's only the second program I've written for myself that's actually useful and does something I want.

I signed up for a Udemy course that teaches Scrapy and Selenium, and so far it seems really good. I'm hoping for my next project to write a program that puts the items in the cart and actually purchases them. I'm sure other people are already doing it to that site and the site doesn't stop them, and they always beat plebs like me trying to buy stuff.
 
Ledj posted a lot of good advice, and I couldn't agree more with this sentence specifically:

Keep rotating high level and low level practice and your skill will grow immensely. The banal primitives will always remain the most important things to keep in your mind; the good news is that they're the simplest and easiest to remember. Like a musician playing scales, don't forget to warm up regularly.

I would like to add something re: projects. For me, what worked was building something I could actually use. If you don't have any tool in mind, maybe someone you know needs one, and you can help them and yourself at the same time.

In the worst case, if you still have no ideas, enroll in one of the Udemy courses that include projects (they're like $10 and some are really good). They will help you get started, and then you can expand on the course projects and add features you would like them to have.

Personally, some years ago when I wanted to learn Python, I took one of the courses with projects. The projects were quite simple, but when I later got an assignment in college, it was super easy to adapt a project skeleton to my needs, and I ended up impressing the professor with a web app written in Python and hosted on pythonanywhere. If I had had to start the project from scratch, it would have been much harder.
 