Anecdotal, but a lot of the junior devs I work with seem incapable of plain old "figuring things out". Every problem they come across, they send me the error, and ask how to fix it. When I probe into what they have done to troubleshoot... nothing. Bitch, I'm not your personal ChatGPT.
I want to agree, but I must point out that most programming these days seems to involve incomprehensible shit. These people have probably never used real languages and decent tools in their lives.
At worst, google the error.
Googling the error is useful, but I can remember when people used to complain about reaching for Google rather than thinking.
Understand the context, read some documentation, learn to use debugging tools. Anything.
The documentation is usually shit, and the debugging tools are usually pretty bad. I've found strace to be more pleasant than GDB when I can get away with it, but avoiding both is best.
Perhaps this is just a naturally biased observation of being a senior engineer, but a lot of these kids can't do shit. Potentially a serious skills issue in the next decade.
The issue, perhaps, is that these people have never been taught to think, if they even can. I don't know what environments and languages are in use over there, but I'll use Python as an example. I don't even use that shitty language, but I've read about it a decent bit. I vividly remember reading this article:
https://vorpus.org/blog/control-c-handling-in-python-and-trio/ (archive)
Look at how unreasonably difficult handling C-c is in Python. The answer is to use the big fucking library someone else wrote; in that article's case, Trio. In Common Lisp land, I use UNWIND-PROTECT, which has a lineage back to TECO, and I don't have to give a fuck because it's a language primitive. Python, being anything but a real language, just doesn't have a good solution without contortions, and most people who teach it aren't inclined to teach how unsuitable it is for, really, anything. This isn't even touching on the shitty UNIX signal model. Can anyone reasonably fix it, or is a programmer just expected to deal with this unacceptable shit?
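To make that concrete, here's a minimal sketch of the kind of contortion the article describes, assuming a single main thread: swap out the SIGINT handler around the critical section, remember whether a C-c arrived, and deliver it afterwards. The helper name defer_sigint and the cleanup() function are mine, not anything out of Trio or the standard library.

import signal
import time
from contextlib import contextmanager

@contextmanager
def defer_sigint():
    # Record any C-c instead of letting it raise KeyboardInterrupt immediately.
    # Only works in the main thread, which is where CPython runs signal handlers.
    received = []
    def handler(signum, frame):
        received.append(signum)
    old_handler = signal.signal(signal.SIGINT, handler)
    try:
        yield
    finally:
        signal.signal(signal.SIGINT, old_handler)
        if received:
            # Deliver the interrupt only once the protected region is finished.
            raise KeyboardInterrupt

def cleanup():
    # Stand-in for real cleanup work; try hitting C-c during the sleep.
    print("cleaning up...")
    time.sleep(5)
    print("done")

with defer_sigint():
    cleanup()

Even this has windows where a C-c can slip through, e.g. before the replacement handler is installed, or after the original is restored but before the finally block finishes, which is the article's whole point: closing those gaps properly takes interpreter-level machinery, not user code.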
That article dives into the internals of the Python interpreter itself. How many Python programmers are afraid to open up the hood and look around? How many of those who do would be able to get any value out of it, given how large that program is and that it's written in an entirely different language?
It's like this everywhere in shitty systems. The Python bytecode hack heavily resembles the hack Intel added for interrupt handlers, basically blessing a very specific sequence of code to solve an otherwise unsolvable problem.
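If anyone does want to peek under the hood without reading the C, the standard library's dis module will at least show the instruction stream, and the gaps between those instructions are where a signal-raised KeyboardInterrupt lands. The resource/handle API below is made up purely to produce a realistic listing.

import dis

def cleanup_example(resource):
    handle = resource.acquire()  # hypothetical resource API
    try:
        handle.use()
    finally:
        handle.release()

# Any instruction boundary outside the try/finally's protected range, e.g.
# right after acquire() returns but before the try body is entered, is a
# window where a C-c skips handle.release() entirely.
dis.dis(cleanup_example)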
Needing that article at all touches on a different issue, too. Why should programmers be expected to find some random person's writing on the Internet to get this kind of knowledge? It's clearly because the documentation is shit. Every time I see people praise Beej's Guide to Network Programming and hear how important it was to them, I wonder why a single guy was able to write something so important in the first place; in this case, it's partly because people don't even know where to find the official documentation.
Argument to past stupidity is rampant in programming. The Linux kernel uses programming practices that were fucking retarded fifty years ago. Ask them, and they'll explain how they can't possibly have been fucking retarded for the past twenty years, so clearly they're doing things correctly. Why the fuck should a human being want to learn any of this worthless shit, especially if he's just in it for the money, which he likely is?
So, there's an answer: Almost no one gives a fuck, almost everyone who does give a fuck is wrong, and the systems are shit anyway.