I made an account to comment in this thread. Some of you have the right ideas, but others seem to have misconceptions about why computers exist. Computers exist to automate work, and can be worse than useless if the automation is flawed, just as clocks exist to tell the time and a wrong clock can be worse than no clock. I've seen some of this flawed thinking in this thread, though I only read the last fifty pages. Any roadblock to automation makes a computer flawed in a different way. The automation of automation is also very important.
Nonsense such as the C programming language is an affront to good design, and directly impedes correctness and automation. At best, one should be lightly familiar with the language, purely to interface with it from better languages, such as Ada, and to argue about it on the Internet; many bizarre constructs in the C language are unknown even to programmers who claim to know it. It is neither simple nor does it reveal the inner workings of computers; machine code is suitable for some of that, yet many C programmers ignore everything below C without good reason.
I've not seen APL mentioned at all, although the search functionality for this website may not allow for searching such short strings. New programmers should probably learn APL before anything else. It's not particularly suited to automating automation itself, but it's so concise that this won't matter for many problems; furthermore, it teaches a style of programming that is implicitly parallel, and can prevent a new programmer from getting stuck in the one-at-a-time thinking otherwise so common. The core of APL programming is transforming and reshaping data, like a sculptor, until the solution emerges, and this is how almost all programming should be done. GNU provides a suitable implementation of APL, but I can't give advice for learning it beyond reading some old books.
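To make the contrast concrete for those who haven't seen the style, here is a minimal sketch in Python with NumPy, since I can't assume readers have an APL interpreter handy; the function names are my own illustrations, not anything standard. The loop version is the one-at-a-time thinking I'm warning against; the array version expresses the same computation as a single whole-array transformation, in the spirit of APL's `+/xs>t`.

```python
import numpy as np

# One-at-a-time thinking: an explicit loop, inspecting each element in turn.
def count_above_loop(xs, threshold):
    count = 0
    for x in xs:
        if x > threshold:
            count += 1
    return count

# Array thinking: one whole-array expression, implicitly parallel.
# The comparison produces a boolean array; the sum reduces it at once,
# much like APL's reduction  +/xs>threshold .
def count_above_array(xs, threshold):
    return int(np.sum(np.asarray(xs) > threshold))

data = [3, 7, 1, 9, 4, 8]
assert count_above_loop(data, 5) == count_above_array(data, 5) == 3
```

The point isn't brevity alone: the array version states *what* is computed with no sequencing at all, so an implementation is free to do the work in parallel, which is exactly the habit of mind APL instills.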
The entire state of computing is rotten, and no suitable computers currently exist; Free Software groups are generally preoccupied with poorly mimicking proprietary software that isn't very good. The Linux kernel, for instance, runs to tens of millions of lines to implement just one part of UNIX, without even a decent memory-exhaustion strategy. Alan Kay and others made good progress towards reinventing computing at the Viewpoints Research Institute, but that research is now defunct. Regardless, it's still worth a glance every now and again. Personally, I take issue with what currently passes for machine text, for both programming languages and human languages; better can be done on both ends.
I'd be glad to share some educational resources from me and others, if there's interest.