That reason is high cost of replacement and backwards compatibility. "Close to the hardware" is a meme honestly - the hardware that C is close to is a fucking PDP-11. C makes vectorization a pain in the ass, says nothing about pipelines and out-of-order execution, and until not so long ago didn't even consider parallelism. Meanwhile hardware manufacturers bend over backwards for C, because they know that a new architecture isn't worth anything if it can't deal with fifty megatons of legacy crap. Those fifty megatons are the real reason to learn C.
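To make the vectorization complaint concrete, here's a minimal sketch (my code, nobody's production routine) of the hint-dropping the language forces on you. Whether the loop actually turns into vector instructions is entirely up to the compiler and flags; C itself gives you no way to demand it.

#include <stddef.h>

/* dst[i] += a * src[i]. The `restrict` qualifiers promise the arrays don't
 * overlap; without them the compiler has to assume they might alias and will
 * often emit runtime overlap checks or plain scalar code. Actually guaranteeing
 * vector instructions means dropping down to intrinsics, not ISO C. */
void saxpy(size_t n, float a, float * restrict dst, const float * restrict src)
{
    for (size_t i = 0; i < n; i++)
        dst[i] += a * src[i];
}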
Edit: I forgot another fun one: almost every extant CPU sets a flag when an arithmetic operation overflows, but C doesn't expose that. In fact, C provides basically nothing for overflow checking at all, even though this is super important for a language without bignums. At least that will be fixed in C2x with the checked-arithmetic macros in <stdckdint.h>.
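For a rough idea of what "exposing it" looks like in practice, here's a minimal sketch using the GCC/Clang __builtin_add_overflow intrinsic; the C2x ckd_add() macro from <stdckdint.h> behaves the same way. Assumes a reasonably recent GCC or Clang.

#include <stdio.h>
#include <stdint.h>

/* Adds a and b, reporting overflow instead of silently wrapping
 * (unsigned) or hitting undefined behaviour (signed). */
static int add_checked(int32_t a, int32_t b, int32_t *out)
{
    /* Returns nonzero if the true result doesn't fit in *out.
     * Under C2x this would be ckd_add(out, a, b). */
    return __builtin_add_overflow(a, b, out) ? -1 : 0;
}

int main(void)
{
    int32_t r;
    if (add_checked(INT32_MAX, 1, &r) != 0)
        puts("overflow detected");
    else
        printf("%d\n", r);
    return 0;
}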
I can't disagree with this statement enough. C is an incredibly simple language to learn and understand. Sure, some of its rougher corners, like double pointers, might be complicated for someone who started programming in Java. But the rules of the language are simple.
You're mocking the PDP-11, but in the grand scheme of things it's not that different from hardware we still run today. x86 is still CISC-based, and we still have to write compilers that target x86. Speaking of compilers, C probably has more compilers than any other language out there. So while C might not be a perfect hardware abstraction, it clearly does a better job than the competition. Many CPUs and microcontrollers are programmed in C, so I find your claim absurd to the nth degree.
#include <limits.h>
It's not that complicated.
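To spell out what that header buys you: the constants in <limits.h> let you check before the arithmetic happens, which matters because signed overflow itself is undefined behaviour. A minimal sketch in plain ISO C, nothing compiler-specific:

#include <limits.h>

/* Returns 1 and writes a + b to *sum if it fits in an int, 0 otherwise.
 * The range check comes first precisely so the overflowing addition is
 * never evaluated. */
int add_in_range(int a, int b, int *sum)
{
    if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
        return 0;
    *sum = a + b;
    return 1;
}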
Also, the point about C being backward compatible with hardware and software from the PDP-11 onward basically proves my point that C is the go-to language for beginners learning computer science. I could have said C++, but I know it would filter 80% of CS graduates today.
I said I don't think C is a good first language (and also that "for Dummies" books suck). I have written in C for a long time and I enjoy it.
What "low level details" does C teach you? 99% of the C anyone will learn today will never touch paging, calling conventions, the difference between system calls and function calls, interrupt handling, registers, the integer value of pointers (the fuckery the GCC does with the value of pointers will make your head spin), instructing encoding, and memory layout. Sure, C isn't Haskell, but it's still fairly high-level.
I think learning C is good even if you don't use it regularly (the same goes for almost all programming languages), because it teaches you to think about what memory allocations you are making and how to think explicitly in terms of references. (All programmers should have some grasp of influential programming languages like C, FORTRAN, Lisp, Standard ML, some dialect of assembly, Smalltalk, and even Java.)
No denying that the "For Dummies" books are terrible. I understand where people are coming from when they say that C is not a good starting point, but honestly, if you can get to the point where you understand C really well, everything else becomes easy. The hardest part of the language is probably pointers. And even then, the basic concept itself is pretty simple - it's the implications of using pointers that trip people up.
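Since double pointers came up earlier in the thread, here's a minimal sketch of the "simple concept, tricky implications" point: a pointer-to-pointer is just one more level of indirection, but it quietly hands the caller ownership of an allocation.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* The double pointer lets this function change which block of memory the
 * caller's pointer refers to; a plain char * parameter could not do that. */
int make_greeting(char **out)
{
    char *buf = malloc(32);
    if (buf == NULL)
        return -1;
    strcpy(buf, "hello");
    *out = buf;              /* the caller now owns this allocation */
    return 0;
}

int main(void)
{
    char *msg = NULL;
    if (make_greeting(&msg) == 0) {
        puts(msg);
        free(msg);           /* the implication: freeing it is on you */
    }
    return 0;
}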
Maybe you have a different first-language recommendation in mind, but all too often I hear people recommend an interpreted language like Python, a higher-level managed language like Java, or one of the nu-tech languages like Rust. Python, or any other interpreted language, is significantly more abstracted than systems languages. It's OK for teaching algorithms and whatnot, but as a language for real projects it has too many problems. It doesn't teach people about type systems (Python is sometimes described as strongly but dynamically typed) or memory management, and the design of the language encourages people to organize their code in a way that makes it hard to start larger projects. And running into failures is one of the fastest ways to learn.
Java (and also C# and several other languages) is a better choice because it at least teaches people early on the consequences of not thinking through their project design before starting. But they are also lacking: the languages have a garbage collector, and they come with a ton of vendor lock-in that keeps people running code on the same systems and teaches them to write code that is not portable because "it works great on Windows". Mixing and matching language and operating system is of course possible with C or C++. Syscalls are a thing, and I remember making my Visual Studio projects change the terminal text to green with a syscall when I was learning. But I got out of that habit because I learned that it's not portable at all. In general, programming inside a sandbox like those languages can be useful in, say, the enterprise. But I don't think they are a great way of teaching CS theory, because "Java already has a UI library, so there's no reason to teach the theory, only the API". The fact that these sandboxes also attract low-effort enterprise coders is another drawback in my experience.
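To be fair, the green-text trick was really the Win32 console API rather than a true syscall; roughly (from memory, so take the exact calls with a grain of salt) it looked like the sketch below, and the #ifdef is the whole portability argument in miniature.

#include <stdio.h>

#ifdef _WIN32
#include <windows.h>
#endif

static void make_text_green(void)
{
#ifdef _WIN32
    /* Win32 console API: only exists on Windows. */
    HANDLE h = GetStdHandle(STD_OUTPUT_HANDLE);
    SetConsoleTextAttribute(h, FOREGROUND_GREEN);
#else
    /* ANSI escape sequence: works on most Unix terminals
     * (and on recent Windows consoles, with caveats). */
    fputs("\x1b[32m", stdout);
#endif
}

int main(void)
{
    make_text_green();
    puts("it works great on Windows");
    return 0;
}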
I'm not even going to start on the nu-tech languages like Rust, Golang, or any of the other language abominations. As far as I can tell, real innovation in programming languages stopped in the late '90s and early 2000s, and now the "advancements" in this field just exist to enrich committees and offer a vector of attack for the woke commissars.