It's like people forgot what software even is. Software is a tool. If the software is the right tool for the job, then the software is perfect. I can write a book in WordStar on my 286. That software is from the 80s, but it will handle the job perfectly well. The company that made it doesn't exist anymore, I don't have the source code, and that source code might not even still exist anywhere. It will still handle the job, because it is a tool for that job (as is the 286 running that particular software). Are there better or different tools for the same job? Perhaps. It doesn't matter. It doesn't diminish or invalidate the fact that WordStar can work in that role.
There's such a thing as scope and feature completeness. If Software 1.0 is an excellent text editor, but 2.0 also adds the ability to post pictures to social media and write drunken political emails to Null, that does not make 2.0 better or more suited, as a tool, for the job I have for it, which is writing text. Why do I even have to explain this? Why does anyone? When did writing and using software become some cargo-cult-esque ritualistic process where you are safe from the evil demons only if you perform the right incantations in the correct order at the correct times? Is this the result of programmers and users having no idea what they're doing?
I've been using a window manager that hasn't seen updates in seven-plus years. Will my computer explode? No, because it manages windows within the same scope I needed it to seven years ago. That stability is a good thing, not a bad thing, because it means I don't have to waste my time. I'm not sure how to put it in simpler words.