I appreciate that I'm in a minority here, and lots of people think that UNIX pipes are cool. To be fair, in a highly networked world it is a good idea to make streaming data a priority. But streams aside, the fact that anyone thinks UNIX pipelines are pleasant is proof of how brain-damaged UNIX has made us, and apparently the brain damage is so bad that it's infected PowerShell.

If programs were functions, they would compose just like any other function with even less ceremony:
Code:
bar x | foo
is just
Code:
foo(bar(x))
I mean, what the hell is the thought process?
"We have small units of functionality in UNIX called programs. These programs have an input and output."
Okay. So far so good. You've got functions.
"These programs input and output text."
Okay, hang on.
"To send the output of one program into another program, you ask the kernel to create a special data structure called a pipe with an input and an output end. You tell the first program that its output is the input of this pipe, and the second program that its input is the output of this pipe."
Err...what the fuck? If you want to send the output of a function f to a function g, why can't you just write g(f(x))?
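And look at the ceremony hiding under that one | character. Here is roughly what a shell has to do to wire up bar x | foo, as a minimal sketch using the real POSIX calls (bar and foo are the made-up programs from above; error handling omitted):

Code:
#include <unistd.h>
#include <sys/wait.h>

int main(void) {
    int fds[2];
    pipe(fds);  /* ask the kernel for the pipe: fds[0] is the read end, fds[1] the write end */

    if (fork() == 0) {                    /* first child runs "bar x" */
        dup2(fds[1], STDOUT_FILENO);      /* its stdout becomes the pipe's write end */
        close(fds[0]); close(fds[1]);
        execlp("bar", "bar", "x", (char *)NULL);
        _exit(127);                       /* only reached if exec fails */
    }
    if (fork() == 0) {                    /* second child runs "foo" */
        dup2(fds[0], STDIN_FILENO);       /* its stdin becomes the pipe's read end */
        close(fds[0]); close(fds[1]);
        execlp("foo", "foo", (char *)NULL);
        _exit(127);
    }

    close(fds[0]); close(fds[1]);         /* parent must close its copies, or foo never sees EOF */
    while (wait(NULL) > 0) {}             /* reap both children */
    return 0;
}

Two forks, two execs, a dup2 apiece, and six closes, all to express g(f(x)).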
I dunno. I sort of agree with you, but then again I just finished an Arch Linux reinstallation last night (went okay!), and one of the system administration pages of the ArchWiki had something interesting to say about the UNIX "Everything is a File!" mantra, which has been fresh on my mind. (And yes, I'm that dork who reads the ArchWiki cover to cover while doing an install...)
... One of the most important of these is probably the mantra: "everything is a file," widely regarded as one of the defining points of UNIX. This key design principle consists of providing a unified paradigm for accessing a wide range of input/output resources: documents, directories, hard-drives, CD-ROMs, modems, keyboards, printers, monitors, terminals and even some inter-process and network communications. The trick is to provide a common abstraction for all of these resources, each of which the UNIX fathers called a "file." Since every "file" is exposed through the same API, you can use the same set of basic commands to read/write to a disk, keyboard, document or network device.
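That "same set of basic commands" claim holds all the way down at the syscall level, too: the same open()/read() pair works no matter what kind of resource the path names. A quick sketch to make it concrete (the example paths are just common ones; swap in whatever exists on your system):

Code:
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

/* One function, any resource: read() does not care what the fd points at. */
static void sample(const char *path) {
    char buf[16];
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror(path); return; }
    ssize_t n = read(fd, buf, sizeof buf);   /* identical call for every kind of "file" */
    printf("%s: read %zd bytes\n", path, n);
    close(fd);
}

int main(void) {
    sample("/etc/hostname");   /* an ordinary document */
    sample("/dev/urandom");    /* a kernel device */
    sample("/proc/uptime");    /* text the kernel generates on the fly */
    return 0;
}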
So in that regard, it seems pretty elegant: as long as the 'thing' you want to interact with (virtual or physical) can be crudely bent into the file metaphor (can you 'write' to it, whatever that means? And can you 'read' bytes from it, whatever that means in your context?), then all of a sudden all of the UNIX tools can work with it with 'no dramas', and you can chain them together with pipes and have fun and all that. Well, ideally, anyway.

And it isn't just programs and documents, either: the file metaphor also extends directly to hardware peripherals. For example:

A simple tool, such as cat, designed to read one or more files and output the contents to standard output, can be used to read from I/O devices through special device files, typically found under the /dev directory. On many systems, audio recording and playback can be done simply with the commands "cat /dev/audio > myfile" and "cat myfile > /dev/audio," respectively.
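And the reason cat can pull that trick is that it doesn't contain a single line of audio code. Strip away the flags and the multi-file handling and cat is basically a read()/write() loop over file descriptors, so it genuinely can't tell a sound card from a text file. A toy version (no options, stdin to stdout only):

Code:
#include <unistd.h>

/* Toy cat: copy stdin to stdout. Because read() and write() work on any
   file descriptor, this same loop handles documents, pipes, terminals,
   and device files alike. */
int main(void) {
    char buf[4096];
    ssize_t n;
    while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0)
        write(STDOUT_FILENO, buf, n);
    return 0;
}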
Of course, I'm not fangirling for UNIX or anything! I still have the same reservations that I imagine you do: the function (or object) abstraction seems like it would actually be simpler and more flexible these days. But I guess for its time, back when all of the hardware peripherals were simpler too, the file metaphor was a neat way to get everything conforming under a simple abstraction, without having to worry about adapters for incompatible interfaces or anything like that (much).