The Linux Thread - The Autist's OS of Choice

A lot of features 'introduced for business reasons' ironically tend to be amazingly bad for the business
It's usually for harvesting or because c suites are like magpies who think new features can't ever be bad and just go with it.

Anybody worth their salt in an enterprise environment will see "business reason """"upgrades"""" " for what they really are and debloat the fuck out of any OS image prior to deployment across the business/using it to build new devices. See HP wolf security as a good example, that thing is a heap of bloatware shit and does nothing that an enterprise EDR/NIDS cannot handle. It's literally useless in enterprise (and quite frankly in general for your average nigger). If your desktop support team don't pry that out of the image used for new Windows user devices along with the HP help centre and other unnnecessary garbage they all need to be shot.
 
(Long one)

Thing is though, find and awk are external utilities and not part of bash, which is also only one of many available shells. If you have a better solution for these tools, you are free to use it. Or script one for your specific use case by gluing several, simple tools together, the Unix way.
Respectfully I'm going to say I think you're missing some of what I'm getting at. Regards the separation of say find and awk from bash, one of my points is the pipelining in Bash is all text mangling and that's where a lot of its fragility and overloading comes from. Even were you to write your own tools you don't have object pipelining with a legion of inbuilt types that the Windows OS has. You can't have that even if you cobble together some object pipelining in Bash through some Frankenstein because Linux doesn't expose itself as objects the way Windows does. It's just not a Bash thing. I could write a bunch of Python equivalents to GNU tools that I thought were better but I still wouldn't be able to quickly assemble them the way I can do in Powershell because Bash pipelining is just text. Further, I mentioned the standardised way cmdlets are written - they follow a clear and modern standard that requires things like exposing its parameters and documentation to the shell so that, for example, I am able to tab-complete through the options of one even if it's the first time I've ever used it. Bash can't do that with the flags for find or awk and you probably can't write it at this point because long-standing basic tools aren't written to any such standard. That's not just a UI trick, it's the sort of thing that directly comes about by having planned the environment out to have its tools all derive from a cmdlet class.
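As a rough illustration of the object pipelining I mean (just a sketch; the size filter and sort are arbitrary):

Code:
# Each item flowing through the pipe is a FileInfo object, not a line of text,
# so the next command reads typed properties instead of re-parsing output
Get-ChildItem -Recurse -File | Where-Object { $_.Length -gt 1MB } | Sort-Object LastWriteTime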

So the distinction you're drawing here misses that there is a close symbiosis between the shell and the tools written for it. Which ties into the "if you have a better solution for these tools you are free to use it" where I'd say it's even more so for the reasons above. As well as other reasons such as Powershell supports modern exception handling. So whatever tools I write can raise standardised exceptions that the environment will better handle, not just return an integer. So again, the environment influences the nature of the tools you deploy in it.

I'd extend that to scripting one for a specific use case as well - having proper exception handling, in-built verification of script signatures for secure distribution of whatever you write and, well, since you raise scripting lets introduce another comparison. What bash script writer hasn't used IF in their script. A basic control flow operator which in Bash blows up if you get the whitespace wrong, uses = for comparison when using it for assignment elsewhere, uses = or -eq depending on the type of the data you're comparing, or different syntax for parsing a variable for comparison if it's a string or an integer (and a gotcha for reals). And sometimes I need single [ ] and sometimes double [[ ]]. Arithmetic comparisons can use ( ). And of course all those else and then's.
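To make that concrete, here's roughly the same kind of check written the various ways Bash wants it (a sketch; the variable names and values are just placeholders):

Bash:
myValue=''; myCount=0   # placeholder values

# String test: single brackets need spaces and careful quoting
if [ "$myValue" = "" ]; then echo "empty"; fi
# if ["$myValue" = ""]; then ...   <- breaks: no space after [

# Integer test: a different operator entirely
if [ "$myCount" -eq 0 ]; then echo "zero"; fi

# Double brackets: a bash-ism with its own rules
if [[ -z $myValue ]]; then echo "empty"; fi

# Arithmetic context: yet another syntax
if (( myCount == 0 )); then echo "zero"; fi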

In Powershell:
Code:
if ($null -eq $myValue) {
    continue
}

And it will always be that standardised.

If I wanted to do my own version of Select-Object I could just go to the open source here and create a derived class from it and because Powershell tools are a lot more true to the UNIX philosophy than GNU tools are, it would actually be a lot easier because they're small and tightly focused. There's also the in-built support in Powershell for signing tools to make distribution easier which goes beyond whether your tool is approved for your distro's repository or not. Which again, is highlighting that the distinction between tools and environment is less clear-cut than suggested, imo.
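To show the signing support I mean, it's built into the shell rather than bolted on (a minimal sketch; the certificate lookup and script name are placeholders for whatever your org actually uses):

Code:
# Sign a script with a code-signing cert from the current user's store so it can be
# distributed and run under an AllSigned execution policy
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath .\My-Tool.ps1 -Certificate $cert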

Anyway, yes, you are correct that some of my comparison did bring in the tools as opposed to purely the scripting language itself, but I think I've argued here that the one informs the other. Without object pipelining you simply can't write your tools in the same way as they could be written in Python. They're either constrained or you're implementing your own Frankenstein shim to try and emulate it, and that won't be compatible with all the existing tools. Object pipelining goes to the heart of why I was able to extend my original PS script simply to accommodate further requirements, but with Bash I ended up changing my approach fundamentally.


Therein lies the actual strength of the fairly heterogeneous Linux ecosystem. If something doesn't feel intuitive to you, build a solution that is. I know this sounds fairly "tl;dr" or dismissive, but it's not meant that way at all. You can do some real magic. I notice it's something people coming from other OSes and workflows often don't seem to understand.
Not taken as dismissive or hostile in any way, no worries. I am enjoying this. But to be clear, as mentioned in my original post, I used Linux (well, HP-UX 11 actually) before I used Windows. My progression of OSs has gone Sinclair Spectrum -> Amiga -> UNIX and Linux -> Windows. I confess I'm not guru-level with Bash, especially after not using it professionally for a few years, but I'm not a Windows user suddenly arriving here without any background.

Regards Linux being more heterogeneous than Windows, well sure! :) Everything in the Windows OS is an object. Whether I'm calling Get-ChildItem or Get-Acl or Get-Credential in my Powershell script, they all follow the same principles and interoperability. This isn't really surprising, it's the leap-frog effect. Microsoft was able to look at everything that was learned from Bash over the years and start with a nice clean slate. Systemd is trying to bring more uniformity to how you interact with the OS in Linux but it's well, systemd. In the sense that you mean though, I don't think there's anything that limits you on Windows from writing your own tools compared to Linux. But if there's less need to, that's a win for Windows.

They're always looking for one standard way to do things, one baseline, not understanding that it is often truly meaningless for the job at hand to have one. I'm sure some people will now come with the usual "I'm too busy and too professional and important and I can't be assed to do something like that" but to be honest, I've seen people spend insane amounts of time and energy on bizarre workarounds to force their proprietary OS of choice to do a thing a specific way, which would've never happened if they just used Linux.
Respectfully, I'm going to press 'F' to doubt. My experience in Enterprise is that the UNIX sysadmins were far more inclined to disappear into their own navel for ages and leave a trail of custom scripts around than the Windows admins.

Great, it's intuitive. I can remember `grep` though, I can't remember whatever on earth that is.
I can remember grep too. We both can because we've been using it for ages. But you can't remember Select-String? Can you always remember things like the numerous string interpolation flags like %n, %c, %Hd that are built into the stat command and unique to it? Or is it easier to just reference a named property of a file object like LastWriteTime? Set aside the years of familiarity: which is honestly easier to remember?
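To put that side by side (a sketch; the file name is just a placeholder): with stat I have to remember that %n is the name and %y the modification time, whereas the property name travels with the object whichever cmdlet produced it:

Code:
# stat -c '%n %y' report.txt        <- format codes specific to stat
(Get-Item .\report.txt).FullName
(Get-Item .\report.txt).LastWriteTime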

I sort of wish MS had just straight-up ported Bash, Powershell is far too wordy to be useful.
It exists, you can use Cygwin on there. Or if you're using WSL you have native Bash because you're running an actual Linux kernel with access to the Windows file system. But as I went into, every part of the Windows OS exposes itself as an object and Bash is text-based so they don't fit well together.

I'm not sure what's up with the useless use of awk, but the proper, totally readable solution is:
touch --date='365 days ago' /tmp/365 ; grep -lr mystring mydir | xargs -I ^ bash -c '[[ ^ -nt /tmp/365 ]] && stat -c %n\ %z ^ '
Perhaps the best criticism of Bash is simply the fact that I genuinely cannot tell if you're joking or not. :)

No, the realistic example is
Bash:
find ./ -name "*myString*"
Um, no. Your command searches for files named according to a certain pattern. My example is searching for files containing a certain pattern.
I think a lot of your issues come from just misunderstanding these commands or lack of knowledge. You don't need grep because find already has the -name option, and you don't really need xargs either
I must really be missing something here. That's not how find works. Oh, okay - I've skipped ahead and @davids877 has pointed this out. I'll address the other parts that are still relevant.
Powershell also accesses the filesystem for every single file, it just does this at the beginning and then it works with the data in memory.
That's kind of my point. It reads in file objects and passes these on via pipelining. It's not about whether it needs to access the same number of files; it's whether it needs to access the same files multiple times. It doesn't know or care what the next step is, but that next step is less likely to have to go to the file system to get the information it needs, as opposed to getting a list of string representations of file locations that are nothing more than that. I used stat in my example but it could easily be some custom cmdlet or script. It's good modular practice to be able to treat the recipient as a black box so that people can work independently and so that changes don't break anything. When you're transferring information as lines of text with a delicate format, it's a lot harder to ensure that modularity. Just look at anything involving Awk and how it imposes strong expectations on the output format of any predecessor in the chain. Again, a direct consequence of text-based pipelining.

And that's another thing you've just brought up. With Powershell I have two ways of passing my set of file objects. I can do this:

Get-ChildItem -Recurse | Format-Table -Property FullName, LastWriteTime

or I can do this:

Format-Table -InputObject (Get-ChildItem -Recurse) -Property FullName, LastWriteTime

What's the difference? Well the former pipes each object to Format-Table individually and the latter passes an array of file objects in one go. So I have a great deal of control over how to tie different commands together in Powershell that allows me to write the most performant code with the most appropriate error handling. Someone who knows Bash better than I do correct me but I don't think there's an easy way to do that in Bash.

To be honest I think linux should probably always be the 'get filtered' operating system. Companies can always spin off stuff like android and steamOS for the retards and still cast darts at microsoft without niggerifying the userbase even further
This only works if management are willing to actually enforce filtering and you are actually willing to endure another year of sixty hour weeks because they can't find anybody who can pass the filtering. What I actually see in IT is not "this is a complex job so only smart people get hired" but rather "people get hired and the job is too complex for them". You know I'm right.
You're right, I misread while skimming his wall of text
Oooh, meow!

find ./ -type f -name "*myString*" -mtime -365 -exec grep -l 'string' {} + | xargs stat -c %n':'%z | awk -F: '{print $1, $2}'
That's still wrong. Even after @davids877 corrected you, you're putting -name "*myString*" into find. Let's just ignore that part though and pretend you didn't add it. This is actually good - you're now pretty much making my points about intuitiveness, readability and performance for me. Let's pick this apart.

1. The Powershell version handles calendar dates. I said twelve months, you've done 365 days. I could have easily said one month and then days becomes even less of an accurate match than just being off by one now and then. Do the above with an actual calendar month and compare it to (Get-Date).AddMonths(-1). This will be fun. :)
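For reference, the calendar-aware filter I have in mind is just this (a sketch; the path and recursion are placeholders):

Code:
# AddMonths(-1) respects month lengths and leap years, no day-counting required
Get-ChildItem -Recurse | Where-Object { $_.LastWriteTime -gt (Get-Date).AddMonths(-1) }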

2. Far less intuitive. You can't convince me all the command-specific flags such as %n and %z which are built into stat are as intuitive or easy to recall as dealing with actual named properties of standard objects. The file object has LastWriteTime and it will always be that property regardless of the command operating on the object. Same for any other properties of the object like ACLs (permissions) or others. You don't get commands implementing their own command-specific terms for their properties because that's fundamentally against the concepts of Powershell.

3. Performance. You're incorrect about the number of file accesses. Even though you've put them in a combined statement, your sequence is still: recurse through all files, pass each one to a separate grep call, pass each result to a separate stat call. That's three accesses - willing to be corrected if Bash is smart enough to combine these but I don't think it is. Now comparing with Powershell, it's more performant in two ways:

Get-ChildItem -Recurse *.* | Where-Object { (Select-String 'myString' $_) -and $_.LastWriteTime -gt (Get-Date).AddMonths(-12) } | Select-Object -Property FullName, LastWriteTime

a. File accesses. We have the recurse through the file system, and we have the access to each file to search through the contents. That's it. We're at two-thirds the number of file accesses.

b. Method of passing. This is a fun one. Unless I'm mistaken each pipeline in Bash will be a separate call. So for each file found, it makes a call to grep. For each positive grep call it makes a call to stat. Mine does that too but if I know I'm dealing with a large number of files or plan to, I can handle things better. To illustrate I'll need to make a small change and swap that last cmdlet from Select-Object to Format-Table to make the point. But the output and principles are the same. So here is our original approach (barring using Format-Table):

Get-ChildItem -Recurse *.txt | Where-Object { (Select-String 'Bank' $_) -and $_.LastWriteTime.AddMonths(12) -gt (Get-Date).AddDays(-1) } | Format-Table -Property FullName, LastWriteTime

But Powershell also lets me write it like this:
Format-Table -InputObject (Get-ChildItem -Recurse *.txt | Where-Object { (Select-String 'Bank' $_) -and $_.LastWriteTime.AddMonths(12) -gt (Get-Date).AddDays(-1) }) -Property FullName, LastWriteTime

What is different? Well, I'm now passing the array of file objects all at once. No longer am I simply pipelining things as they come in; I'm building a single array of the objects and passing it over in a single call. Much more performant. And it was trivial to change my approach. We're working with simple examples here but this all applies if I were dealing with more sophisticated in-house cmdlets that maybe a team mate wrote.

It's good stuff! :)
 
I'm not sure what's up with the useless use of awk, but the proper, totally readable solution is:
touch --date='365 days ago' /tmp/365 ; grep -lr mystring mydir | xargs -I ^ bash -c '[[ ^ -nt /tmp/365 ]] && stat -c %n\ %z ^ '

I was going to try and do it in a Perl one-liner, but those braincells have gone missing.
He did that awk there because he wanted some specific format; I think he wanted to remove the timezone stamp.
What is different? Well, I'm now passing the array of file objects all at once. No longer am I simply pipelining things as they come in; I'm building a single array of the objects and passing it over in a single call. Much more performant.
Didn't read the rest of the post because I don't really care, but I appreciate your passion!

You can still do all this in bash, it'll just take some more lines of code so it won't be as easy or as pretty. Filesystem operation performance really isn't something I have ever encountered as an issue when using either powershell or bash, and if performance is your main goal you probably won't go for a shell in the first place, but for an actual programming language which also has a much better syntax than powershell.

maybe a team mate wrote
Ah that's why you do walls of text, you work for a company that uses Windows, lol. Why are you guys building internal tools in powershell?
 
Didn't read the rest of the post because I don't really care, but I appreciate your passion!
I mean, you didn't really read the first one given your attempt to rewrite my Bash was wrong. But if you're not interested in what I wrote, why bother replying at all? And if you haven't read my post how do you support statements like "you can still do all this in bash" when you don't even understand what's been raised? And for things like this, they just suggest you really don't have much understanding even of Bash or Linux:
Filesystem operation performance really isn't something I have ever encountered as an issue when using either powershell or bash,
File system operations are one of the largest bottlenecks in most tasks, beaten out only by network operations.

and if performance is your main goal you probably won't go for a shell in the first place, but for an actual programming language which also has a much better syntax than powershell.
If the bottleneck is accessing files then it doesn't really matter if you write it in Powershell or hand-code it in C, your performance is constrained. And all this because you can't even count the number of file accesses in your own Bash example and because you think reducing file access by a third is meaningless because in your career filesystem access has apparently never been the bottleneck in a script! Somehow! :)

So if you're not interested in what I wrote just skip it entirely, but as you are stating a position then please tell me what this "much better syntax than powershell" is that you refer to. Seeing as you are unwilling to compare Powershell with Bash (the point of all this), let me know what your point of contention is in comparison to "an actual programming language". This should be good.
 
It's looking like LMDE 6 may be the linux distro to recommend to new users as it's highly stable and cinnamon is working really well.
Are you using it yourself, any thoughts? Debian is my go-to distro for personal use but I drifted towards Mint because it's so pleasant to use. Mind you, Gentoo used to be my home system, so it's all relative... ;)

I have to spin up a new Linux laptop so may give this a go.
 
Are you using it yourself, any thoughts? Debian is my go-to distro for personal use but I drifted towards Mint because it's so pleasant to use. Mind you, Gentoo used to be my home system, so it's all relative... ;)

I have to spin up a new Linux laptop so may give this a go.
I'm still just using Debian on my server. From what I can tell LMDE should be as stable as a Linux distro can possibly be while providing an interface that is intuitive and user friendly. Biggest drawback is that it won't have the latest versions of apps and drivers, so games may not perform as well as on more cutting-edge distros. But if LMDE can prove itself as a stable distro it easily becomes the distro you could recommend to someone who just wants something that works.

That being said, looking at the charts it seems that MX Linux is more popular, but I know very little about it. It has XFCE as the default desktop environment so it could probably be very good for gaming
 
That being said, looking at the charts it seems that MX Linux is more popular, but I know very little about it. It has XFCE as the default desktop environment so it could probably be very good for gaming
I used Xubuntu for a while on an older laptop because I figured Xfce would be lighter and quicker. I like to think it was but in all honesty I never really noticed that much difference. Not really a gamer though, fwiw.
 
I used Xubuntu for a while on an older laptop because I figured Xfce would be lighter and quicker. I like to think it was but in all honesty I never really noticed that much difference. Not really a gamer though, fwiw.
The more I look at MX Linux the more it pisses me off that I never really noticed it before. It has an "Advanced Hardware Support" edition that has the 6.6 Linux kernel, has support for SysVinit, and appears to be a rock-solid generic Linux distro with a version that would be very good for gaming
 
In Ruby, this very synthetic example problem looks like:

Code:
Dir["."].select{|x|x.mtime>1.year.ago}.each{|f|File.readlines(f).any?{|l| puts "#{f} #{f.mtime}" if l.include?("myString")}}

There's probably some idiomatic way to do it more compactly, and probably a better language for compact representation. But this is closer to an oranges vs oranges comparison. "*.*" as a search query on Unix will not return files unless there is a period in them. Here, you'll completely avoid any of the ridiculously edge-case bloat that is an "object pipeline".

In practice, in bash, I fully decompose each of these steps: gen a filename list with find. Look at it. Gen a list of grepped files that match your query. Look at it. Gen a report with your formatting from those files. Look at it. Submit. You're not composing the steps into a pipeline unless you're trying to save a few seconds when you hit this problem again.

The notion of "exception handling" here is absurd. Your exception handling is you looking at the results to see if this looks sane. But if you're really all-in on the OOP model, the Ruby solution will manage exceptions for you.
 
In Ruby, this very synthetic example problem looks like:
I don't know what's 'very synthetic' about it. I frequently find myself grepping through files looking for something. And it's not that rare that I want to limit it by some criteria such as "I know it is a file from the last month". And in a work context I can well see cases for this. But really the point is to compare Bash and Powershell as these are the two primary scripting languages on the two biggest OSs. And if you want to throw MacOS in with its Zshell, that pretty much counts too. And for purposes of comparison I don't see a gain in doing something more complex for the sake of it. My example shows and contrasts the differences I wanted to talk about.

I'm not sure Ruby is a more "oranges to oranges" comparison because what kicked this off was someone saying how they couldn't do the things on Windows that they could on Linux. And I'm fairly sure they weren't talking about utilizing 96 cores but talking about it as a general environment for the technically minded. I know Ruby is out there and it is OO like Powershell, but the comparison is based on what the OS has. I honestly don't know anybody who has a Ruby shell / uses Interactive Ruby as a daily driver for working on Linux. If you want to start a Ruby to Bash comparison though, by all means, be my guest.

Code:
Dir["."].select{|x|x.mtime>1.year.ago}.each{|f|File.readlines(f).any?{|l| puts "#{f} #{f.mtime}" if l.include?("myString")}}

Frankly, whilst I've never used Ruby, that looks rather clunky. It's opening every file and reading every line, and whilst at a low level that is what grep would do, grep is a compiled binary. So are cmdlets. The biggest difference being that cmdlets are derived from a common base class so they all share some uniformity on which Powershell can build. But in either case, both grep and Select-String are compiled code operating, at an educated guess, a lot more quickly than a Ruby script. Your actual "oranges to oranges" comparison, to use your phrase, wouldn't be the above at all. It would be Ruby code that called out to the same cmdlets as Powershell. Or in the case of Linux, called the grep and find executables. In the case of the former you're not really able to do anything different than you could by just scripting it in Bash, just some easier syntax for things like comparisons, etc. In the case of Powershell, you're not really changing anything, it's just something else that handles objects between binaries, only with extra steps.

Also, again whilst I've never programmed in Ruby that looks wrong to me. I think that writes out multiple instances per file if it has the pattern occur on multiple lines.

There's probably some idiomatic way to do it more compactly, and probably a better language for compact representation. But this is closer to an oranges vs oranges comparison. "*.*" as a search query on Unix will not return files unless there is a period in them. Here, you'll completely avoid any of the ridiculously edge-case bloat that is an "object pipeline".
I don't know why you consider this either an edge case or to be bloat. For edge case, grep and find have to be two of the most used GNU tools there are. Most people in this thread probably use them all the time - I know I do. For "bloat", both grep and the Powershell equivalents will be more performant than the Ruby, and object pipelining is not "bloat" if as a consequence it runs faster than not having it. Filesystem and network accesses are the big blockers. By using file objects stored in memory you're not "bloating" things, you're reducing the number of blocking operations. In any case, that's not really the key takeaway here. The point is that it leads to simpler and more consistent syntax and greater modularity. Text mangling is inherently fragile and requires greater awareness on both sides of a pipeline as to what the other is doing. Modularity is a key part of modern programming paradigms. It's not "bloat" to have that. If it were, maybe we should all give up on high-level languages and program in C at most.


In practice, in bash, I fully decompose each of these steps: gen a filename list with find. Look at it. Gen a list of grepped files that match your query. Look at it. Gen a report with your formatting from those files. Look at it. Submit. You're not composing the steps into a pipeline unless you're trying to save a few seconds when you hit this problem again.
I don't know why you would do that. You're just storing intermediary steps in variables? Like creating an array of files? Or are you saying you're just eyeballing this as a human and copying and pasting matches you're interested in to the next command? Either way, I don't think you're really getting the use case. Or that this is really just an illustrative example of a plausible routine task because there's no point in writing an epic scripting example to illustrate something that can be illustrated more clearly on a single line. Again, I'm comparing the capabilities of Bash and Powershell. No sense in creating an arbitrarily complex scenario for the sake of it. Especially when one of the points is to show how complex Bash gets for even simple scenarios.

The notion of "exception handling" here is absurd. Your exception handling is you looking at the results to see if this looks sane
Okay, see that is what makes me think you're just eyeballing and copying and pasting things from one command to the next. Pipelines are a basic element of Bash so I don't know why you wouldn't pipe something from find to another command. You're not seriously going to create an array of file names and then unpack that for the sake of avoiding a | are you?

Now as to Exception handling, if I have to explain to you the value in Exceptions then we're going to have to go back down to remedial programming. Sysadmins build all sorts of scripts for doing common tasks, for logging and analysis, for anything really. Bro, do you even cron? And once you start writing scripts that have the potential for something to go wrong - which any script that does anything useful can do - then exception handling is an asset. And Powershell supports it.
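A minimal sketch of what that looks like (the path is a placeholder; the exception type is the one Get-Item throws for a missing path):

Code:
try {
    # -ErrorAction Stop turns the non-terminating error into a catchable exception
    $item = Get-Item 'C:\logs\missing.txt' -ErrorAction Stop
}
catch [System.Management.Automation.ItemNotFoundException] {
    Write-Warning "Not found: $($_.Exception.Message)"
}
catch {
    Write-Warning "Unexpected failure: $_"
}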

But if you're really all-in on the OOP model, the Ruby solution will manage exceptions for you.
You're not understanding one of the main points here which is that the entire Windows OS exposes itself as objects. It's an Object Oriented environment. And why would I use Ruby over Powershell? Not that this actually is relevant to a comparison between Powershell and Bash so far as I can see. Unless you want to start a Ruby to Bash comparison as well!

Of course breaking filenames with : in them in the process.
Happy for you to use your version of
grep -lr mystring mydir | xargs -I ^ bash -c '[[ ^ -nt /tmp/365 ]] && stat -c %n\ %z ^ '

As the basis for comparison if you prefer. Said that already. But unclear on what you think that changes in my comparisons. If you're trying to avoid pipelines then I think you're supporting my point about Bash pipelines more than refuting it. (EDIT: Though that doesn't do the same as my example.)
 
If you're searching for files often enough that the tiny performance differences are meaningful, since they're all I/O bound, it's time for an indexer or a database not a hacked together shell pipeline.

Also, I see I have failed the first rule of the Internet:
[attached image: Forbidden sign]
 
This only works if management are willing to actually enforce filtering and you are actually willing to endure another year of sixty hour weeks because they can't find anybody who can pass the filtering. What I actually see in IT is not "this is a complex job so only smart people get hired" but rather "people get hired and the job is too complex for them". You know I'm right.
My job would be easier, as I suspect would many others', if they filtered more, not less
 
If you're searching for files often enough that the tiny performance differences are meaningful, since they're all I/O bound, it's time for an indexer or a database not a hacked together shell pipeline.
First off, it's an illustration of principle, as stated many times. Secondly, there are many scenarios where you search through large volumes of files, but it hardly matters because I was just showing, amongst several points, a consequence of being forced to pass information textually rather than being able to natively pass objects. The latter is inherently more capable (text is a subset of object) and it fits well with the target OS.

Also, I see I have failed the first rule of the Internet:
View attachment 5823879
In no way was I trolling. I naively thought that people in this thread might be interested in discussing programming paradigms and their relationship to the tools of an OS, and the different nature of the OSs that feed into it. It's a fun topic to me. It also came about because someone off-handedly remarked about how they couldn't do the same things on Windows as they could on Linux, so I spun up a quick and simple scenario for something fairly common on Linux - searching recursively through files - as a basis to explore that. I also asked them for their own examples.

Instead, I just got a bunch of "LOL - wall of text" type responses, people writing 'equivalents' to my examples to show mine were presumably strawmen yet that didn't do the same thing (yours included). And hostility. Juvenile digs about "lol, you work at a company that uses Windows" and a number of incorrect accusations about my background. This has done nothing but to reinforce every stereotype about Linux users. Happily I know it's not always the case because I am one myself.

Pardon me for trying to have a discussion I found interesting. Anybody not interested in discussing it was welcome to just ignore it. But I'm done. For what it's worth, it's possible to discuss Linux in the Windows thread with a great deal less hostility. I'll leave you all to your thread.
 
Big part of why I learned Python was to get away from having to learn how to pipe things through grep and awk. I acknowledge powershell is strong, but its syntax scares me, and Python is pretty strong too, not super slow to write scripts in, and I'm already good at it, so I see no reason to learn something else. Especially considering powershell is unlikely to ever make it to macOS, which is what I prefer to work in anyway.
 