The Linux Thread - The Autist's OS of Choice

Even he admitted it a while ago. I still can't understand Linux users' fervor when it comes to Linux. OK, it's a great kernel, and thanks to Google, Android is the most common Linux distro so far, despite all of its flaws, including being somewhat closed source. But on the desktop it still isn't good, and all this forking isn't helping. "Install Pop!_OS or Mint, no problem": Mint died when I added an external drive for some reason, and Pop!_OS cannot fathom installing another DE without blowing up and leaving weird artifacts. "Try ArcoLinux, try this Arch-based distro": it worked for a week, then a few updates later right-clicking slowed the system to a crawl and loads of windows popped up out of nowhere.
"It's good for retro gaming": I tried starting up some old games, and while on Windows I at least had DXWnd, on Linux I had nothing, and half the time the games outright refused to start; even with the proper fix it was still a coin flip. Alt+F4 didn't work nearly all the time, the Alt codes were nonexistent, and the locale kept changing at random because I had the system language set to English but used another currency and measurement system...
Cinnamon may be good, but half the settings are hidden somewhere under /etc as root, and the same applies to KDE and to GNOME, whose Nautilus is a joke. Why can't I manually type the directory I want to go to without hunting for some extension that no longer works because it only worked back in GNOME 3? Or some tutorial that's no longer valid because the devs changed everything?
It's a great system, but it's not ready at all for general use, or for people who don't want to dive into fixing everything every few seconds. I want my PC to work and to look the way I want it to without growing a neckbeard, asking the gatekeepers at Stack Overflow, or reading locked forum threads full of wiki links that explain functionality but help next to nothing. And all those videos that keep popping up in my home feed saying "you shouldn't use Linux if you game", videos that all predate the Linux challenge: I still don't understand why you hate them. They show what happens to most users who aren't tech-savvy enough, lol. He even explained it in the first episode, yet you keep getting angry at someone who doesn't know something that YOU know.
For users/consumers to do something in their long-term best interest requires a certain level of IQ, the ability to abstract patterns, and a willingness to learn. These are not skills/qualities that the majority possesses, nor would it benefit those who build 'structures' taking advantage of the people who lack them.
User-centricity should not exclude proper beta testing and an actual implementation of ease of use. Not everything is always the consumer's fault, and not everyone wants to tinker.
Looks pretty nice.
lol, another OSX UI

year of linux desktop
Finally, after all this time. This year+1 is the year of Linux.
 
bla bla bla bla bla

Install Gentoo!
 
Void would be absolute perfection but they refuse to package Brave. Fucking Firecucks I swear.

Artix though has been a damn fine replacement and I am happy to see more people recommending it. runit is fast as fuck.

Even LibreWolf just got a proper apt package, what the fuck. But as much as I endorse that browser, it's still part of the duopoly and doesn't solve the bloat circumnavigation issues.
 
I want an easy OS but it had better be configured exactly the way I want!
You appreciate the irony of this statement, don't you? It sucks that hard things aren't easy, sure, but hold every OS to the same standard. Look at your DE complaints, for example. Want to change the DE in Windows? Break into your own computer and risk chronic system instability. Change the DE on a Mac? I don't think you even can. Change the DE in Linux? Just change it, bro. That's it. Sometimes it's easy, and for you sometimes it's hard. In Linux the hard things can be hard, but at least the hard things are possible.
 

Looks like a pretty comprehensive test of browsers at their default settings.

tl;dr: in 2022, if you want the most privacy out of the box, Brave, LibreWolf and sometimes Tor Browser are your only real options. I was disappointed in how Ungoogled performed, though I guess I shouldn't be surprised.
 
I have to admit I rarely care about the GUI in any Lunix situation because if I'm even using it, instead of Windows or MacOS, it's because I want to be doing command line things and not having to screw around with user interfaces. I just want to type things in and have them done immediately without any backtalk.

sudo is cruise control for do what the fuck I told you and shut the fuck up and no do not throw up any windows asking me if I really wanted to do the insane shit I just told you to do.
 
Linux on the desktop is a meme. Linux as a voodoo quasi-terminal OS on which you can be more productive than with any desktop-paradigm-based computer for certain types of work? Sure. But as a desktop? It's crap. Always has been, always will be. Because it is neither BeOS, Mac OS 9, nor AmigaOS.

Great Lisp environment though, if you have it auto-boot to emacs.
 
post avant garde baitposting
 
Is there something weird going on with Wine as of late?
Kinda curious about this as well. Lately I've been having an annoying issue with it.

Mono runs flawlessly, but Wine itself seems to not want to open or run anything (including winecfg) at random times whenever I boot up the computer. At first I thought this was a DirectX issue, since apparently Wine can't play nice with it except for the "d3dx9_??" DLLs, but even after deleting my .wine folder and having it work fine for a day or two, it still decided not to open/load. Thankfully the stuff I'm running doesn't actually need DirectX, but it's annoying that I have to keep deleting my .wine folder for it to function properly.

Never had this issue before with any previous distro or wine version.
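
If it helps, a disposable prefix saves you from nuking ~/.wine every time. A rough sketch below; WINEPREFIX is Wine's standard prefix variable, while the path and the example commands in the comments are arbitrary:

```shell
#!/bin/sh
# Sketch: use a throwaway prefix instead of deleting ~/.wine each time.
# WINEPREFIX is a standard Wine environment variable; the path is arbitrary.
PREFIX="$HOME/.wine-test"
mkdir -p "$PREFIX"
export WINEPREFIX="$PREFIX"
echo "prefix ready at $PREFIX"
# Then, in the same shell:
#   wine winecfg      # first run populates the fresh prefix
#   wine whatever.exe
```

If the fresh prefix works while ~/.wine doesn't, it's the old prefix that's broken, not the Wine install itself, which narrows down the bug hunt considerably.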
 
In Ubuntu, can I have multiple GPU drivers installed and easily switch between the 'active' driver? I recently had an issue where the driver I was using seemed to switch and my game no longer worked. A real PITA: instead of a nice relaxing evening playing a game with my missus, I spent it trying to fix a computer. I ended up breaking the install and just reinstalling a fresh system. Anyway, I'm just wondering if this exists, or whether drivers are installed = active and that's it.

Off-topic but I was enjoying the LTT hate a few weeks ago so I thought I would share an observation here: his wife didn’t take his last name. Sad!
I honestly don't really know how people can stand Windows.
My work laptop randomly changes the power settings to sleep after 1 minute of inactivity. The only way to stop it is to edit group policy to forbid any changes to the power plan. It seems to be a common issue online.
 
I honestly don't really know how people can stand Windows.
Either:
  1. All the person knows is Windows, and they can't/won't learn anything else.
  2. They use software that is only supported on Windows.
  3. They use it for gaming only.
Fun fact: I never had a Windows 10 update break my PC, from when I started using it in 2016 until I switched to Linux a while ago. I had some fail to install, but they never broke my system. Should I consider myself lucky that never happened?
 
The only Linux distro I ever seriously used privately was Gentoo. (no joke) In the '00s the compile times were annoying, yes, but you knew what was on your system, which was a luxury I hadn't had since the AmigaOS/DOS days. (XP at the time already felt super black-boxed and invasive to me, although it was harmless compared to Win10) I didn't have the best experiences with "user-friendly" distributions like Ubuntu myself. From what I saw, they sometimes get very basic things wrong, like no autoconfiguration that respects the screen's DPI, making fonts look like shit, or no measures against screen tearing in X, which just looks terrible and gives a bad first impression but is actually easy to fix on 95% of the hardware; in almost all cases it just involves setting a single X server flag. I'm not sure what the distro maintainers are smoking, or whether they're not aware that it's small things like this that are super important for a first impression, but yes, my first impressions of such distros were never good. Years of Gentoo, and the knowledge of the actual software that makes up a modern distro that came with it, have taught me where to look and how to fix such basic things; people who install Ubuntu for the first time probably don't even know that fixing them is an option and just think "wow, Linux is shit, can't even render fonts correctly". In fact, font rendering has never been this good - if it's all set up correctly. (!) (This sometimes leads me to guess that the distro maintainers have no idea about the software they're maintaining, which probably often isn't a wrong assumption - there's a lot of circlejerking, politics and -in recent years- general tranny nonsense going on in distros)
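
For what it's worth, the anti-tearing flag alluded to above usually amounts to a single driver option. A sketch for the xf86-video-amdgpu and xf86-video-intel DDX drivers follows; the option name is per those drivers' man pages, while the file name and Identifier are arbitrary, and this won't apply to every GPU/driver combination:

```text
# /etc/X11/xorg.conf.d/20-tearfree.conf  (sketch)
Section "Device"
    Identifier "gpu0"
    Driver     "amdgpu"          # or "intel"
    Option     "TearFree" "true"
EndSection
```

After a restart of the X server, the driver composites full frames instead of scanning out mid-update, at the cost of a little extra video memory.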

Nowadays compile times are harmless, even on e.g. underpowered $100 Gemini netbooks. Hell, I run Gentoo on an Allwinner A20 (an ancient ARM chip) with 2 GB. Yes, it doesn't compile Firefox or LibreOffice, but theoretically it could; it'd just take longer. I've seen worse in the '00s. Regarding constantly having to tweak my configuration: I don't think I've seriously touched e.g. my ALSA config file in 5+ years. I never had to. When I get a new computer I usually just copy all the files to the new hard drive (if there is one), tweak a few things (like the kernel config, because new hardware), and that's about it. I don't think Windows would be easier; or rather, the additional ease wouldn't be worth giving up all that control and privacy.

I guess you either learn all that stuff (which, honestly, isn't even that bad if you're willing to invest the time), or you get a more automated experience where you're at the whims of scripts and of whatever the people who put it together thought was best for you, and you learn to live with the problems and imperfections that causes. Guess you can't have it both ways.
 
I thought about doing this myself before, but your Gemini netbook example leaves out a big factor: temperature management. Laptops and netbooks are absolutely terrible at this; compiling anything beefy means thermal throttling and fans going crazy. There's also hardware degradation from running at high temperatures for extended periods of time, i.e. emerging world. It seems the best use case is having multiple machines: a PC for cross-compiling the hard stuff and the things that don't benefit from -march=native, and everything else getting binary packages from it. Having a single machine, I can't justify the switch. Right now, whenever I need to compile something, I bring the temperatures down manually by limiting cores and abusing the nice and cpulimit utilities, which is straight-up masochism sometimes.
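
For the single-machine case, that core-limiting dance can at least be scripted. A sketch: nice(1), ionice(1) and cpulimit(1) are the real tools, but the job count, the percentage and the target process name are guesses for illustration:

```shell
#!/bin/sh
# Sketch: throttling a compile to keep temperatures sane.
# Start the build with lowest CPU and IO priority and fewer parallel jobs:
#   nice -n 19 ionice -c 3 make -j2
# Or clamp an already-running compiler process to ~50% of one core:
#   cpulimit -l 50 -p "$(pgrep -n cc1)"
# Demo: show that the niceness really applies to the child process.
nice -n 19 sh -c 'echo "child niceness: $(ps -o ni= -p $$ | tr -d " ")"'
```

The trade-off is build time: a niced, clamped compile takes far longer, but it stays out of the way of interactive use and keeps the fans quiet.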
 
I used to run an RX 580 as the primary display card with a secondary Nvidia card for CUDA stuff, and it worked under Kubuntu. I'm not sure if it will work under other Ubuntu-related distros, though.
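
As for switching, Ubuntu does ship tooling for inspecting and changing the active driver. A sketch below; ubuntu-drivers and prime-select are real Ubuntu utilities, guarded here with command -v because they only exist on Ubuntu-family systems, and the commented sudo lines are the parts that actually change anything:

```shell
#!/bin/sh
# Sketch: inspect and switch GPU drivers on Ubuntu-family distros.
if command -v ubuntu-drivers >/dev/null 2>&1; then
    ubuntu-drivers devices        # list hardware and candidate drivers
    # sudo ubuntu-drivers autoinstall   # install the recommended one
fi
if command -v prime-select >/dev/null 2>&1; then
    prime-select query            # which GPU is active (hybrid setups)
    # sudo prime-select nvidia          # or: intel, on-demand
fi
echo "driver check done"
```

So drivers aren't strictly installed = active; on hybrid Intel/Nvidia boxes prime-select flips between them, though a reboot or at least a session restart is usually needed for the switch to take effect.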
 
I used to have Gentoo running on one of the early Atom netbooks, an N270 I believe. emerge -avqDu world could easily take a day, and yes, the fan spun up, but I had the time, as I didn't use the system exclusively. I alternately used distcc (which would often just flat out not work) and pushed binary packages to it from my main PC, but the results were often mixed, so often I just let the machine compile all by itself. It's not like you need to (or should, really) update every day. I don't remember the exact temperatures, but they were probably fine, or else I wouldn't have done it. I stopped using that netbook long before any thermal wear became a concern, and if I still had any use for it today, I could probably just dig it out, clean the fan and use it for another ten years. Newer 14nm-process stuff like Gemini, newer Atoms etc. usually comes in small, fanless packages. There's really not enough energy involved to even reach temperature ranges dangerous for the hardware. The aforementioned tablet I got, for example, is a cheap piece of plastic. Now, I didn't crack it open, but my guess is they just stuck a piece of sheet metal on the SoC, an evergreen of the penny-shaving engineers' club. It doesn't really go beyond 60C, even under prolonged full load. Maybe if it's 40C in the room and the device is sitting in the sun, I don't know. What high temperatures do affect negatively is the length/success of boosting. In a bigger device like a Celeron netbook from a somewhat better company, one that actually bothered to put a real heatsink on the chip or to use the case as a heatsink, I'd go as far as to say you'll never run into thermal throttling during a long compile. Hell, you might even be able to sustain a permanent boost by undervolting.
A lot of these low-end devices have perfectly capable SoCs these days that would be a breeze to use for most things; it's mostly that they get crippled by cost-saving measures like low/slow RAM (with onboard GPUs, RAM speed has a huge impact), single-channel configurations, etc. There's nothing wrong with the chips, and on paper they're fine for anything non-gaming; they usually just get put into a very restrictive working environment that gimps their usefulness, or get stuck with screens of a higher resolution than the GPU was made for, etc. Now, a bigger chip that requires lots of cooling in a too-small package, I don't know. You often see configurations in these mobile devices that are not very well thought out because they look better on paper.

I used to push binary packages to the A20 from my Ryzen until I thought about it for a minute and realized what a waste of energy that was. Now I just let it compile, for hours if it needs to. With PORTAGE_NICENESS and PORTAGE_IONICE_COMMAND there's no noticeable impact on its other operations while compiling, even with the Linux kernel's default fair CPU scheduler. (I used a MuQSS-patched kernel on it for a while, but the code got kind of stale, and last I heard it's not maintained anymore. For low-core-count chips like Celerons etc., a scheduler tweaked for 'unfairness'/responsiveness can make a huge difference in perceived speed in e.g. GUI environments. It matters less the more cores you have; with 4+ cores, latency becomes a complicated topic.) The A20 has an actual, real SSD and 2 GB of RAM, and with the packages it has installed it rarely compiles for hours. The trick, though, is to put /var/tmp/portage into a tmpfs and also put swap on zram; not only does this speed things up considerably, it also reduces potential wear on the drive. (more of a concern for SD cards; you won't destroy a semi-recent SSD this way) The files created during such a compile are very compressible, so even 2 GB gets you far that way. For big packages that won't fit, you can still define an on-disk Portage temp dir via package.env.
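
The knobs mentioned above, as a sketch of the relevant config fragments: the variable names, file paths and package.env mechanism are standard Portage/fstab conventions, while the sizes, the package atoms and the notmpfs.conf file name are made-up examples for a small 2 GB board:

```text
# /etc/portage/make.conf
PORTAGE_NICENESS="19"
PORTAGE_IONICE_COMMAND="ionice -c 3 -p \${PID}"

# /etc/fstab: build in RAM to spare the SD card/SSD
tmpfs  /var/tmp/portage  tmpfs  size=1536M,uid=portage,gid=portage,mode=775,noatime  0 0

# /etc/portage/env/notmpfs.conf: fall back to an on-disk build dir
PORTAGE_TMPDIR="/var/tmp/notmpfs"

# /etc/portage/package.env: oversized packages get the on-disk dir
www-client/firefox      notmpfs.conf
app-office/libreoffice  notmpfs.conf
```

With zram swap on top, builds that slightly overflow the tmpfs spill into compressed RAM first instead of hitting the drive at all.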

I honestly also have a soft spot for these low-performance devices and was always interested in low-power computing, even beyond all the energy cost and environmental concerns. Getting some latest-generation Ryzen to render a website while burning 100 W is nothing special. Doing pixel-art graphics and some reading on a 3 W ARM chip, while listening to internet radio and compiling GCC, is more interesting.
 