Is Windows 11 worth it?

I tried it. Was instantly pissed off by how 'app-ified' everything had become - rounded corners, weird docking thing at the bottom, changing the UX to better suit those on a hand-held device, etc. Plus the start menu didn't let you create little folders for your shit, which made me ragequit back into my dual-boot of Windows 10 and Debian.

Also found something funny: when selecting the language to continue the setup in, US English (or as I like to call it, simplified English) had the prompt 'Continue in selected language?', while UK English had 'Continue in the selected language?' as if the difference had any impact at all.
 
Chromebooks have made linux laptops a reality. I wouldn't bother with anything else unless I knew it was compatible.

Linux has a learning curve, for sure. But once you learn it, you know it, and it's so much easier nowadays because there are literal copy-and-paste walkthroughs instead of some grognard telling you to RTFM. The control you get over the OS is second to none. I tried getting into Ubuntu like 6 times but it blows. It wasn't until I tried Debian and Gallium on the right hardware that I was convinced it was a feasible OS.

Compared to 10 years ago the driver issues are few and far between.
 
I've had incredibly little trouble with hardware in general on Linux in the last ten-ish years. There's the occasional weird hardware gizmo like a touchscreen or fingerprint reader that either needs acrobatics (step one: extract firmware blobs out of the Windows drivers, etc.) or will flat out not work (correctly) because the manufacturer is more or less openly hostile to the idea of open source. (hi nvidia!)

In practice that means: if you buy a device with more "exotic" hardware (usually mobile devices like tablets and notebooks; standard x86 desktop hardware is pretty much always a given these days), do your research first to check that everything you want supported actually is supported. If you can't find a mention, err on the side of "not supported". Sadly, support for what's commonly called "entry level" (read: cheap) hardware usually moves at a glacial pace, if it happens at all. Most developers prefer the more high-end hardware, and that's also usually the hardware that actually adheres to common standards properly and doesn't have funky firmware that breaks things, so it's kinda understandable. Windows is usually much better about including kludges for broken firmware, mostly because the developers there usually have a direct line to the manufacturer, or the manufacturer dropped money on developing Windows drivers.
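As a concrete version of that "do your research first" step, here's a minimal shell sketch that pulls the vendor:device ID pairs out of `lspci -nn` output; searching those IDs is the most reliable way to check for a driver before you buy. (The helper name and the example device ID in the comment are mine, not anything standard.)

```shell
# Extract vendor:device ID pairs (e.g. 8086:3e9b) from `lspci -nn` output.
# Class codes like [0300] have no colon, so the pattern skips them.
pci_ids() {
    grep -oE '\[[0-9a-f]{4}:[0-9a-f]{4}\]' | tr -d '[]'
}

# Usage: lspci -nn | pci_ids
# Then search each ID together with "linux" to find driver status reports.
```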

Avoid nvidia. AMD GPU support is perfectly fine these days, and the open source drivers are actually the better ones, worked on by AMD itself. (This recommendation used to flip every few years, with nvidia's drivers sometimes being the better ones, but with the in-kernel AMDGPU drivers and nvidia's clear unwillingness to support a proper open source effort, those days probably won't come back.)
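If you want to check which camp your box is actually in, here's a small sketch (helper name is mine) that reports which kernel driver each GPU is bound to, so you can confirm you're on amdgpu or the nvidia blob rather than silently falling back to nouveau or plain modesetting:

```shell
# Show the kernel driver actually bound to each GPU in `lspci -k` output.
gpu_driver() {
    grep -A3 -iE 'vga|3d controller' | sed -n 's/.*Kernel driver in use: //p'
}

# Usage: lspci -k | gpu_driver
```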

AMD CPU support for every little feature lands kinda slowly, so make sure that spanking brand new CPU that just came out is actually supported properly already. Intel is almost always very well supported, and Intel goes to great lengths to add support for even the more obscure features of its hardware to the Linux kernel. It's a pity they're such a shady company otherwise. ARM, although often used in the Linux-based Android world, is really hit-and-miss; even the better supported SoCs usually aren't all that well supported, so never expect the full list of fancy features that SBC was advertised with to actually work. That's because the companies around ARM SoCs only use the Linux kernel as a free framework for their SoC, don't care about contributing back, and usually hack up old kernels in ways that are really incompatible with mainline development.

Another problem is that a lot of distros stick with old kernel versions, often bordering on the rancid. While it's true that some old kernel versions are still officially supported, they usually only get important fixes (sometimes "eventually") and no new features. Distros using old kernels has historical reasons: running the latest stable kernel used to be a kind of Russian roulette for most of Linux's lifetime. IMO those times are mostly over, and you can usually trust the newest stable kernel to be good and bring worthwhile improvements. (It's still not a bad idea to wait 2-3 minor versions before upgrading, in case something major has broken and somehow slipped past the developers.) Most of the heavy problems you can experience in this area with some distros originate in the distro maintainers patching way-too-old kernels with backported stuff, and those maintainers are usually not forthcoming about admitting it.
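A quick way to sanity-check whether your distro's kernel is old enough to matter is to compare `uname -r` against the version a driver or feature you need first appeared in. A minimal sketch (the function name and the 5.15 threshold are just examples):

```shell
# Return success if version $1 is at least version $2, using
# GNU sort's version ordering so suffixes like -91-generic work.
kernel_at_least() {  # usage: kernel_at_least <current> <minimum>
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Example: kernel_at_least "$(uname -r)" 5.15 && echo "new enough"
```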

Even as somebody who has used Linux exclusively for over a decade now, I can tell you that hardware on Linux is, on average, never quite as performant or energy efficient as on Windows, and that's a definite downside. Usually that's more than offset by all the bloat Windows comes with, though, which burns up those extra CPU cycles (and then some) it inherently gets out of the same equipment.
For anyone considering switching, don't let already having Nvidia scare you away from Linux. I won't argue credentials, or right vs wrong in this regard, but I had already ordered my GPU before making the full switch and despite not even registering properly in fetch for the first couple weeks, my 3080 worked without a hitch and has never given me an issue.

AMD is far more FOSS friendly, and I get why it'd be preferred and why it's on the rise, but you're not going to wind up with a nonfunctional brick if you're already on Nvidia. Arch based systems make it insanely easy to maintain the drivers; they're just shitty for VR and proprietary. When I go to upgrade in a billion years, I'll definitely be doing research, though, and might switch to AMD at that point, if we're still in this same boat.
 
I've been using Nvidia on Linux for years because the drivers are just superior. I know that for the kernel hackers the Nvidia blob is a complete pain in the ass to deal with, but as an end user you get the best performance. I just found out yesterday that Elden Ring runs on Linux, so I installed it on my System76 Oryx Pro (Arch Linux) and it runs better there than it does on my Asus G14 Zephyrus running Windows 11. The hardware on the Oryx Pro is superior, but I had assumed there would have been more of a performance hit because of Steam+Linux.

I specifically only buy Nvidia because of their Linux support.

As for the OP, Windows 11 is what it is. I have a single machine running it because some applications I use are Mac/Windows only, and I'm not a macfag. I find some aspects of 11 better than 10, and some more annoying. But fundamentally, it's just an updated 10 with UI improvements and extra clicks added to get to the better actions on the right click menu of applications. If you do not specifically need to run applications that are Windows/Mac native (and run like dogshit on Wine), then by all means make the switch to Linux.
 
I've not been able to compare, but I've literally had no issues with Nvidia. I thought maybe it was shit I just didn't know about, but when I dug deeper into why people claim AMD is better, I mostly just saw people repeating that AMD is superior because it has open source drivers, without any explanation of how that makes it better. There are a lot of references to both VR and screen tearing with Nvidia, but I've not experienced either, so I'm good with it.

And yeah, Elden Ring was running day 1 on Linux AND had less stuttering than Windows, which was hilarious. Valve made it basically a flagship for their Steam Deck and it shows with how smooth it was.
 
I started using Linux a couple of decades ago, and my frustration with ATi drivers pushed me to Nvidia. Nvidia just works beautifully. You are exactly right about why many FOSS people like AMD: the open source AMD drivers are definitely better than the Nvidia open source driver (nouveau). But why even use the nouveau driver? Some like the seamless integration with the framebuffer and the use of Wayland, but for me I want the performance. So X.org and Nvidia's closed source Linux drivers are the best.

I'm not a FOSS purist, I don't play the recorder and eat foot skin on stage. I am an absolute advocate for Linux as I think it is the superior option for many people, but that's from a tech and cost perspective, not an ideological one.
 
Wayland was the other one I forgot to mention, but I don't really feel the need to change what's working. I didn't even bother asking people what exactly I'm supposed to be missing because in my time learning, everyone's just repeated the same couple of things that don't seem to have any effect on me.

Good to see I'm not just retarded and don't get it.
 
Protip; If you want Linux to work well with your hardware, buy your hardware from companies that actually support Linux.
But that's just really stupid. If Linux wants to be used by everyone and be mainstream, why should people have to buy whole new computers just to run an OS? I'm not gonna upgrade to an Intel motherboard because some dude on a thread said Intel works best on Linux, just to do it and still be disappointed.
 
If Linux wants to be used by everyone and be mainstream, then it needs to come preinstalled by the OEM.

Normies don't install Windows on their computers either, they're just using what comes with it.
 
This. The vast majority of PC users don't want to go through the process of installing an OS and setting it up themselves. They want to buy a PC, turn it on and just start using it. If they sold Linux PCs at Walmart or something, people probably would buy them if they were cheaper Windows alternatives. Would tons of people buy them? Probably not, but that's the only way I can think of to get a normie to buy something that doesn't have Windows or MacOS.

Working in computer repair has taught me that even the most basic PC commands are foreign to an alarmingly large number of people so it's not a surprise that doing something as simple as booting to a USB drive is like asking them to perform open heart surgery.
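For reference, the "booting to a USB drive" step itself is usually just a byte-for-byte copy of the installer image onto the stick. A minimal sketch (helper name and image filename are made up; the device node is deliberately left as the placeholder /dev/sdX):

```shell
# Copy an installer image onto a USB stick byte-for-byte.
# DESTRUCTIVE: verify the device node with `lsblk` first, because
# dd overwrites whatever you point it at, including your main disk.
write_usb() {  # usage: write_usb <image.iso> </dev/sdX>
    dd if="$1" of="$2" bs=4M conv=fsync status=progress
}

# Real use needs root, e.g.: write_usb debian-installer.iso /dev/sdX
```

After that it's the usual BIOS/firmware boot-menu dance, which is exactly the part normal users never see.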
 
W11 has the usual teething issues. There's no compelling reason to upgrade, from my understanding. W10 had a lot of stuff under the hood improved from 7 that would breathe some new life into older machines, especially with a clean install. With W11 being limited to TPM machines, it's a pain to upgrade an older system anyway, and it probably isn't even possible. On newer systems there are weird UI and driver issues, as expected.


Given the million different key combinations, button mashes and other general hassles that even a skilled operator finds frustration in -- I don't fucking blame them one bit. If you've spent even 2 minutes trying to unlock a fast booting ultrabook into the bios and then another minute finding the boot setting -- you've already spent three minutes too long. I totally get the "just fucking work" attitude consumers have with computers.

@byuu I believe Walmart actually tried that with OEM installs of Linux on cheapish laptops back around 2008. It was an utter fail IIRC.
 
I think Windows should be helping body positive women spend less time on the computer, don't you?

 
Sounds like Xandros on Asus machines.
Xandros was "Linux" but with a terrible UI, the type of thing that seems like it would serve only to turn new users off Linux.

Bought one in 2008. Used Xandros for about 1 week before installing DamnSmallLinux on it. It lasted until 2017.
 
Apparently Elden Ring was compiling shaders while people were playing, causing the game to lag. Valve changed things so that all the shaders are compiled before you start playing, which gets rid of the lag.
 
Windows 10 shipped in a broken state, but it isn't broken now. Its end of life is still years away, so why would you even consider a switch to Windows 11? By the time it does arrive, ricers will already have figured out how to make 11 look like the previous versions, or at least polish it enough to be usable. Since you haven't mentioned any technical reason to switch, e.g. a latest GPU that works best on 11 because of drivers, I'd say no. If it ain't broken, don't fix it. Trying to turn a different OS into Windows won't lead to any good results either.
 
windows 11 is broken garbage that m$ decided to shit out just because 10X didn't do well. here is a list of problems i've run into since upgrading to 11:
  • taskbar will not show up on the second monitor
  • no drag n drop on the taskbar without third party software
  • right clicking on the windows icon will crash explorer
  • some windows will lag when you drag because of the ribbon ui
  • when you drag windows across monitors explorer might crash
 