The Linux Thread - The Autist's OS of Choice

I've been having FPS drops while playing games when I get a notification. Does this happen to anyone else on KDE and Arch?

For best performance, I'd recommend running your games in a really bare-bones Xmonad setup; the actual problem is almost certainly due to interactions between the game, Wine, the window manager, and the compositor. If you don't *have* a compositor, and your window manager is as minimal as possible, that generally clears up issues like this, though it might be inconvenient. Starting out with no config file and launching with xinitrc/startx should work, but you might want to tweak it.

It's pretty hard-core stuff though, so you could skip it and try to work with what you have. One question, though: does the FPS pick back up after the notification goes away, or does it stay choppy?
 
If vidya is what you want, your best bet is honestly either to dual-boot or to set up a second desktop running headless and use something like Steam in-home streaming to play anything that doesn't run natively.

There's also a third way: virtualization with GPU passthrough. If you plan a system for it and specifically buy a CPU with an iGPU in addition to your dedicated GPU, this works really well and you get almost bare-metal performance. The trick is that you pass your dedicated GPU through to the Windows VM, which gets exclusive access to the hardware. You can also pass through your onboard audio or a USB hub. There's some gotcha with Nvidia cards IIRC (which can be worked around somehow) but it works with AMD cards. You can also pull it off with only one GPU by running the Linux host headless and streaming the VM to another system; I had that setup for a while. (My goal was to isolate the noisy, heat-producing gaming computer with all the fans in a room far away from me, but still have a powerful Linux server.)

All you need is a properly configured Linux kernel, a second machine, and QEMU. With such a VM setup you can do nice things like isolate the VM from the internet and from most of your data, which is nice if you are concerned about phoning-home software or malware embedded in pirated games or something. If something's then wrong with your VM, just nuke it and restore. If you go down that route, though, make super sure you have a proper mainboard with non-broken firmware. There are boards that ought to be technically able to do it but can't because the manufacturer fucked up and has no intent of fixing it, since it's such a niche usage scenario. Also yes, you can theoretically add additional GPUs and build a game server that runs OSes and games for several people at once, though that would need to be one beefy computer.
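For the curious, the host-side setup sketched above boils down to a couple of config fragments plus a QEMU invocation. This is a minimal sketch, assuming an Intel CPU and a dedicated GPU at PCI address 01:00 with the example vendor:device IDs 10de:1b81 and 10de:10f0 — yours will differ, so check with `lspci -nn` first:

```shell
# /etc/default/grub -- enable the IOMMU on the kernel command line
# (use amd_iommu=on instead on AMD systems)
GRUB_CMDLINE_LINUX="intel_iommu=on iommu=pt"

# /etc/modprobe.d/vfio.conf -- bind the passthrough GPU to vfio-pci
# early, before the host driver can claim it (IDs below are examples)
options vfio-pci ids=10de:1b81,10de:10f0

# Then launch the VM with the GPU and its HDMI-audio function attached:
qemu-system-x86_64 \
  -enable-kvm -machine q35 -cpu host -m 8G \
  -device vfio-pci,host=01:00.0 \
  -device vfio-pci,host=01:00.1 \
  -drive file=windows.qcow2,format=qcow2
```

In practice you'd add more (OVMF firmware, CPU pinning, input devices), but this is the skeleton the whole trick hangs on.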

Theoretically you should also be able to reinitialize the card for use by the Linux host after the VM shuts down (and therefore pass the card back and forth between VM and host), but in practice, when I tried this a few years ago, all it did was crash the machine. I don't know if they've fixed it, or if it was a quirk of my hardware or something. GPUs are complicated devices.

I don't bother with all that anymore because WINE has gotten really, really good, and everything I want to play runs just fine directly on Linux. Problems are usually limited to a missing .dll dependency, which can be fixed by using the winetricks script to install it. There's the rare case of one piece of software simply not running, or not running correctly, no matter what you do, but at least for me (YMMV, obviously) this has become exceedingly rare in the last few years. Lots of games also have native Linux versions nowadays. Isolation via network namespaces and separate users is probably always a good idea when running proprietary software, and you don't need a VM for that. There's also VirtualGL and plain X11 forwarding if you want to run the software natively on a (beefier) machine on the network and have the output rendered on your local workstation.
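The winetricks fix mentioned above is usually a one-liner. The verb names below (vcrun2019, corefonts) are common examples, not specific to any particular game, and game.exe is a placeholder:

```shell
# Install a missing runtime into the default Wine prefix (~/.wine)
winetricks vcrun2019

# Or keep things isolated: give one game its own prefix
WINEPREFIX="$HOME/.wine-mygame" winetricks corefonts vcrun2019
WINEPREFIX="$HOME/.wine-mygame" wine game.exe
```

Per-game prefixes also make the nuke-and-restore approach from the VM discussion work on plain WINE: delete the prefix directory and you're back to a clean slate.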
 
There's also a third way: virtualization with GPU passthrough. If you plan a system for it and specifically buy a CPU with an iGPU in addition to your dedicated GPU, this works really well and you get almost bare-metal performance. The trick is that you pass your dedicated GPU through to the Windows VM, which gets exclusive access to the hardware. You can also pass through your onboard audio or a USB hub. There's some gotcha with Nvidia cards IIRC (which can be worked around somehow) but it works with AMD cards. You can also pull it off with only one GPU by running the Linux host headless and streaming the VM to another system; I had that setup for a while. (My goal was to isolate the noisy, heat-producing gaming computer with all the fans in a room far away from me, but still have a powerful Linux server.)

All you need is a properly configured Linux kernel, a second machine, and QEMU. With such a VM setup you can do nice things like isolate the VM from the internet and from most of your data, which is nice if you are concerned about phoning-home software or malware embedded in pirated games or something. If something's then wrong with your VM, just nuke it and restore. If you go down that route, though, make super sure you have a proper mainboard with non-broken firmware. There are boards that ought to be technically able to do it but can't because the manufacturer fucked up and has no intent of fixing it, since it's such a niche usage scenario. Also yes, you can theoretically add additional GPUs and build a game server that runs OSes and games for several people at once, though that would need to be one beefy computer.

Theoretically you should also be able to reinitialize the card for use by the Linux host after the VM shuts down (and therefore pass the card back and forth between VM and host), but in practice, when I tried this a few years ago, all it did was crash the machine. I don't know if they've fixed it, or if it was a quirk of my hardware or something. GPUs are complicated devices.

I don't bother with all that anymore because WINE has gotten really, really good, and everything I want to play runs just fine directly on Linux. Lots of games also have native Linux versions nowadays. Isolation via network namespaces and separate users is probably always a good idea when running proprietary software, and you don't need a VM for that. There's also VirtualGL and plain X11 forwarding if you want to run the software natively on a (beefier) machine on the network and have the output rendered on your local workstation.
All true, and imho a major headache, even more so because in my experience passthrough only seems to work with VMware. Honestly, I think it'd be easier to simply set up your network so that the Windows node is heavily firewalled by a router or layer 3 switch, and laugh as it panics trying to phone home. Windows is so insecure that you'd probably want to firewall it off anyway.
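On a Linux router that firewall idea is a handful of nftables rules. A sketch, assuming the Windows box sits at the made-up address 192.168.1.50 on a 192.168.1.0/24 LAN: let it talk to the LAN, drop everything it tries to forward out to the internet.

```shell
# Create a dedicated table and a chain hooked into forwarded traffic
nft add table inet gamewall
nft add chain inet gamewall fwd '{ type filter hook forward priority 0; }'

# The Windows node may talk to the local LAN...
nft add rule inet gamewall fwd ip saddr 192.168.1.50 ip daddr 192.168.1.0/24 accept

# ...but anything else it sends (i.e. toward the internet) gets dropped
nft add rule inet gamewall fwd ip saddr 192.168.1.50 drop
```

Using the forward hook means the rules only affect traffic routed through this box, not the router's own connections.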
 
For best performance, I'd recommend running your games in a really bare-bones Xmonad setup; the actual problem is almost certainly due to interactions between the game, Wine, the window manager, and the compositor. If you don't *have* a compositor, and your window manager is as minimal as possible, that generally clears up issues like this, though it might be inconvenient. Starting out with no config file and launching with xinitrc/startx should work, but you might want to tweak it.

It's pretty hard-core stuff though, so you could skip it and try to work with what you have. One question, though: does the FPS pick back up after the notification goes away, or does it stay choppy?
I know how to install and use window managers. I just think they suit laptops better than desktops, and yes, the FPS does pick up after the notification.
 
Pretty insightful take:

The two most intriguing developments in the recent evolution of the Microsoft Windows operating system are the Windows Subsystem for Linux (WSL) and the porting of their Microsoft Edge browser to Ubuntu.

For those of you not keeping up, WSL allows unmodified Linux binaries to run under Windows 10. No emulation, no shim layer, they just load and go.

Microsoft developers are now landing features in the Linux kernel to improve WSL. And that points in a fascinating technical direction. To understand why, we need to notice how Microsoft’s revenue stream has changed since the launch of its cloud service in 2010.

Ten years later, Azure makes Microsoft most of its money. The Windows monopoly has become a sideshow, with sales of conventional desktop PCs (the only market it dominates) declining. Accordingly, the return on investment of spending on Windows development is falling. As PC volume sales continue to fall off, it's inevitably going to stop being a profit center and turn into a drag on the business.

Looked at from the point of view of cold-blooded profit maximization, this means continuing Windows development is a thing Microsoft would prefer not to be doing. Instead, they’d do better putting more capital investment into Azure – which is widely rumored to be running more Linux instances than Windows these days.

Our third ingredient is Proton. Proton is the emulation layer that allows Windows games distributed on Steam to run over Linux. It’s not perfect yet, but it’s getting close. I myself use it to play World of Warships on the Great Beast.

The thing about games is that they are the most demanding possible stress test for a Windows emulation layer, much more so than business software. We may already be at the point where Proton-like technology is entirely good enough to run Windows business software over Linux. If not, we will be soon.

So, you’re a Microsoft corporate strategist. What’s the profit-maximizing path forward given all these factors?

It’s this: Microsoft Windows becomes a Proton-like emulation layer over a Linux kernel, with the layer getting thinner over time as more of the support lands in the mainline kernel sources. The economic motive is that Microsoft sheds an ever-larger fraction of its development costs as less and less has to be done in-house.

If you think this is fantasy, think again. The best evidence that it’s already the plan is that Microsoft has already ported Edge to run under Linux. There is only one way that makes any sense, and that is as a trial run for freeing the rest of the Windows utility suite from depending on any emulation layer.

So, the end state this all points at is: New Windows is mostly a Linux kernel, there’s an old-Windows emulation over it, but Edge and the rest of the Windows user-land utilities don’t use the emulation. The emulation layer is there for games and other legacy third-party software.

Economic pressure will be on Microsoft to deprecate the emulation layer. Partly because it’s entirely a cost center. Partly because they want to reduce the complexity cost of running Azure. Every increment of Windows/Linux convergence helps with that – reduces administration and the expected volume of support traffic.

Eventually, Microsoft announces the upcoming end-of-life of the Windows emulation. The OS itself, and its userland tools, will by then have been Linux underneath a carefully preserved old-Windows UI for some time. Third-party software providers stop shipping Windows binaries in favor of ELF binaries with a pure Linux API…

…and Linux finally wins the desktop wars, not by displacing Windows but by co-opting it. Perhaps this is always how it had to be.

:thinking:
 
I fucked up the home folder on my laptop. I'm cringing real hard right now. Nothing of any super importance was lost. The most important data, to me, was my Docs folder. Luckily I had a backup from the summer on my desktop, so that'll help a bit. Working on backing up the partition and hopefully recovering my files.

Has anyone had any luck recovering from an ext4 partition?
 
I fucked up the home folder on my laptop. I'm cringing real hard right now. Nothing of any super importance was lost. The most important data, to me, was my Docs folder. Luckily I had a backup from the summer on my desktop, so that'll help a bit. Working on backing up the partition and hopefully recovering my files.

Has anyone had any luck recovering from an ext4 partition?
I have recovered a very small number of files from a dying disk with an ext2 or ext3 partition, a long time ago. I don't remember what I used; I think testdisk. That was a long time ago, though, and it seems like the state of the art has moved on.

There is an Arch Linux wiki guide on file recovery which seems universally applicable here, and which recommends several ext4-relevant utilities.

The most important thing you can do is stop doing anything that will write to the drive. Obviously, you're fine if you're just making an image of the raw disk to a file on another drive with dd or something, but not if you're booting into the system and potentially overwriting data. Then, if you can boot from a USB recovery image that has tools like ext4magic or extundelete, and use that to recover files to another external drive, that could potentially save your ass.

It looks like ext4magic is in the Ubuntu repositories, so it should be as simple as booting from an Ubuntu live CD in 'don't touch my existing system' mode, using apt-get to install ext4magic, and then following the instructions to recover files to another USB key or hard drive.
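Spelled out, that sequence looks roughly like this. Assumptions: the damaged home partition is /dev/sda2 and the rescue drive is mounted at /mnt/rescue — both are placeholders, so substitute your own devices:

```shell
# From the live session: image the raw partition FIRST, before anything
# can write to it. The image must land on a *different* drive.
sudo dd if=/dev/sda2 of=/mnt/rescue/home.img bs=4M status=progress

# Install the recovery tool in the live session (RAM only, disk untouched)
sudo apt-get update && sudo apt-get install -y ext4magic

# Work on the image, not the disk: -r recovers recently deleted files,
# -d names the output directory on the rescue drive
sudo ext4magic /mnt/rescue/home.img -r -d /mnt/rescue/recovered
```

Working from the image means a botched recovery attempt costs nothing; you can always re-run against the same image.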
 
Fedora is best distro, fight me on this. Pro tip: you can't.
Imagine having to use an unofficial repo to get a media player. If it weren't for that, it might actually be a good distro.

Btrfs is good now, pass it on! ZFS is a bloated monolith. I use openSUSE Tumbleweed on my NAS with a LUKS-encrypted Btrfs RAID10-ish array.

The best WM is i3 with DWM-style keybinds. A redacted version of my config file is attached.

BTW, I run Arch on all of my recent AMD64/x86_64 PCs and on some of my Raspberry Pis. Void PPC is becoming my favorite distro for PPC Macs, when I'm not yelling at MacPorts on Mac OS X Tiger/Leopard. Gentoo is currently the best distro on SPARC, but I may take a look at T2 Linux in the future as well...
 

>log onto steam
>click settings
>tell it to use proton
>get wine from package manager

there, you can now run 90% of Windows games on Linux.

I always hear about how WMs are better and less bloated than DEs, but my Xfce uses 300 MB at idle, and when I open the task manager it's at around 0% CPU usage. Is the WMpill real or another reddit meme?
 
Xfce is a DE, and I wouldn't consider it lightweight anymore; it used to be. I won't make any recommendations, as there are tons of WMs, both stacking and tiling (or a combination of the two), so there's something for every taste, and how lightweight everything ends up being also depends on the software combinations you use and their dependencies. I also wouldn't let the fact that some WMs are only rarely updated discourage you. That's actually a good thing, not a bad thing. It usually means the programmers aren't falling for the "everything and the kitchen sink" model of development, which has ruined a lot of initially fine software, but are instead focused on preserving the software as-is and just fixing the bugs.
 
Try IceWM and PCManFM if you don't want a full DE but are scared of setting up your own environment. IceWM is a window manager that comes with its own taskbar, menu, notification area, and wallpaper solution. PCManFM is a full-featured GUI file manager that can also supply desktop icons. They are both well documented, come with sensible defaults, and are easy to configure.
 
Btrfs is good now, pass it on! ZFS is a bloated monolith. I use openSUSE Tumbleweed on my NAS with a LUKS-encrypted Btrfs RAID10-ish array.

The best WM is i3 with DWM-style keybinds. A redacted version of my config file is attached.
Hans? I thought your parole had been denied.
I always hear about how WMs are better and less bloated than DEs, but my Xfce uses 300 MB at idle, and when I open the task manager it's at around 0% CPU usage. Is the WMpill real or another reddit meme?
Ironically, awesome is now up to 1.7 GB of virtual memory and 1.1 GB resident on one of my machines, but it has been running for 90 days. It would usually be well, well under 100 MB, and it isn't even that lightweight. I'll have to keep an eye on whether there's a steady leak there once I start a new session.
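If anyone wants to watch for the same thing, resident memory is easy to poll with ps. The snippet below uses the current shell's PID as a stand-in target; swap in `pgrep awesome` (or your WM's name) for the real check:

```shell
# Report the resident set size (RSS) of a process in MB.
# Using $$ (this shell) as a demo target; any PID works.
pid=$$
rss_kb=$(ps -o rss= -p "$pid" | tr -d ' ')
echo "PID $pid RSS: $((rss_kb / 1024)) MB"
```

Run it from cron or a loop and log the numbers; a steady climb over days with no matching workload is the leak signature to look for.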

There is always a window manager in play anyway if you're using a desktop environment (and there always is one, with X, if you're doing anything beyond running a single fullscreen xterm, which would be pretty autistic). Unless your primary activity is moving folders around on your desktop, I can't see much point in running a full DE.
 
I believe the logic goes something like this: in order for software to remain free as in freedom (you are free to know what the program does, modify what it does, and distribute your new version), you have to be able to look at the code and know what it's doing. Therefore each piece of software and its code should be kept concise and simple and essentially serve one purpose, enabling any individual with the know-how to check out the code and determine whether the software is doing anything they don't want it to do. Linux distros are generally built this way. The kernel (Linux), the window manager, the web browser, the compilers (GNU), etc. are all separate pieces of software made by separate groups, and the code for each is available to anyone who wants it, unlike, say, Windows, where everything is part of Windows by default, everything is made by Microsoft, and you cannot look at the code.

Systemd has evolved to the point where it manages most of your distro's essential services, like network connections, and it has gotten so big and complicated that it's difficult for a single individual, or even a group of individuals, to look at the code and make sure it isn't doing anything untoward, such as, say, creating a backdoor for the NSA. This goes against the traditional Linux/open-source/GNU philosophy.

On the flipside, systemd does so much that the average end user has to do a lot less fucking around to make their computer work. I can remember using Ubuntu back in the mid-2000s, for example, and even though it was aimed at novices, there were often things that just didn't work, and I'd have to scour the internet looking for a way to get my wireless connection working. That's something I don't really have to do today, thanks to systemd managing things like network connections.

Today, most distros are totally dependent on systemd and many pieces of open source software rely on it, causing more controversy because it gives its developers a disproportionate amount of control over the open source community. What if say, the systemd developers decided that the developer of another piece of open source software which relies on systemd is problematic, and thus they purposely break compatibility? Who would stop them? Traditionally you'd just fork the software in that situation, but systemd is so big now, you'd need a whole team of people to do that.

Someone who knows more can correct or add, but I think I have the major points in there.
Just catching up on this thread.

I think people's problems with systemd go beyond the technical side and into the political side as well.

The guy that created it, Lennart Poettering, is kind of a douchebag who thinks his shit don't stink. He is employed by Red Hat.

Another important name involved with systemd is Kay Sievers, another guy who seems to think his code is perfect and blames bugs on the kernel. Torvalds got so pissed at him that in 2014 he essentially banned him from submitting kernel patches, because he wouldn't fix bugs in his code that the kernel then had to work around. He is also employed by Red Hat.

Red Hat employs these guys that are making all these huge programs. The fear is that Red Hat is slowly taking over Linux as a whole, and it's going to end up being just another corporate-developed OS like Windows.
Once Torvalds gives up the reins to the kernel, or dies, what's going to happen? Who is going to take over ownership of it? Probably Red Hat, or someone employed by Red Hat, unless Torvalds has a way to prevent that.

I have mixed feelings on systemd. I like that it makes things a little easier to manage and possibly faster, but dislike the feature creep and dislike that it's essentially being developed by a corporate entity.
 