The Linux Thread - The Autist's OS of Choice

After reading and thinking about it, I think I'm going to uninstall most of my games on Windows and then consolidate Windows onto as few drives as possible.

This way I can make a Windows VM and do my 3D modeling in that without needing to reboot.

So for passing a GPU through: I know you can do this with a single GPU, but will that cause issues? I don't think I can fit another GPU in my case. My main GPU is big and beefy as is.

Also, I think it's time to finally get around to building a storage PC to keep backups on, so I'ma need to look into that.
 
So is this fugly shit gonna be the permanent logo of KDE now? Or is my browser bugged?

[attached screenshot: the new KDE logo]
 
Also, I think it's time to finally get around to building a storage PC to keep backups on, so I'ma need to look into that.
Buy a Chinese N100 NAS board, stick it in a case with lots of HDD cages, buy a bunch of 12TB Toshiba N300 drives (twice as many TB as you actually need, plus one), and set them up in a ZFS RAIDZ1 pool. Install the biggest DDR4 SODIMM you can find for memory and a basic M.2 SSD to store the actual operating system on, then install a decent stable Linux distro and set up an SMB share. This gives decent performance and some redundancy. If you also use ZFS on your desktop you can easily set up a systemd service to send incremental backups of your pool; otherwise you can use any number of NAS backup tools. If you want, the N100 iGPU is even capable of hosting a Jellyfin instance, so you can use this computer as a media server. You'll want a dedicated GPU if you plan on letting others access Jellyfin too, but for a single user the iGPU is sufficient.
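A minimal sketch of the pool-plus-share part, assuming three example disks and ZFS/Samba already installed (every name here, disks, pool, and share, is a placeholder):

```shell
# RAIDZ1 pool across three disks: one disk's worth of parity.
# WARNING: wipes the disks. Device names are examples only.
sudo zpool create tank raidz1 /dev/sda /dev/sdb /dev/sdc

# A compressed dataset to receive backups.
sudo zfs create -o compression=lz4 tank/backups

# Minimal SMB export of that dataset.
sudo tee -a /etc/samba/smb.conf <<'EOF'
[backups]
   path = /tank/backups
   read only = no
EOF
sudo systemctl restart smbd
```

The incremental-backup idea on the desktop side is `zfs send -i` of successive snapshots piped over ssh into `zfs recv` on the NAS.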
 
Buy a Chinese N100 NAS board, stick it in a case with lots of HDD cages, buy a bunch of 12TB Toshiba N300 drives (twice as many TB as you actually need, plus one), and set them up in a ZFS RAIDZ1 pool. Install the biggest DDR4 SODIMM you can find for memory and a basic M.2 SSD to store the actual operating system on, then install a decent stable Linux distro and set up an SMB share. This gives decent performance and some redundancy. If you also use ZFS on your desktop you can easily set up a systemd service to send incremental backups of your pool; otherwise you can use any number of NAS backup tools. If you want, the N100 iGPU is even capable of hosting a Jellyfin instance, so you can use this computer as a media server. You'll want a dedicated GPU if you plan on letting others access Jellyfin too, but for a single user the iGPU is sufficient.
I have a second PC built from old parts, quite a capable one. I was thinking of throwing a bunch of drives in it and then installing some good NAS software. I'll need to look up whether I can still buy the extended HDD cage for this case. I know it existed, but the case has been retired AFAIK.

If I just uninstall all my games I don't have that much to back up, tbh. I wouldn't need too many drives. Honestly I could get by with two 12TB drives, with one being for redundancy.

Two 12TB drives running in RAID 1, so they're just copies of each other? Would that work, or should I do something else?

Also, the 12TB drives are backordered for like 2 months lol. Might get a different size.
 
Uh, asckually sweaty I said "debian et al" and was in part referring to MX linux
Yeah that's why you would not use MX Linux. I have not read any Debian upgrade documentation in a decade, unless you count looking to find out what the new release codename is. You just update the release name in the sources.list/sources.list.d files, apt-get update, apt-get upgrade, apt-get dist-upgrade, then restart. If you're a pussy you could back things up beforehand.
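Spelled out as a sketch, assuming a bookworm-to-trixie jump (substitute the real codenames; newer installs may keep sources in deb822 files under /etc/apt/sources.list.d/ instead):

```shell
# Point apt at the new release.
sudo sed -i 's/bookworm/trixie/g' /etc/apt/sources.list
sudo apt-get update
sudo apt-get upgrade        # upgrade what can be upgraded in place
sudo apt-get dist-upgrade   # then handle new/removed dependencies, kernel, etc.
sudo reboot
```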
Nope. There was no eject. You can see that the ISO is still mounted in the console at 36:56. I fucked around in the console for a while trying to see if I could do something that would make the installer finish, but no.
I would try to reproduce it if I didn't just use the net install version every time. How did you manage to get to that point without adding a web mirror?
 
I have a second PC built from old parts, quite a capable one. I was thinking of throwing a bunch of drives in it and then installing some good NAS software. I'll need to look up whether I can still buy the extended HDD cage for this case. I know it existed, but the case has been retired AFAIK.
Sounds like a good plan. The N100 would almost certainly be more power efficient, though; bear in mind that a computer like this is one you'll leave running 24/7.
If I just uninstall all my games I don't have that much to back up, tbh. I wouldn't need too many drives. Honestly I could get by with two 12TB drives, with one being for redundancy.
Yeah, don’t back up games. You can easily redownload them from Steam should you lose them. No need to uninstall them, you can simply exclude those folders in your backup software.
Two 12TB drives running in RAID 1, so they're just copies of each other? Would that work, or should I do something else?
Yep, that's what that means, and yep, that should work fine. You do want redundancy, so either 2x12 in RAID1 or 3x8 in RAIDZ1 (which is basically the same as RAID5: two drives of storage and one of parity) is what I would do.
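Rough usable-capacity arithmetic for the two layouts (marketed TB; ignores ZFS overhead and TB-vs-TiB):

```shell
# 2x12TB mirror (RAID1): one drive's worth of usable space.
mirror_tb=12
# 3x8TB RAIDZ1: N-1 drives of data, one drive's worth of parity.
raidz1_tb=$(( (3 - 1) * 8 ))
echo "mirror: ${mirror_tb}TB usable, raidz1: ${raidz1_tb}TB usable"
```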
Also, the 12TB drives are backordered for like 2 months lol. Might get a different size.
I only recommended that size because it was the most storage per dollar last I checked. You can go bigger or smaller, whichever makes the most sense to you. I do recommend sticking specifically to enterprise/NAS drives like the N300 though; they're much higher quality than standard consumer drives.
 
Sounds like a good plan. The N100 would almost certainly be more power efficient, though; bear in mind that a computer like this is one you'll leave running 24/7.

Yeah, don’t back up games. You can easily redownload them from Steam should you lose them. No need to uninstall them, you can simply exclude those folders in your backup software.

Yep, that's what that means, and yep, that should work fine. You do want redundancy, so either 2x12 in RAID1 or 3x8 in RAIDZ1 (which is basically the same as RAID5: two drives of storage and one of parity) is what I would do.

I only recommended that size because it was the most storage per dollar last I checked. You can go bigger or smaller, whichever makes the most sense to you. I do recommend sticking specifically to enterprise/NAS drives like the N300 though; they're much higher quality than standard consumer drives.
It would be nice to buy a NAS, but I'm trying to save money where I can. Maybe in the future.

For the games, I want to get more drives for Linux, so removing those is a good idea anyway since I don't plan on booting Windows for much of anything.

I currently have 11.5TB of total drive space. If I remove the Steam games that's probably like 3TB; I know I have a 1TB and a 2TB drive specifically for Steam games. My actual Windows install is on a 1TB drive with only 180GB of used space. Total used space is about 6TB. Honestly not as much as I thought.

I'll see what I can work out for getting drives. I could do like 3x10TB or something. Idk the best configs for 3-drive setups.
 
I would try to reproduce it if I didn't just use the net install version every time. How did you manage to get to that point without adding a web mirror?
Oh, sorry. I let my hate get in the way of that. I see what you're doing now; I was bitching and being partisan. But as you're actually interested in improvement, I'll try to give more info when I'm home. I hope you understand.

I used a nightly build, which is probably part of the problem. I'll identify it when I get home and run another test for the sake of the chud ecosystem, I guess. Cheers.
 
Slow build and shit like recursive dependencies. Apparently if you check the comments for the packages, everyone is posting how to unfuck the packages, something that is unnecessary for the Xlibre repository version.
The build time isn't that slow; you might need to adjust your makepkg.conf to use all your threads. The other stuff is definitely an issue with how they've done the PKGBUILD though. It should just prompt to remove the package that it conflicts with. At least that's the normal behavior when you install something that conflicts: normally it doesn't take doing it manually in a separate command, outside of putting in y to confirm removing the old package or replacing it with the new one being installed.
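The makepkg tweak mentioned is one line; in /etc/makepkg.conf (or a per-user copy at ~/.config/pacman/makepkg.conf):

```shell
# Use every CPU thread when compiling PKGBUILDs.
MAKEFLAGS="-j$(nproc)"
```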
 
The build time isn't that slow; you might need to adjust your makepkg.conf to use all your threads. The other stuff is definitely an issue with how they've done the PKGBUILD though. It should just prompt to remove the package that it conflicts with. At least that's the normal behavior when you install something that conflicts: normally it doesn't take doing it manually in a separate command, outside of putting in y to confirm removing the old package or replacing it with the new one being installed.
I mean that they are slow to push new updates and the updates often have unresolved dependency issues and such
 
I mean that they are slow to push new updates and the updates often have unresolved dependency issues and such
Ah ok.

You know, listening to the new MATI stream and hearing Null's complaints at the end, I'm thinking about what he could do to get better performance. Idk what his setup is other than running Arch.

I also know that he's always running a VPN, so that isn't going to help with his internet speed. Hopefully he's at least using something based on WireGuard. But I'm thinking he could probably get his stuff working a lot smoother if he spent a bit of time optimizing: maybe take advantage of some DNS caching, use some tools that are a bit lighter.

I really don't know what his desktop setup is like. And I have a feeling any unsolicited advice won't be taken anyway.
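For the DNS-caching idea, the lowest-effort route is the caching resolver most distros already ship (a generic sketch, not anything setup-specific):

```shell
# systemd-resolved keeps an in-memory DNS cache for all lookups.
sudo systemctl enable --now systemd-resolved

# Confirm it's answering, and inspect cache hit counts after some use.
resolvectl status
resolvectl statistics
```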
 
Ah ok.

You know, listening to the new MATI stream and hearing Null's complaints at the end, I'm thinking about what he could do to get better performance. Idk what his setup is other than running Arch.

I also know that he's always running a VPN, so that isn't going to help with his internet speed. Hopefully he's at least using something based on WireGuard. But I'm thinking he could probably get his stuff working a lot smoother if he spent a bit of time optimizing: maybe take advantage of some DNS caching, use some tools that are a bit lighter.

I really don't know what his desktop setup is like. And I have a feeling any unsolicited advice won't be taken anyway.
Just link him the Arch Wiki.
 
I've been on Mint for a week now, and I'm having a much better time than I was worried I would. That said, a big part of that is access to ChatGPT. Instead of needing to post any issues I have on a help forum and wait potentially hours for someone to respond, I can get a few possible solutions in seconds.

For example, I just installed Wine and nothing was happening when I double-clicked on a .exe file. Clicking Properties > Open With, Wine wasn't even on the list. So, instead of rooting through the system to try to find it myself, I asked ChatGPT where Wine is located on my system, and it told me it's in "/usr/bin/wine".
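For what it's worth, the shell can answer that one locally too (demonstrated with sh, which is always installed, since wine may not be present on every box):

```shell
# Print where a command lives on $PATH.
command -v sh                               # always present; e.g. /bin/sh
command -v wine || echo "wine not on PATH"  # same trick for the wine case
```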

The other day, I asked it why I could read my side drives but not write to them, and it told me that my Windows 10 install probably had Fast Startup enabled. Sure enough, I rebooted into Windows, turned that off, and that fixed the problem.
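The two usual fixes for that read-only-NTFS situation, sketched with placeholder device and mountpoint names:

```shell
# Fix 1 (Windows side, admin prompt): disable hibernation, which also
# disables Fast Startup:
#   powercfg /h off

# Fix 2 (Linux side, assumes ntfs-3g): mount and discard the stale
# hibernation data. Note this throws away the saved Windows session.
sudo mount -t ntfs-3g -o remove_hiberfile /dev/sdb1 /mnt/windows
```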
 
Really don't know what his desktop set up is like. And I have a feeling giving unsolicited advice won't be taken anyway.
Last he mentioned it, Nullikins said he was running Arch as a VM inside Windows for isolation. I'm sure with enough horsepower it would work fine, but it's also going to need a ton of tweaking.

Personally I use Linux as my host OS and then run Windows or Linux VMs for isolation. But I'm sure not going to offer any suggestions to Nool.
 
Last he mentioned it, Nullikins said he was running Arch as a VM inside Windows for isolation. I'm sure with enough horsepower it would work fine, but it's also going to need a ton of tweaking.

Personally I use Linux as my host OS and then run Windows or Linux VMs for isolation. But I'm sure not going to offer any suggestions to Nool.

The reason I hate VMs is that, as much as people talk about type 1 hypervisors and passthrough allowing better performance, whenever I've tried to get it working well it's been worse, and very noticeably worse, than bare metal. I've even tried passing all the cores on my CPU except one to see if it helped, and giving it tons of extra memory. It still performed the same. Which is to say, shitty and laggy.

Maybe there is some magic out there I don't know about that allows decently performing VMs. Or maybe the magic is a brand new CPU with 16 cores and new instructions for better-performing virtualization.
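(For reference, the usual first tuning step here is pinning vCPUs to fixed host cores so the scheduler stops migrating them; with libvirt it looks like this, where the domain name "win10" and the core numbers are hypothetical:)

```shell
# Pin vCPU 0 of the VM to host core 2, vCPU 1 to host core 3.
virsh vcpupin win10 0 2
virsh vcpupin win10 1 3

# Show the current vCPU-to-host-core mapping.
virsh vcpupin win10
```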
I'm finding things are a smidge easier on Mint than Arch, but I keep being compelled to try Arch-based distros. Fuck it, everything just works on Mint except for little things I can ignore.
It really just depends on what you're trying to do, whether one is going to be easier.

I've been on Mint for a week now, and I'm having a much better time than I was worried I would. That said, a big part of that is access to ChatGPT. Instead of needing to post any issues I have on a help forum and wait potentially hours for someone to respond, I can get a few possible solutions in seconds.
Really, you wouldn't need to ask. It's getting these answers from people who already did ask, from the answers that were given, and from wikis.

Not saying that you shouldn't use AI, but at some point it does end up being beneficial to read the documentation a bit. If you're using the computer for simple things it won't really matter much, though.
 
The reason I hate VMs is that, as much as people talk about type 1 hypervisors and passthrough allowing better performance, whenever I've tried to get it working well it's been worse, and very noticeably worse, than bare metal. I've even tried passing all the cores on my CPU except one to see if it helped, and giving it tons of extra memory. It still performed the same. Which is to say, shitty and laggy.
Get better hardware.

For me CPU performance is generally fine. Where I find the biggest gap is GUI performance running a Windows VM in KVM/libvirt. Is there a way to make it better? Probably. But I gave up and just used RDP from the Linux host into the Windows VM, and that made it much more tolerable.

I'm doing business stuff, no idea about 3d/games/etc.
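The RDP route above is a one-liner with FreeRDP, assuming the VM sits on libvirt's default NAT network (address and username here are placeholders):

```shell
# Full-featured RDP session into the Windows guest, with clipboard sharing
# and a window that resizes the remote desktop with it.
xfreerdp /v:192.168.122.50 /u:user /dynamic-resolution +clipboard
```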
 