GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Audio especially seems to be pinned to single-thread performance.
I've been wondering for a long time why that is. Intuitively, I'd think assigning a thread per track and then summing the results together would be an ideal multi-threading workload, but apparently not.
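
Here's roughly what I'm picturing, as a toy C++ sketch (track count, buffer size, and the no-op DSP are all made up for illustration). The fan-out part looks ideal on paper, so I assume the catch is the serial summing/master-bus chain plus the hard per-buffer deadline:

[CODE]
// Toy sketch of "one thread per track, then sum" mixing. Track count,
// buffer size and the no-op DSP are made up for illustration.
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

constexpr std::size_t kBufferFrames = 256;   // a typical low-latency block

// Stand-in for a track's synths/effects filling its block of samples.
void render_track(std::vector<float>& out) {
    for (auto& s : out) s = 0.0f;
}

int main() {
    const std::size_t num_tracks = 32;
    std::vector<std::vector<float>> tracks(num_tracks,
                                           std::vector<float>(kBufferFrames));

    // Fan out: this part parallelizes fine (a real engine would reuse a
    // thread pool instead of spawning threads every block).
    std::vector<std::thread> workers;
    for (auto& buf : tracks)
        workers.emplace_back(render_track, std::ref(buf));
    for (auto& w : workers) w.join();

    // Fan in: the sum and anything on the master bus run serially, and the
    // whole block has to finish in roughly 5 ms at 256 frames / 48 kHz.
    std::vector<float> master(kBufferFrames, 0.0f);
    for (const auto& buf : tracks)
        for (std::size_t i = 0; i < kBufferFrames; ++i)
            master[i] += buf[i];
    return 0;
}
[/CODE]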
 
Now that I'll be working, I've been thinking about my next computer setup and I wanted your input on which direction to go. I sold my main gaming computer (RX 5700 level) for some much-needed cash a while back and have been using an Intel 6th-gen Lenovo X1 Yoga that's really struggling at times. Right now I have a work iMac on my desk that I put a laptop in front of. So far I've narrowed my options down to four that may not be 100%, but here's what I think would be a solution:

1. Buy a used mini PC for $50 and swap it for the computer by the TV, an MSI Trident 3 with Nvidia 1660 graphics. This would be the cheapest option, but it would still require some equipment like a KVM and figuring out how I'm going to use the iMac as a display, or whether to put a monitor in front of it.

2. A 14" ultraslim laptop - ideally a ThinkPad - that has a 12th gen Intel CPU or equivalent AMD with a very powerful iGPU. This would give me a very flexible system with a similar setup to what I have now, but I wonder if iGPU tech is good enough to run games like Baldur's Gate 3.

3. A 16" gaming laptop - ideally a ThinkPad or Legion or another brand known for quality hardware - that has the specs guaranteed to run the games I want at full speed, with Baldur's Gate 3 being the most recent and most power-intensive game I'd like to play. This probably wouldn't be as portable as I'd like.

4. A handheld PC like the Legion Go. This could be set up as a desktop with a dock so I'd need the KVM, but it would be very portable allowing for couch gaming and such on the go.

No matter what I would like to stay under $1,000. I'm willing to buy used as long as I can be certain that the used computer is in like new condition and everything works properly.
I’m not sure what your portability needs are. $1,000 is enough for a decent mini tower build, which will be more flexible in the long run (go mATX, mini-ITX is all overpriced). The problem with (recent) ‘workstation’ laptops is that they cool like shit, and it just takes some crappy driver or background task to melt your CPU/lap. They also become useless when the fan goes out. I imagine gaming laptops are the same, just with even shittier build quality.

I think mini PCs will be similar. This small form factor stuff tries to sell on specs and there is a lot of compromise with thermal management (so you don’t really get peak performance). I think it’s all about paying for portability. You want to find someone who’s actually owned one for more than a few months. Or go Mac because they seem to actually know what they’re doing (and this is from someone who has hated using Apple computers since the Apple II).
 
1. Buy a used mini PC for $50 and swap it for the computer by the TV, an MSI Trident 3 with Nvidia 1660 graphics. This would be the cheapest option, but it would still require some equipment like a KVM and figuring out how I'm going to use the iMac as a display, or whether to put a monitor in front of it.
I'd do this but probably with a Raspberry Pi 5, 8GB. Perfectly fine little TV box and then you can save some money and research until you can get what you really want. And then you have a Raspberry Pi to curse at.

Or a mini PC.

I like gaming laptops, but at home mine lives on a cooling pad and on the road there's a reason it has 40GB RAM and 3TB disk. Probably not a good solution for everyone.
 
Now that I'll be working, I've been thinking about my next computer setup and I wanted your input on which direction to go. I sold my main gaming computer (RX 5700 level) for some much-needed cash a while back and have been using an Intel 6th-gen Lenovo X1 Yoga that's really struggling at times. Right now I have a work iMac on my desk that I put a laptop in front of. So far I've narrowed my options down to four that may not be 100%, but here's what I think would be a solution:

1. Buy a used mini PC for $50 and swap it for the computer by the TV, an MSI Trident 3 with Nvidia 1660 graphics. This would be the cheapest option, but it would still require some equipment like a KVM and figuring out how I'm going to use the iMac as a display, or whether to put a monitor in front of it.

2. A 14" ultraslim laptop - ideally a ThinkPad - that has a 12th gen Intel CPU or equivalent AMD with a very powerful iGPU. This would give me a very flexible system with a similar setup to what I have now, but I wonder if iGPU tech is good enough to run games like Baldur's Gate 3.

3. A 16" gaming laptop - ideally a ThinkPad or Legion or another brand known for quality hardware - that has the specs guaranteed to run the games I want at full speed, with Baldur's Gate 3 being the most recent and most power-intensive game I'd like to play. This probably wouldn't be as portable as I'd like.

4. A handheld PC like the Legion Go. This could be set up as a desktop with a dock so I'd need the KVM, but it would be very portable allowing for couch gaming and such on the go.

No matter what I would like to stay under $1,000. I'm willing to buy used as long as I can be certain that the used computer is in like new condition and everything works properly.

Gaming laptops aren't very portable. I have one, and it's plenty powerful for gaming, but it has a 2.5-hour battery life and a 200 W PSU. For under $1,000, though, there are some very good options, like this:


I'd probably go with option (1); the Trident has another year or two of life left in it as a gaming PC.
 
Intel will be pushing up against Windows' hard limits with their consumer processors shortly, so demonstrating the issue they're about to have with an AMD server CPU makes sense.
I assume you're referring to the rumored Arrow Lake Refresh with 8+32 cores? That's only supposed to have 40 threads, and Intel may ditch hyperthreading with Arrow Lake and Lunar Lake.
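
If the "hard limits" being referred to are the legacy 64-logical-processor processor groups (my assumption; older, non-group-aware Win32 affinity code only sees one group), here's a quick Windows-only C++ sketch that reports where a machine stands:

[CODE]
// Windows-only: report processor groups. Legacy (non-group-aware) Win32
// affinity APIs see at most 64 logical processors, i.e. one group.
#include <windows.h>
#include <cstdio>

int main() {
    WORD groups = GetActiveProcessorGroupCount();
    DWORD total = GetActiveProcessorCount(ALL_PROCESSOR_GROUPS);
    std::printf("%u processor group(s), %lu logical processors total\n",
                (unsigned)groups, (unsigned long)total);
    for (WORD g = 0; g < groups; ++g)
        std::printf("  group %u: %lu logical processors\n",
                    (unsigned)g, (unsigned long)GetActiveProcessorCount(g));
    return 0;
}
[/CODE]

A 40-thread Arrow Lake Refresh would still fit in a single group; the big Xeon W and Threadripper parts already don't.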
 
I assume you're referring to the rumored Arrow Lake Refresh with 8+32 cores? That's only supposed to have 40 threads, and Intel may ditch hyperthreading with Arrow Lake and Lunar Lake.
Nah, I meant the lower-end Xeon W (higher end is too expensive to qualify as consumer). Sapphire Rapids is a dumb purchase, but there is a certain group of people who are adamant about never buying an AMD processor, no matter how silly that makes them. Sapphire Rapids already contains processors that offer more than 32 cores, so it's already a problem for the very high end stuff, but since Intel are pressing hard to compete with Threadripper and EPYC again, they're bound to stuff more cores into the next generation.
Xeon W7 tops out at 28 cores for $2,900. The Threadripper 7970X is 32 cores for $2,500. There's no way Intel aren't aiming to match that with the next generation (of course Threadripper goes all the way to 64 cores in the non-Pro line, and they'll happily sell you 128 cores if you go EPYC, but I wouldn't call that consumer).
 
Sapphire Rapids is a dumb purchase

At this point, the only reason people in my corner buy Xeon CPUs is that EPYC's out of stock everywhere.

Xeon W7 tops out at 28 cores for $2,900. The Threadripper 7970X is 32 cores for $2,500. There's no way Intel aren't aiming to match that with the next generation (of course Threadripper goes all the way to 64 cores in the non-Pro line

64 cores with only 4 memory channels is a waste of money. AMD had a 64c Threadripper with 4 channels of DDR4 a few years ago, and it was awful. Going up to DDR5 will be less awful, but it still won't perform well enough to justify the price. I guess it will do a good job on compiling workloads, but there are a ton of things where the memory system will be absolutely choked, especially engineering workloads. Even with 32 cores, you're pushing the limits of usefulness. If you're running something like Ansys and building a workstation, and you're looking at 32+ cores, go with EPYC. Even at 16 cores, EPYC with 3D V-Cache is a massive performance boost over Threadripper.
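
Back-of-envelope on the bandwidth point (the DDR speeds below are my assumptions for illustration, not measured numbers):

[CODE]
// Back-of-envelope: theoretical DRAM bandwidth spread across cores.
// DDR speeds here are assumptions for illustration, not measured figures.
#include <cstdio>

void per_core(double channels, double mt_s, double cores) {
    double gbs = channels * mt_s * 8.0 / 1000.0;  // 8 bytes per channel per transfer
    std::printf("%2.0f ch @ DDR-%4.0f: %5.1f GB/s total, %4.1f GB/s per core (%2.0f cores)\n",
                channels, mt_s, gbs, gbs / cores, cores);
}

int main() {
    per_core(4, 3200, 64);    // old 64c Threadripper on DDR4: ~1.6 GB/s per core
    per_core(4, 5200, 64);    // 64c on 4 channels of DDR5: ~2.6 GB/s per core
    per_core(4, 5200, 32);    // 32c part: ~5.2 GB/s per core
    per_core(12, 4800, 64);   // 64c EPYC with 12 channels: ~7.2 GB/s per core
    return 0;
}
[/CODE]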
 
At this point, the only reason people in my corner buy Xeon CPUs is that EPYC's out of stock everywhere.



64 cores with only 4 memory channels is a waste of money. AMD had a 64c Threadripper with 4 channels of DDR4 a few years ago, and it was awful. Going up to DDR5 will be less awful, but it still won't perform well enough to justify the price. I guess it will do a good job on compiling workloads, but there are a ton of things where the memory system will be absolutely choked, especially engineering workloads. Even with 32 cores, you're pushing the limits of usefulness. If you're running something like Ansys and building a workstation, and you're looking at 32+ cores, go with EPYC. Even at 16 cores, EPYC with 3D V-Cache is a massive performance boost over Threadripper.
I'm pretty sure most people buying Threadrippers today are getting either the (sensible) 32-core one or only going 64-core because it's the one Linus said is best. I guess the 16-core sort of makes sense too; those cores will have plenty of memory bandwidth, and that memory can be RDIMMs, unlike with the 7950X. If your task is memory-constrained and you also want lots of cores, it's definitely worth spending up on the EPYC. I think AMD understands that too; it's the only way I can explain the 8-channel Threadripper Pro line. It makes little sense to have so much product-line overlap otherwise. But why not just sell them under the EPYC label, like they effectively already are? Putting the Threadripper label on it just cheapens the product; EPYCs are much cooler.
 
I'm pretty sure most people buying Threadrippers today are getting either the (sensible) 32-core one or only going 64-core because it's the one Linus said is best. I guess the 16-core sort of makes sense too; those cores will have plenty of memory bandwidth, and that memory can be RDIMMs, unlike with the 7950X. If your task is memory-constrained and you also want lots of cores, it's definitely worth spending up on the EPYC. I think AMD understands that too; it's the only way I can explain the 8-channel Threadripper Pro line. It makes little sense to have so much product-line overlap otherwise. But why not just sell them under the EPYC label, like they effectively already are? Putting the Threadripper label on it just cheapens the product; EPYCs are much cooler.

Recall that AMD originally canceled the non-pro Threadrippers because they just didn't sell.

However, now that they're 95%(!!!) of the workstation market, my guess is that the marginal use cases for 64c Threadripper are now enough to justify the product. I'm thinking especially of a dev station. I worked on a code base once that took 2 hours to compile. Compiling isn't bandwidth-bound, so for developers working on large, bloated C++ code bases, this could be a good choice.

As for why 8 channels in Pro, the EPYC I/O die is the largest die on the CPU by far, so this probably allows them to sell TR Pro at a meaningfully lower price and get margin (and perhaps clock it higher). If you need 12 channels, go EPYC.
 
What's a good monitor for a 5600X and 6700 XT combo?
It has to be reasonably priced and not make me lose performance.
 
What's a good monitor for a 5600X and 6700 XT combo?
It has to be reasonably priced and not make me lose performance.

I have an LG 160 Hz 1440p monitor, as my 6700 XT can't really handle 4K. I actually got rid of my Samsung 4K (I gave it to my wife) because I just couldn't really use it, not even at 60 Hz. AMD screwed up big on this card. There isn't enough memory bandwidth, and FSR is garbage.
 
So should I stick to 1080?

A 6700 XT is a lot better than a 1080. But if you're buying a newer card, the 30-series GeForces are a lot better than the 6000-series Radeons at similar price points. DLSS is a big part of that.
 
I'm talking about the monitor.
The 6700 XT will run most games at tolerable framerates on High at 1440p; drop down to 1080p if you want Ultra without FSR (you definitely don't want to use FSR below 4K, it'll look like shit).
But you know, you can just use a higher-resolution monitor at a lower resolution. Turn on integer scaling and it'll look just as crisp as a lower-resolution monitor would. I don't see why anyone wouldn't get a 4K 120 Hz monitor today: the desktop will look very crisp, scaling is mature in all three major OSes, and the higher frame rate makes the cursor and various animations look so much smoother.
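
Quick sanity check on the integer scaling point, as a trivial C++ snippet: 1080p divides evenly into a 3840x2160 panel (each game pixel becomes a clean 2x2 block), while 1440p would need a fractional 1.5x scale:

[CODE]
// Which common render resolutions divide evenly into a 3840x2160 panel.
// Clean integer scaling means each game pixel maps to a whole block of
// panel pixels, so 1080p on a 4K screen stays crisp; 1440p would not.
#include <cstdio>

int main() {
    const int panel_w = 3840, panel_h = 2160;
    const int render[][2] = {{1920, 1080}, {2560, 1440}, {1280, 720}};
    for (const auto& r : render) {
        bool integer = (panel_w % r[0] == 0) && (panel_h % r[1] == 0);
        std::printf("%4dx%-4d: %s (%.2fx horizontal scale)\n", r[0], r[1],
                    integer ? "integer scale, crisp" : "fractional scale",
                    static_cast<double>(panel_w) / r[0]);
    }
    return 0;
}
[/CODE]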
 