GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Can someone recommend me a good gpu in the 150-200€ price range? It will be just for 1080p gaming so nothing too demanding. I'm willing to buy used too, like I seem to find quite a lot of used rx 6600 (xt) and rtx 3050 for that price.
For 200€ you should be able to buy a new 6600XT. But either way, that's what I'd buy if I were you. It's better value than RTX 3050.
 
For 200€ you should be able to buy a new 6600XT. But either way, that's what I'd buy if I were you. It's better value than RTX 3050.
A friend got a 3050 for 1080p and wound up moving on from it; it's fairly limiting if you want to play anything newer, even at low settings. Note that it's in the ballpark of a 1070 performance-wise.

Although not in your price range, I would suggest at least trying to get a 3060 if not a 3060 Ti. I cannot make specific AMD recommendations.

Cheapest new 6600XT I could find was 350€ from Germany. I did find used ones for around 180€ though, and surprisingly quite a lot of used RTX 3060s for a similar price. There is one used RTX 3060 Ti for 190€ that I found from a reputable seller; would it be a better value than the 6600XT?
 
Cheapest new 6600XT I could find was 350€ from Germany. I did find used ones for around 180€ though, and surprisingly quite a lot of used RTX 3060s for a similar price. There is one used RTX 3060 Ti for 190€ that I found from a reputable seller; would it be a better value than the 6600XT?
Interesting prices... The whole point of AMD is that their GPUs are about 200€ cheaper than their NVIDIA equivalent, with some minor drawbacks like slightly worse upscaling and maybe some software not being that well optimised for them. The performance gap's getting smaller every year though, so we're at the point where AMD is a better buy in most cases.

If they're priced the same, just go with the 3060 Ti
 
Lol no. Examples?
The fuck you mean "Lol no"? AMD cards get absolutely shit on in rendering workloads, e.g. Blender
AMD cards are abysmal with machine learning because they've completely given up on that field, even after being handed a CUDA transpiler on a silver platter
AMD drivers have been constantly memed on for a decade; I get to see this firsthand since Wicked Engine has AMD-specific issues crop up on the regular
 
Can't ever help someone pick computer parts without causing an AMD vs NVIDIA sperg out, can we?

>Discord screenshot
Must be true if some fag from an LGBT hugbox says so (he even points out not all AMDs have issues)

Yeah, most people don't care about this. Especially on a budget like the guy I was talking to. But what if you're a normie who's somewhat curious about Blender? An RTX 3050/3060 won't do magic if used for rendering or machine learning, so normally you'd spend a couple extra hundred bucks for something that's only marginally better and may not even interest you in the long run. If you care about tech stuff, you already know NVIDIA usually wins on Windows and that AMD has more potential on Linux.

I do some autistic shit on my Windows PC myself, but only casually. I don't mind waiting a bit longer for something to render or finding a workaround to make it function. And things ARE getting better and easier for AMD users. New people don't deserve to be scared away by problems from the past.
 
Must be true if some fag from an LGBT hugbox says so (he even points out not all AMDs have issues)
amd.png
Yeah, most people don't care about this. Especially on a budget like the guy I was talking to. But what if you're a normie who's somewhat curious about Blender? An RTX 3050/3060 won't do magic if used for rendering or machine learning, so normally you'd spend a couple extra hundred bucks for something that's only marginally better and may not even interest you in the long run.
I said this: go with AMD if you're saving money and only care about games
 
Can someone recommend me a good gpu in the 150-200€ price range? It will be just for 1080p gaming so nothing too demanding. I'm willing to buy used too, like I seem to find quite a lot of used rx 6600 (xt) and rtx 3050 for that price.
I don't know your local market, but if you were a burger I'd recommend a used RTX 2080/Ti, which show up on eBay for about $200 pretty regularly. And that's a good guideline: see if you can pick up a high-end Turing card (RTX 2000-series). You might also check used listings once the 5090/5080/5070 launch happens, as I get the feeling a lot of people are going to be dropping their Ampere (RTX 3000-series) cards for cheap. If you can snag a 3070 or 3070 Ti for $200, you'll be set for the rest of this console generation as those cards outperform the PS5 Pro.

For AMD, a used RX 6700 or 6800 is probably feasible. AMD cards don't seem to hit the secondhand market as regularly so you might be better off looking at team green.

Don't buy anything older than RDNA2 or Turing. RDNA1 doesn't have raytracing at all and Nvidia is winding down support for the GTX 1000-series.
 
AMD's non-gaming cred might get better after RDNA and CDNA are rolled back into each other as "UDNA" (with RDNA5 presumably cancelled in favor of UDNA1). The funny part is that they will probably change their naming scheme after one generation, aligning with this change to UDNA. Unless they want to release GPUs like an "RX 10070 XT".

MLID thinks AMD is waiting to price and release RDNA 4 until after the first wave of Blackwell reviews is out, both to help contextualize Nvidia's CES performance-uplift claims (which relied on frame generation without making that fact super obvious to Joe Normie GPU consumer) and to make sure FSR 4 is as competitive as it possibly can be.
AMD says RDNA 4 delay is due to software and FSR 4, confirms stock is already at retailers (archive)

We've heard all the possible reasons. Now we have to wait and see if it was the right call.

AMD denies 9070 XT leaked prices — '$899 USD starting price point was never part of the plan'
While we aren’t going to comment on all the price rumors, I can say that an $899 USD starting price point was never part of the plan.
This should have been obvious but there we go.
 
AMD cards are abysmal with machine learning
AMD Instinct cards are apparently gaining popularity in datacenters, but they are behind in the consumer space for people attempting to run things such as SD on Windows. Supposedly it works a lot better on Linux where ROCm support is available for your card but I haven't tried it.
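If anyone wants to check whether their card is one of the ones ROCm actually works with, the ROCm builds of PyTorch reuse the familiar torch.cuda API, so a quick sanity check is only a few lines. This is a minimal sketch assuming you installed the ROCm wheel of PyTorch rather than the default CUDA one; I'm not claiming it covers every card.

# Minimal sanity check that a ROCm build of PyTorch can see the GPU.
# Assumes the ROCm wheel of PyTorch is installed; ROCm builds reuse the
# torch.cuda API, so the same calls work as they would on an NVIDIA card.
import torch

print("GPU available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    print("HIP (ROCm) version:", torch.version.hip)  # None on CUDA builds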

For those interested in running Linux, AMD is the best option given how mature and stable the open source drivers are.
 
It's also no longer relevant. Nvidia's "open" drivers work just fine; AMD's advantage has been reduced from "works at all" to "works out of the box", since Nvidia's driver has yet to make it into the kernel proper and still requires installing from your package manager, which is generally a single command anyway.

I like AMD, but I no longer recommend their GPUs even for Linux users. Nvidia's driver works just as well, and nvidia-smi is a smoother way to handle power limits and resource management than AMD's method of directly writing values to sysfs (which looks like "echo 50000000 > /sys/class/drm/card0/device/hwmon/hwmon0/power1_cap", vs. Nvidia's "nvidia-smi -pl 50", both setting the power limit to 50 W). AMD can run AI with things like rocm-pytorch, but Nvidia just runs CUDA directly. Nvidia's raster performance per euro spent is still a bit lower, but in exchange you get DLSS, which is just better than XeSS and FSR. And because Nvidia is more popular, it gets more support.
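To make that comparison concrete, here's a rough Python equivalent of the two one-liners above. The sysfs path is only an example (the card index and hwmon number vary per machine), the AMD value is in microwatts, and both approaches need root.

# Rough sketch of both approaches; paths and values are examples only.
import subprocess

# AMD: write a power cap in microwatts to the amdgpu sysfs file (needs root).
AMD_CAP_PATH = "/sys/class/drm/card0/device/hwmon/hwmon0/power1_cap"  # varies per system
with open(AMD_CAP_PATH, "w") as f:
    f.write("50000000")  # 50 W, same value as the echo example

# NVIDIA: let the driver handle it via nvidia-smi (needs root).
subprocess.run(["nvidia-smi", "-pl", "50"], check=True)  # 50 W power limit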

It's a shame, the market desperately needs AMD and Intel to be competitive.
 
Was going to joke that a response would be "muh AMD drivers" but was like nah, surely that's overused...

...lo and behold. Lol.
You're right, drivers don't matter, go uninstall yours and tell me how that goes
Just yesterday I had to disable an AMD service for the iGPU that was writing gigabytes of useless log shit
Their software is miles behind.

Microsoft's Basic Display Adapter might be slow as shit but at least it doesn't randomly time out
 
Was going to joke that a response would be "muh AMD drivers" but was like nah, surely that's overused...

...lo and behold. Lol.

AMD Instinct GPUs are unable to realize a large part of their theoretical performance in AI training because the drivers are shit, while NVIDIA gets very close to the rated performance, so the end result is that same-rated NVIDIA hardware significantly outperforms AMD. They do fine in HPC computations.

On the gaming side, a good example of AMD just being garbage is how Fluid Motion Frames was just randomly disabled for a couple months by a bug that made my Adrenalin software suddenly stop detecting that a game was running in the acceptable conditions. There was also a cool installation bug a while back that prevented me from uninstalling it (because the update broke it) without downloading a special uninstallation workaround they have on their website.

I am not sharing details for PL reasons, but AMD has the worst software practices I have ever seen at a large company by a long shot. NVIDIA just doesn't fuck up like they do. Intel is not great, but still less bad.

Don't buy anything older than RDNA2 or Turing. RDNA1 doesn't have raytracing at all

RDNA2's raytracing support is useless. At absolutely minimal raytracing settings in the games I have that support it, my 6700 XT chugs along at like 35 fps, and by that point, the effect is so small as to not really be noticeable.

Cheapest new 6600XT I could find was 350€ from Germany. I did find used ones for around 180€ though, and surprisingly quite a lot of used RTX 3060s for a similar price. There is one used RTX 3060 Ti for 190€ that I found from a reputable seller; would it be a better value than the 6600XT?

Yes, get the 3060 Ti. With a budget card, the qualitative effects of upscaling technology and memory bandwidth are more significant, as in you will see a much bigger visual difference in the resolution and effects you can enable on a 3060 Ti compared to a 6600 XT.
 
Why is the market so afraid of the Chinese AI? Isn't it built using NVDA chips?
Multiple factors. It's almost as good as the latest from OpenAI but open source, so you can run it on your rig or server for free instead of paying $200/mo or more to Altman, who in turn will see everything you put into it.
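For anyone wondering what "run it on your rig" actually looks like: a minimal sketch, assuming you have Ollama installed and have already pulled one of the distilled DeepSeek models; the model tag and prompt here are placeholders, pick whatever fits your VRAM.

# Minimal sketch: query a locally hosted DeepSeek distill through Ollama.
# Assumes the ollama daemon is running and you've already done
# `ollama pull deepseek-r1:7b`; the model tag and prompt are illustrative.
import ollama

response = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "Summarize why local inference beats a subscription."}],
)
print(response["message"]["content"])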

It's apparently waaaaaay more efficient, probably due to optimizations forced by us blocking Chinese access to the latest hardware. That means Chinese code is better than ours now, and that's a huge problem.

And because it's more efficient you need less hardware, which means fewer datacenters, which means Nvidia and the others getting billions in orders are SOL; a huge chunk of their production has suddenly become redundant.

Lastly, there's the possibility that just like the Chinese pulled DeepSeek out of their asses and pwned the entire industry, they might also have something cooking on the hardware side, and if it's as big as DeepSeek then Nvidia is done for. They'd still have gaming, but their stock and valuation will go to shit if they lose the AI market.
 
Lastly, there's the possibility that just like the Chinese pulled DeepSeek out of their asses and pwned the entire industry, they might also have something cooking on the hardware side, and if it's as big as DeepSeek then Nvidia is done for
Unless the Chinese are about to leapfrog ASML in laser technology, Nvidia has nothing to worry about from China.
 
Unless the Chinese are about to leapfrog ASML in laser technology, Nvidia has nothing to worry about from China.
The real question is: what if they're doing LPUs like Groq, or transformers-on-a-chip like (I think) Google was working on? Basically bypassing GPUs for chips that are actually designed for AI.
 
The real question is: what if they're doing LPUs like Groq, or transformers-on-a-chip like (I think) Google was working on? Basically bypassing GPUs for chips that are actually designed for AI.
The answer is anything they do on their 7nm process can be done better on TSMC's 2nm process. Chip companies steal ideas from each other all the time. You are only ever one generation away from being smoked by the competition if you rest on your laurels, and if you're already two generations behind in your fab, you're not going anywhere fast.
 