GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I've said this before, but it's funny how many home PC builders act like they're pro streamers/3D modelers/machine learning devs who just have to have shit like CUDA/NVENC/muh tensor cores to live.

Oh, and they totally only play rtx games.

Oh, and AMD drivers killed their dog or something 10 years ago.

*Edit* It's also funny that despite AMD's low market share, everyone has totally owned an AMD card recently / seemingly knows people who have, and they've ALL had issues. Amazing.
That sounds like the social media/youtuber effect. I blame idiots like Linus Tech Tips, who convinces his low-IQ fanbase that you need the latest and biggest CPU/GPU just for basic web browsing, otherwise you'll be an inadequate pleb behind your reddit tags.
 
Most users only buy Nvidia due to mindshare, which is retarded as fuck. I blame the current state of the market on Nvidiots, as usual.
Also not helping: people using AMD as leverage to buy nVidia cards at reduced prices.

If AMD announces RDNA3 pricing that undercuts the 40 series, nVidia might get scared into reducing prices, and then those customers just buy nVidia GPUs. They never intended to buy AMD cards, no matter how hard AMD undercuts.
That sounds like the social media/youtuber effect. I blame idiots like Linus Tech Tips, who convinces his low-IQ fanbase that you need the latest and biggest CPU/GPU just for basic web browsing, otherwise you'll be an inadequate pleb behind your reddit tags.
I think it's something more fundamental than that though.

You CAN set up a gaming rig with a 3050 Ti and an Intel i3 and play many games and browse the Internet without much issue, but why are you cheaping out on your computer components? Surely something that you use every day (you think) deserves higher-priced components...
 
Also not helping: people using AMD as leverage to buy nVidia cards at reduced prices.

If AMD announces RDNA3 pricing that undercuts the 40 series, nVidia might get scared into reducing prices, and then those customers just buy nVidia GPUs. They never intended to buy AMD cards, no matter how hard AMD undercuts.
Yup. That's why I laugh when people whine about AMD prices. AMD knows the game. Most of them were never going to buy AMD anyway.

There are people who still believe RDNA2 cards aren't competitively priced. To them, AMD should be a cheap budget brand.
 
And the old AMD reputation. Back in 2008 you really needed nVidia if you wanted your games to run right. I remember swearing off Radeon for that reason. That was 14 years ago, though.
I used an HD 4850 back then lmao, and I'm not even an AMD fan.
BTW I still have that card and it can run Windows 10 just fine, and some basic games, even with its ancient drivers.
It could run Crysis at 1080p at about 40fps if I remember correctly, on Very High or so settings, without crazy AA tho'
 
You CAN set up a gaming rig with a 3050 Ti and an Intel i3 and play many games and browse the Internet without much issue, but why are you cheaping out on your computer components? Surely something that you use every day (you think) deserves higher-priced components...
Because there's value and return on investment.

I technically have the money to buy a 12900 with a 3080 and an obscene amount of RAM. Let's say for the sake of argument the full build would be £3000.

Or I can spend £800 and get a 5600X and a 6600 that does 99% of the same things, leaving me with £2200 I can spend on pizza or a car or a laptop or whatever else.

I don't need a 12900k to watch YouTube or type word documents. This is why @Just Some Other Guy jokes that everyone is a pro streamer or AI dev, because those are the only things that really require that kind of insane power. Both computers will be "obsolete" in 5-10 years anyway, and the only CPU task that would benefit from the extra power is video compression, and even that is becoming less of a problem if AV1 hardware encoding works as advertised.
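
If you want to poke at AV1 hardware encoding yourself, here's a rough Python sketch (assuming ffmpeg is on your PATH and your build actually ships one of these encoders; the filenames are made up) that tries the hardware encoders first and falls back to software SVT-AV1:

```python
# Rough sketch: try hardware AV1 encoders via ffmpeg, fall back to software.
# Which encoders exist depends entirely on your ffmpeg build and GPU
# (av1_nvenc needs an RTX 40 card, av1_qsv an Arc card, av1_amf RDNA3).
import subprocess

def encode_av1(src: str, dst: str) -> None:
    for encoder in ("av1_nvenc", "av1_qsv", "av1_amf", "libsvtav1"):
        # -c:v picks the video encoder, -c:a copy leaves the audio untouched
        cmd = ["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-c:a", "copy", dst]
        if subprocess.run(cmd).returncode == 0:
            print(f"encoded with {encoder}")
            return
    raise RuntimeError("no AV1 encoder available in this ffmpeg build")

encode_av1("gameplay.mkv", "gameplay_av1.mkv")  # hypothetical filenames
```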

I used an HD 4850 back then lmao, and I'm not even an AMD fan.
BTW I still have that card and it can run Windows 10 just fine, and some basic games, even with its ancient drivers.
It could run Crysis at 1080p at about 40fps if I remember correctly, on Very High or so settings, without crazy AA tho'
nVidia example, but I remember when New Vegas released, for a week or so the game would drop to single-digit framerates in areas with multiple people. It turned out to be an issue with rendering faces on certain nVidia cards. I think it was eventually fixed in a driver update. It was shit like that, and it felt like it was more common with AMD.

These days it feels like the opposite. AMD cards are supposedly the best for retro gaming PCs because while nVidia cut features from older drivers, AMD kept adding them. So you can play XP-era games with modern features like supersampling and integer scaling. It's still early days with my 6600, so I can't judge yet.
 
These days it feels like the opposite. AMD cards are supposedly the best for retro gaming PCs because while nVidia cut features from older drivers, AMD kept adding them. So you can play XP-era games with modern features like supersampling and integer scaling. It's still early days with my 6600, so I can't judge yet.
That's a good card; I did a 12400 build for my gf with a 6600 in it.
 
I don't need a 12900k to watch YouTube or type word documents. This is why @Just Some Other Guy jokes that everyone is a pro streamer or AI dev, because those are the only things that really require that kind of insane power. Both computers will be "obsolete" in 5-10 years anyway,
Buying the super high-end when you don't need it feels like basic botox bitches in a way. "But it fills the entire memory" sure, and I fill my glass with water; I'm not parched, and I'm not worried that exceeding my kitchen glass's capacity will make me die of dehydration. When you actually run into VRAM limitations you feel it something fierce. Filling the entire glass does not mean you need a jug. Unless you need a jug, but that's different.
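
For anyone who wants to see the "full glass" effect for themselves, here's a minimal sketch (assuming an NVIDIA card and the nvidia-ml-py package; AMD users would need a different tool) that just polls how much VRAM is allocated while a game runs. Keep in mind that "used" here means allocated, not needed, which is exactly the point: a full gauge doesn't prove you need a bigger card, stutter from actually hitting the limit does.

```python
# Minimal VRAM polling sketch (NVIDIA-only, pip install nvidia-ml-py).
# Reports allocated memory, not memory the game strictly needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used = mem.used / 1024**3
        total = mem.total / 1024**3
        print(f"VRAM: {used:.1f} / {total:.1f} GiB allocated")
        time.sleep(5)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```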

edit: NOT MY POINT! Arc reviews are coming out, currently watching Hardware Unboxed. Interesting, let's see what happens.
 
What a shame, I was hoping to see something positive, but based on that video (and nothing else, to be honest) these look DOA. Honestly, if anyone could have given us a third option it should have been Intel. A wasted opportunity when consumers are especially pissed at nVidia and often treat AMD as an afterthought.
 
Fucking Linus Tech Shill telling his viewers to buy Intel cards because "we need more competition" and it doesn't matter if they're bad or if they aren't Linux compatible. As usual, he barely talked about AMD.
Rumor has it that Intel discrete graphics has been cancelled internally, and that it will get some token releases and then be killed. They obviously aren't going to announce that while they still have to sell the 4 million GPUs they made, which were already delayed too long to take advantage of multiple quarters of sky-high GPU prices.

Some of these could be useful to someone. Why not pick up an Arc A310?
 
Rumor has it that Intel discrete graphics has been cancelled internally, and that it will get some token releases and then be killed. They obviously aren't going to announce that while they still have to sell the 4 million GPUs they made, which were already delayed too long to take advantage of multiple quarters of sky-high GPU prices.

Some of these could be useful to someone. Why not pick up an Arc A310?
If they're actually nuking their dGPU effort, they'll also start winding down the team updating the drivers within maybe a year. That's really bad for the consumer.
 