GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

AMD is actually hopeless. I'm a bit of a fanboy and I can admit it. They lack vision and aren't willing to do the things that would solidly put them at #1. They have plenty of research papers detailing what they could do, be it with software or hardware, but they never ship feature-complete GPUs to act on any of it, and the software/hardware is always lacking (as you have pointed out).

They are the Han. They will wait until Nvidia gets either financially nuked by something or actually nuked by a foreign belligerent, once they're shown to be more strategically vital than the Pentagon.

No big swings from AMD. Not even for the lowly gaming and "content creator" consumer demographics. You are all just hungry dogs to them that should be thankful they have not exited the market. If the CCP could pull off the same circular funding arrangements as Wall Street has for Nvidia, they would.
 
It is cheaper, and the performance is the same as the Nvidia counterpart for what I use it for.
I read this as, it's cheaper, so you try to convince yourself the performance is just as good in order to feel like you got an amazing deal. When the 9000 series came out, I seem to recall it initially wasn't much cheaper than the 50 series, but the market started discounting it once it turned out that AMD was behind again.

I have a 9070 XT, previously had a 3070/3080, and honestly the 9070 XT is a better experience than both of them.
Yeah, but it's like two generations newer. The equivalent AMD card to the 3060 is the 6700 XT, and I can tell you, the minute I saw what the 30 series could do when I got a laptop with a 3050, I was done with AMD.
 
I've had fine performance with AMD and Nvidia. They both have their pros and cons. AMD can be pretty decent and the experience on Linux is pretty painless. Nvidia's not too bad on Linux either nowadays, but can be funny sometimes. I use Nvidia because I need CUDA for some things as well as low power consumption. Don't have an AMD card to try ZLUDA at the moment, but I'm sure it's fine. Both companies are pretty frustrating as a consumer right now.

Nvidia's in a position where they don't really have to care about gaming performance at all. I think it was around the 30 series when their rasterization wasn't any better than the previous gen because they were prioritizing ray tracing which led to a small controversy. Then with the 50 series, they dropped PhysX which broke games while barely offering much of a performance uplift on the lower end. Again, Nvidia doesn't care too much about this market now because they're more concerned about being the backbone of the entire AI industry.

As for AMD, I think they're stuck and don't know what market to chase since Nvidia's cornered almost everything. I've heard their cards are decent at raytracing now if that's something people need. I don't know. Overall, my experience with every AMD card I've had has just been "decent". I never felt like I had some kind of game changer on my hands like I did with a GTX 1080.
 
I read this as, it's cheaper, so you try to convince yourself the performance is just as good in order to feel like you got an amazing deal.
At the time I bought the 6800 XT that I have, the NVIDIA cards were twice the price and weren't really worth it unless you really cared about ray tracing (which a lot of people don't).

Often the only reason to buy an NVIDIA card over an AMD card is CUDA, or certain pieces of software, usually video editing suites.

TBH you can get a better deal buying one of the desktop Macs with unified memory if you only care about AI and video editing.
Yeah, but it's like two generations newer. The equivalent AMD card to the 3060 is the 6700 XT, and I can tell you, the minute I saw what the 30 series could do when I got a laptop with a 3050, I was done with AMD.
I don't understand what point you are making about the 6700 XT. I think it was cheaper than or the same price as the 3060 Ti (which is the equivalent card). There are pros and cons to buying either card.

The last time NVIDIA released a card that was really worth buying, it was the 1080Ti (literally the GOAT) and that was almost a decade ago now.

Nvidia's in a position where they don't really have to care about gaming performance at all. I think it was around the 30 series when their rasterization wasn't any better than the previous gen because they were prioritizing ray tracing which led to a small controversy. Then with the 50 series, they dropped PhysX which broke games while barely offering much of a performance uplift on the lower end. Again, Nvidia doesn't care too much about this market now because they're more concerned about being the backbone of the entire AI industry.
The PhysX they broke was 32-bit PhysX, so a lot of older titles suffered. There are now mods/workarounds that fix this.

The 1080 Series cards were NVIDIA's biggest mistake because they were so good that it took them two generations to beat.
 

Interesting, I never realised that there were other CPUs/GPUs out there, but it seems like they have a long way to go.
They're used a bit by governments that don't really want other nations spying on their super confidential stuff. You can't really 100% trust a system unless you built it yourself.

Weird question: my server won't boot if I have a TV or monitor plugged into it. Is there something where an older Xeon on an X99 chipset gets pissy if it tries to use a modern Intel A380 in the boot process?
 
Literal definition of being Canadian: fast early market success, followed by falling behind due to a smug imperialist worldview, then failing because you were hopelessly behind and having to sell out to China.
Ah yes... the classic AdLib-gets-cloned-and-then-leapfrogged-by-Creative-Labs story.
 
Nvidia's in a position where they don't really have to care about gaming performance at all.
And yet, they keep delivering the best gaming performance.

I think it was around the 30 series when their rasterization wasn't any better than the previous gen because they were prioritizing ray tracing which led to a small controversy.
It was the 20 series. NVIDIA bet the future of performance was in more realistic lighting and inferencing acceleration, not another 10% more raster pipes...and they were right. Just like back in 2001, when they bet that the future of gaming was programmable pipelines, not just a bigger, fatter T&L unit.

Then with the 50 series, they dropped PhysX which broke games
No, they didn't "drop PhysX," and it didn't "break games." They dropped the 20-year-old 32-bit version, which was only used by a handful of dead 360-era games with virtually no active players, like Mirror's Edge. And it didn't break them, you just had to turn off the PhysX option (which never worked on AMD anyway).

The only reason anybody knew PhysX didn't work in Mirror's Edge was that Vex was shrieking and crying about it. People love to get mad about games they don't play.


while barely offering much of a performance uplift on the lower end
For the nonexistent gamer who can't afford a midrange card, but buys a brand-new low-end card every year, it was a disappointment. For normal people who buy a card every 3-5 years, it was and is better than anything AMD has out in the same tier.

I don't understand what point you are making about the 6700 XT. I think it was cheaper than or the same price as the 3060 Ti (which is the equivalent card). There are pros and cons to buying either card.
The "equivalent" to the 6700 XT was the 3070 Ti, and the point I'm making it was a piece of shit. The RDNA2 cards all had 20%-30% more fill rate than Ampere...but then AMD cut corners on the memory interface, so they didn't actually have enough bandwidth to use it. They slapped a few more low-cost memory chips on the board, but again, not enough bandwidth for it to be useful. I continually found that any game that actually used 12 GB of VRAM would just shit its pants until I cut back the texture resolution to keep it in the 8 GB envelope. So what was even the point? Their raytracing performance was so bad that AMD even saying they could raytrace was regarded as a borderline lie (literally useless, I couldn't even run Diablo IV on low RT settings without random 10 fps dips), and of course, FSR just plain looked like shit, while NVIDIA had just launched a new version of DLSS that looked borderline indistinguishable from native res. All-around just a garbage card; wish I'd known the people recommending AMD were just fanboys hyping up dog shit.
 
It was the 20 series. NVIDIA bet the future of performance was in more realistic lighting and inferencing acceleration, not another 10% more raster pipes...and they were right.
It was because Nvidia was so far ahead that they could blow so much money on experimental tech and pay devs to put it in their games. It's not that Nvidia was right, they were the ones steering the ship.
 
The "equivalent" to the 6700 XT was the 3070 Ti, and the point I'm making it was a piece of shit. The RDNA2 cards all had 20%-30% more fill rate than Ampere...but then AMD cut corners on the memory interface, so they didn't actually have enough bandwidth to use it. They slapped a few more low-cost memory chips on the board, but again, not enough bandwidth for it to be useful. I continually found that any game that actually used 12 GB of VRAM would just shit its pants until I cut back the texture resolution to keep it in the 8 GB envelope. So what was even the point? Their raytracing performance was so bad that AMD even saying they could raytrace was regarded as a borderline lie (literally useless, I couldn't even run Diablo IV on low RT settings without random 10 fps dips), and of course, FSR just plain looked like shit, while NVIDIA had just launched a new version of DLSS that looked borderline indistinguishable from native res.
You compare on cost. A 6700 XT was half the price of a 3070 Ti.

Nobody who bought an AMD card cared about Ray Tracing. Everyone knew that their cards sucked for Ray Tracing performance.
All-around just a garbage card; wish I'd known the people recommending AMD were just fanboys hyping up dog shit.
TBH you sound as much of a fanboy as the people you are complaining about.
 
Unless you're a young guy who still thinks buying electronics is a team sport. Then I get it, okay, it's brand loyalty.
1) Control Flow Integrity doesn't work with Nvidia's proprietary modules. Or you want libc++ instead of libstdc++ (rough sketch after this list).
2) AMD is stuck being Intel but worse. They spent the last decade catching up to Intel... only to get blown the fuck out by Nvidia, and if they catch up to Nvidia in a decade, Intel will probably eat their lunch.
3) AMD also has shit strategic foresight; remember AMD Fusion, and how they sold off the low-power ARM division to Qualcomm right before the iPhone was a thing?
4) I like my dual-use PC as a space heater when it's -20°C outside.
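
On point 1, for anyone unfamiliar: kernel CFI is a build-time option that the proprietary nvidia.ko tends to clash with, and the libc++ preference is a one-flag switch in Clang. A rough sketch of how you'd check/use both; the config path is the usual Debian/Ubuntu location and example.cpp is just a placeholder, not anyone's actual setup:
Code:
# Is the running kernel built with Clang CFI? (the option proprietary modules trip over)
grep CONFIG_CFI_CLANG /boot/config-$(uname -r)

# Userspace: build against libc++ instead of libstdc++
clang++ -stdlib=libc++ example.cpp -o example

# Clang's userspace CFI additionally needs LTO and hidden visibility
clang++ -fsanitize=cfi -flto -fvisibility=hidden -stdlib=libc++ example.cpp -o example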
 
I have a soft spot for ATI and avoid feeding the green machine whenever possible, simple as. I won't deny that I'm fucking livid about the FSR shit AMD has pulled, and that autists can jury-rig support for older cards yet AMD won't do it, or are Biding Their Time... Madness, and it pisses me off to no end.

I think that unless something changes radically in the near future I will unfortunately have to bend my knee to Jensen-sama and pay the Emerald Coal Toll. Sad.... But that's a ways away, because lol at there being any good fucking reason to upgrade GPUs right now.
Meanwhile, a 9070 XT lists on Newegg for around $740, while a 6700 XT, a card that is objectively inferior in every conceivable way, lists for $790. What the fuck is happening in the GPU market?
So you're saying I should be selling the 6750XT that is gathering dust in a box right now? I was holding onto it to build a simple computer for my kids when they're slightly older but....
Insanity. I bought my card back before things went crazy, but I will say that my 6750xt was a damned good card that handled everything I threw at it pretty well.
I really liked it and it did exactly everything I asked of it and no more. Plus I think it only cost me something like ~$350 CAD on sale direct from AMD.
 
So you're saying I should be selling the 6750XT that is gathering dust in a box right now? I was holding onto it to build a simple computer for my kids when they're slightly older but....

Probably. My 6800 got me through the bad times and was bought at a price cut. I can't even get a used, beaten-to-death one for what I paid new just to double my VRAM, so I moved on to R9700s. I think the 6750 XT had a pretty desirable hashrate/performance/cost/power curve, IIRC.
 
To interrupt the benchmark slapfight with some benchmarks.
I finally (well, almost) have all my "PC"-shaped boxes running on 10G Ethernet. I have an ITX system and tried out the new Realtek 8159 USB NIC. My gaming PC I "upgraded" from an AQC card that dropped every hour to a Realtek 8127. Here are the benchmarks.
iperf3 -c HOSTNAME -t 20, plus the options listed: -R to reverse the flow, -P2 to try 2 streams. All tests run from the named device to a server with 2x 10G Broadcom NICs in LACP.
Code:
[ ID] Interval           Transfer     Bitrate         Retr
USB RTL 8159 on a USB 3.2 x2 connection.
[  5]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0            sender
-R
[  5]   0.00-20.00  sec  3.79 GBytes  1.63 Gbits/sec  13175            sender
-P2
[SUM]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec   34             sender
-P2 -R
[SUM]   0.00-20.00  sec  3.80 GBytes  1.63 Gbits/sec  29699             sender

PCIe x4 Intel X550
[  5]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0             sender
-R
[  5]   0.00-20.00  sec  21.9 GBytes  9.41 Gbits/sec   85             sender
-P2
[SUM]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0             sender
-P2 -R
[SUM]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec   99             sender

PCIe x1 6.12.73+deb13-amd64 Aquantia Corp. AQtion AQC113 NBase-T/IEEE 802.3an Ethernet Controller [Antigua 10G]
[  5]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0            sender
-R
[  5]   0.00-20.00  sec  21.9 GBytes  9.41 Gbits/sec  343            sender
-P2
[SUM]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0             sender
-P2 -R
[SUM]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0             sender

PCIe x1 6.16.12+deb13-amd64 Realtek Semiconductor Co., Ltd. Device 8127
[  5]   0.00-20.00  sec  19.4 GBytes  8.32 Gbits/sec    0            sender
-R
[  5]   0.00-20.00  sec  21.9 GBytes  9.42 Gbits/sec    0            sender
-P2
[SUM]   0.00-20.00  sec  19.4 GBytes  8.32 Gbits/sec    0             sender
-P2 -R
[SUM]   0.00-20.00  sec  21.9 GBytes  9.41 Gbits/sec    0             sender
So, with the current Realtek-provided (out-of-tree) USB kernel module, it sucks. And it actually crashes. That system is ITX and has a spare M.2 port on the bottom of the board, helpfully located under a cover plate, which means I have to pull it all the way apart. But I'm going to finally do that, put in an M.2-to-PCIe extension, and try my other AQC113 card that was dropping every hour. It seemed like power saving, but all the settings seemed correct, so I put the 8127 in that system, which as you can see is a bit slower than the AQC, but works well enough.
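
For anyone else chasing a 10G card that keeps dropping, these are the power-saving knobs I'd check first. A rough sketch only; it assumes the NIC shows up as enp1s0 at PCIe address 01:00.0, so substitute your own values from ip link and lspci:
Code:
# See whether ASPM is active on the NIC's PCIe link
sudo lspci -vv -s 01:00.0 | grep -i aspm

# Check and disable Energy-Efficient Ethernet, which some NIC/switch pairs negotiate badly
sudo ethtool --show-eee enp1s0
sudo ethtool --set-eee enp1s0 eee off

# Keep the device out of runtime power management
echo on | sudo tee /sys/bus/pci/devices/0000:01:00.0/power/control

# Nuclear option: boot with pcie_aspm=off on the kernel command line to rule ASPM out entirely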
 
It was because Nvidia was so far ahead that they could blow so much money on experimental tech and pay devs to put it in their games. It's not that Nvidia was right, they were the ones steering the ship.
AMD has been the lead platform for nearly every big budget video game made for 20 years, and this position has only improved since Sony dumped NVIDIA a decade ago. The idea that they can't lead because devs wouldn't support them is completely inverted from reality. They could be in the driver's seat and choose not to be.
 
AMD has been the lead platform for nearly every big budget video game made for 20 years, and this position has only improved since Sony dumped NVIDIA a decade ago. The idea that they can't lead because devs wouldn't support them is completely inverted from reality. They could be in the driver's seat and choose not to be.
I'm not sure owning the console market is going to be the win AMD thinks it is.
 
I'm not sure owning the console market is going to be the win AMD thinks it is.
Keeps them in the black. About half the consumer GPUs shipped every year go in game consoles. They'd probably be out of the market entirely if it weren't for Sony.

Imagine if NVIDIA had won those deals, and PS5 had an Ampere chip in it. Aside from having to watch Jensen Huang suck his own dick even more, we'd see a lot more games built ground up for raytracing.
 