Nvidia's in a position where they don't really have to care about gaming performance at all.
And yet, they keep delivering the best gaming performance.
I think it was around the 30 series that their rasterization wasn't any better than the previous gen because they were prioritizing ray tracing, which led to a small controversy.
It was the 20 series. NVIDIA bet that the future of performance was in more realistic lighting and inference acceleration, not another 10% more raster pipes...and they were right. Just like back in 2001, when they bet that the future of gaming was programmable pipelines, not just a bigger, fatter T&L unit.
Then with the 50 series, they dropped PhysX, which broke games,
No, they didn't "drop PhysX," and it didn't "break games." They dropped the 20-year-old 32-bit version, which was only used by a handful of dead 360-era games with virtually no active players, like Mirror's Edge. And it didn't break them: you just had to turn off the PhysX option (which never worked on AMD anyway).
The only reason anybody knew PhysX didn't work in Mirror's Edge was that Vex was shrieking and crying about it. People love to get mad about games they don't play.
while offering barely any performance uplift on the lower end
For the nonexistent gamer who can't afford a midrange card but buys a brand-new low-end card every year, it was a disappointment. For normal people who buy a card every 3-5 years, it was and is better than anything AMD has out in the same tier.
I don't understand what point you are making about the 6700 XT. I think it was cheaper than or the same price as the 3060 Ti (which is the equivalent card). There are pros and cons to buying either card.
The "equivalent" to the 6700 XT was the 3070 Ti, and the point I'm making it was a piece of shit. The RDNA2 cards all had 20%-30% more fill rate than Ampere...but then AMD cut corners on the memory interface, so they didn't actually have enough bandwidth to use it. They slapped a few more low-cost memory chips on the board, but again, not enough bandwidth for it to be useful. I continually found that any game that actually used 12 GB of VRAM would just shit its pants until I cut back the texture resolution to keep it in the 8 GB envelope. So what was even the point? Their raytracing performance was so bad that AMD even saying they could raytrace was regarded as a borderline lie (literally useless, I couldn't even run Diablo IV on low RT settings without random 10 fps dips), and of course, FSR just plain looked like shit, while NVIDIA had just launched a new version of DLSS that looked borderline indistinguishable from native res. All-around just a garbage card; wish I'd known the people recommending AMD were just fanboys hyping up dog shit.