GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I just can't get my new AMD card to function properly, apparently that's a common problem.

If I have anything on my second monitor, the frame rate on my main screen drops to 15 or less. It feels like playing Skyrim on a PS3.

I have tried everything: using Edge, turning hardware acceleration off, changing the refresh rate to 120 or 60 Hz, turning off FreeSync, and so on.

I just don't know what to do anymore. This shit just fucking hates more than one monitor.

I really don't want to get super mad right now because I still believe I'm a fucking retard and I'm doing something wrong. But seriously, I'm very close to understanding why AMD just doesn't fucking sell.
On one hand, Nvidia are greedy cunts ready to abandon their original market for AI; on the other hand there's AMD, whose GPUs are archaic schizo-tech that may or may not work on any given day. Oh, and Intel is still there with their Arc GPUs.
 
Plugged my second monitor on my motherboard. The problem still persists.

I'm considering returning this piece of shit. I really don't know anymore what could be causing this. It shouldn't be a massive problem like that.
 
Plugged my second monitor on my motherboard. The problem still persists.

I'm considering returning this piece of shit. I really don't know anymore what could be causing this. It shouldn't be a massive problem like that.
You can use a normal GPU and the iGPU at the same time?
 
Plugged my second monitor on my motherboard. The problem still persists.

I'm considering returning this piece of shit. I really don't know anymore what could be causing this. It shouldn't be a massive problem like that.
This is probably an obvious question, but did you try swapping out the monitor cables?
 
I just can't get my new AMD card to function properly, apparently that's a common problem.
Have you fully uninstalled the old drivers with DDU before installing the (new) AMD stuff?
They have a tutorial on how to do that properly.
In your case, I would recommend uninstalling all GPU-related drivers from both AMD and Nvidia and then doing a clean re-install.
Also, check that Windows is up to date and not several updates behind. That can also cause weird issues when installing new hardware.
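
If you want to sanity-check the cleanup afterwards, here's a minimal sketch (my own, not from any AMD or DDU guide) that asks Windows which display adapters and driver versions it currently sees. It assumes Windows with PowerShell available and Python installed; run it before DDU and again after the reinstall and compare.

```python
# Minimal sketch: list display adapters and driver versions via WMI.
# Assumes Windows with PowerShell on PATH; compare the output before DDU
# and after the clean reinstall to confirm the old drivers are really gone.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object Name, DriverVersion, DriverDate | Format-List"],
    capture_output=True, text=True,
)
print(result.stdout)
```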
 
I got an RX6600 cause they were available and seemed good enough.
Did I get scammed?
 
Have you fully uninstalled the old drivers with DDU before installing the (new) AMD stuff?
They have a tutorial on how to do that properly.
In your case, I would recommend uninstalling all GPU-related drivers from both AMD and Nvidia and then doing a clean re-install.
Also, check that Windows is up to date and not several updates behind. That can also cause weird issues when installing new hardware.

AMD's driver management software is so broken that I had to purge my machine of all traces of AMD software to get the new Adrenalin update. I've successfully driven 2 monitors with both my 6700 XT and 5700 XT, so I know it can be done.

Currently, I use my iGPU for the second monitor.
 
Dammit AMD, as much as I hate Intel...
"And then will I profess unto them, I never knew you: depart from me, ye that groom.".
 

Attachments: Screenshot 2023-06-27 171710.png
With how Navi 33 has disappointed, I'm left wondering: Why didn't AMD decide to work on a die shrink of Navi 22 to N5, instead of splitting resources on the GCD/MCD and a monolithic die?
 
With how Navi 33 has disappointed, I'm left wondering: Why didn't AMD decide to work on a die shrink of Navi 22 to N5, instead of splitting resources on the GCD/MCD and a monolithic die?

You can't just shrink old designs to new processes any more. Certain features on the die are already as small as they can get, or at least can't be shrunk proportionately with other features. Consequently, every new process node now requires a whole new design, and EDA companies that provide tools for new designs are now raking in money.
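
A rough way to see the "can't shrink proportionately" point, using made-up illustrative numbers rather than real foundry data: if logic density improves a lot on a new node but SRAM and analog/IO barely move, the die as a whole shrinks far less than the headline figure suggests.

```python
# Illustrative only: the area split and per-block scaling factors below are
# assumed numbers, not real foundry data. Shows why a die doesn't shrink by
# the headline logic-density gain when SRAM/analog stop scaling.
die_mix = {"logic": 0.60, "sram": 0.30, "analog_io": 0.10}   # fraction of old die area (assumed)
shrink  = {"logic": 1.80, "sram": 1.25, "analog_io": 1.00}   # density gain per block (assumed)

new_area = sum(frac / shrink[block] for block, frac in die_mix.items())
print(f"New die area: {new_area:.0%} of the old one "
      f"(a pure 1.8x logic shrink would predict {1/1.8:.0%})")
```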

I'm gonna make an outlandish prediction here - discrete GPUs are a dying breed, and we're going to start seeing iGPUs, which already dominate 3D gaming on mobile, start to penetrate desktop/laptop gaming in a big way very, very soon.
 
I'm gonna make an outlandish prediction here - discrete GPUs are a dying breed, and we're going to start seeing iGPUs, which already dominate 3D gaming on mobile, start to penetrate desktop/laptop gaming in a big way very, very soon.
I doubt it, unless on-die HBM suddenly becomes quite a lot cheaper. Phones get away with low memory bandwidth through very tight memory controller integration, but we're unlikely to move away from RAM DIMMs in consumer PCs any time soon. All gamers care about these days is ridiculous frame rates, and that takes high memory speed, which means either HBM or GDDR. Those of us who don't really care about shaving off 5 ms of latency by rendering at 280fps or whatever instead of "just" 120 already game just fine on iGPUs and low-end/obsolete dGPUs, but we're not the ones driving the GPU market in the first place, as evidenced by how most of the people surveyed by Steam are still using GTX10-series or older.
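
For a rough sense of that bandwidth gap, here's a back-of-the-envelope comparison using assumed but typical specs (dual-channel DDR5-5600 feeding the iGPU, a 256-bit GDDR6 card at 16 Gbps for the dGPU), not measurements of any particular system.

```python
# Back-of-the-envelope memory bandwidth comparison. The specs below are
# assumed typical values for illustration, not measurements.
def dimm_bandwidth_gbs(channels, bus_bits, mt_per_s):
    """Peak bandwidth in GB/s for DDR-style system memory."""
    return channels * bus_bits / 8 * mt_per_s / 1000

def gddr_bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak bandwidth in GB/s for GDDR on a given bus width."""
    return bus_bits / 8 * gbps_per_pin

igpu = dimm_bandwidth_gbs(channels=2, bus_bits=64, mt_per_s=5600)  # dual-channel DDR5-5600
dgpu = gddr_bandwidth_gbs(bus_bits=256, gbps_per_pin=16)           # 256-bit GDDR6 @ 16 Gbps

print(f"iGPU (shared DDR5):     ~{igpu:.0f} GB/s")
print(f"mid-range dGPU (GDDR6): ~{dgpu:.0f} GB/s, about {dgpu/igpu:.0f}x more")
```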
 
Those of us who don't really care about shaving off 5 ms of latency by rendering at 280fps or whatever instead of "just" 120 already game just fine on iGPUs and low-end/obsolete dGPUs, but we're not the ones driving the GPU market in the first place, as evidenced by how most of the people surveyed by Steam are still using GTX10-series or older.

I'm drawing the opposite conclusion - the dGPU market is declining as normal people lose interest, proving that "gamers" don't really drive that many sales. Gaming is bigger than ever, but specialized gaming hardware is at its lowest point in 20 years.

[attached chart: 1688265924437.png]

And note at the tail there - the decline of desktop dGPUs at the end of 2022 is matched by a rise in desktop iGPUs.

[attached chart: 1688266040783.png]

I think these numbers presage a collapse. 3D GPUs increasingly look like an overshot market, with RTX having about as much market impact as EAX did with sound cards and XGA did with 2D graphics cards, both of which died out in short order.
  1. Integrated graphics are starting to look "good enough," meaning the average person increasingly feels that the games he wants to play look good enough on iGPUs to not be interested in a $250-$400 add-on.
  2. New graphics just aren't driving market enthusiasm. Raytracing was a wet fart in the marketplace and isn't selling hardware or software.
  3. XeSS and FSR have introduced a qualitative shift in the visual fidelity iGPUs can generate. With FSR on, an RDNA2 iGPU can already run newer games at 60 fps and 1080p with reasonably appealing fidelity (rough numbers on what FSR saves are sketched after this list). Intel UHD is still pretty dodgy, but they seem to be learning fast.
  4. dGPUs are fleeing up-market, which is often what companies do right before product lines die. An iGPU from Intel or AMD adds a $25-$30 uplift over the base model CPU. dGPUs can't touch that kind of value, so they're just getting more and more expensive to try and differentiate.
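
To put some numbers on point 3: FSR's presets render internally at a lower resolution and upscale, so the GPU only shades a fraction of the output pixels. The per-axis scale factors below are the commonly quoted FSR 2 presets; the rest is just arithmetic for a 1080p target.

```python
# FSR 2 preset scale factors (per axis), as commonly quoted for the presets.
# Prints the internal render resolution and pixel count for a 1920x1080 target.
presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}
target_w, target_h = 1920, 1080

for name, scale in presets.items():
    rw, rh = int(target_w / scale), int(target_h / scale)
    pixel_ratio = (rw * rh) / (target_w * target_h)
    print(f"{name:>17}: renders {rw}x{rh} (~{pixel_ratio:.0%} of native pixels)")
```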
 
You can't just shrink old designs to new processes any more. Certain features on the die are already as small as they can get, or at least can't be shrunk proportionately with other features. Consequently, every new process node now requires a whole new design, and EDA companies that provide tools for new designs are now raking in money.

I'm gonna make an outlandish prediction here - discrete GPUs are a dying breed, and we're going to start seeing iGPUs, which already dominate 3D gaming on mobile, start to penetrate desktop/laptop gaming in a big way very, very soon.
Strix Halo could be the beginning of the end of dGPUs, bringing console-like mega APUs to laptops and BGA motherboards. Gaming laptops in particular could become cheaper and more power efficient.

Anything that is limited to 1080p can use Phoenix, Strix Point, or Meteor Lake iGPUs.
 
Yeah iGPU graphics have reached a "good enough" point, especially as most of the new games either are already supported by the higher end iGPUs or are diverse and stunning and brave and nobody wants to play them except for the fanatics.
 
I purchased an RX 6950 XT recently to replace my aging GTX 1070. I experienced seemingly random crashes despite multiple OS reinstalls. Long story short: the new PSU I bought alongside the card (an 850W Corsair) apparently wasn't enough, and upgrading to a decent 1000W PSU did the trick, no more crashing.

I was so fucking ready to RMA the card, good thing the situation turned out the way it did. I'm really happy I went with AMD this time around, but if Nvidia ever decides to make decently priced cards again I might hop back, who knows. AMD is good enough at this point that you won't notice a difference except in very specific circumstances.

That being said, the green boys have to work on their software suite; there's a lot of cool stuff in Adrenalin that I've come to like.
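
For anyone wondering how an 850W unit can still fall over with a card in this class, here's a back-of-the-envelope budget. Every figure below is an assumption for illustration (rated board power, transient multiplier, CPU and system draw), not a measurement of this particular build.

```python
# Rough power budget for a high-end RDNA2 build. All numbers are assumed,
# illustrative values; transient spikes on big GPUs can briefly exceed the
# rated board power by a wide margin and trip a PSU's protection circuits.
gpu_board_power = 335   # W, typical rated TBP for a card in this class (assumed)
gpu_transient   = 2.0   # multiplier for millisecond-scale spikes (assumed)
cpu_power       = 200   # W under gaming load (assumed)
rest_of_system  = 75    # W for drives, fans, RAM, motherboard (assumed)

sustained        = gpu_board_power + cpu_power + rest_of_system
worst_case_spike = gpu_board_power * gpu_transient + cpu_power + rest_of_system

print(f"Sustained draw:  ~{sustained} W")
print(f"Transient spike: ~{worst_case_spike:.0f} W  <- what can trip an 850 W unit")
```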
 
Strix Halo could be the beginning of the end of dGPUs, bringing console-like mega APUs to laptops and BGA motherboards. Gaming laptops in particular could become cheaper and more power efficient.

Anything that is limited to 1080p can use Phoenix, Strix Point, or Meteor Lake iGPUs.

One of my computers is a Ryzen laptop with a 3050 Ti Mobile dGPU...but I find myself switching to the iGPU most of the time because it just doesn't run as hot, and the graphics look fine. If the game supports FSR, 60 fps at 1080p is no problem. My desktop's UHD isn't quite as good, though I've only done some light testing with it (the machine has a Radeon 6700 XT anyway). I have no evidence for this, but it feels like AMD decided to put real effort into making its "Radeon graphics" (they seem to have dropped the Vega name) a low-end gaming GPU, while Intel treated UHD as a "there if you need it, I guess." They've been moving really fast lately, though, with the newer versions being based on Arc, and Meteor Lake looking to be a major step forward.

Anyway, here are a bunch of games on a very recent iGPU. Really demanding games are able to run at 30 fps, whereas older games that are still very popular, like CS:GO and Fortnite, just fly.

 
I purchased an RX 6950 XT recently to replace my aging GTX 1070. I experienced seemingly random crashes despite multiple OS reinstalls. Long story short: the new PSU I bought alongside the card (an 850W Corsair) apparently wasn't enough, and upgrading to a decent 1000W PSU did the trick, no more crashing.

I was so fucking ready to RMA the card, good thing the situation turned out the way it did. I'm really happy I went with AMD this time around, but if Nvidia ever decides to make decently priced cards again I might hop back, who knows. AMD is good enough at this point that you won't notice a difference except in very specific circumstances.

That being said, the green boys have to work on their software suite; there's a lot of cool stuff in Adrenalin that I've come to like.
This is why I went with a 6900 XT even though the 6950 XT was basically the same price. My Seasonic 750W is able to power it no problem.
 