GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I don't think it would hurt me much if I upgraded to a RX 6600 or a baseline 4060. I mean, my motherboard is running PCIe 3.0, but it's still going strong, and I'll probably use it until it falls apart.
An infamous example of PCIe 4.0 mattering is the 6500 XT, since it only has 4 lanes: 20-35% losses are possible, sometimes taking framerates from barely playable to not, and lowering the 1% lows (stuttering).

If you keep upgrading the GPU, eventually you'd prefer something mid-range with 16 lanes so it can run at PCIe 3.0 x16 instead of x4/x8.
 
An infamous example of PCIe 4.0 mattering is the 6500 XT, since it only has 4 lanes: 20-35% losses are possible, sometimes taking framerates from barely playable to not, and lowering the 1% lows (stuttering).

If you keep upgrading the GPU, eventually you'd prefer something mid-range with 16 lanes so it can run at PCIe 3.0 x16 instead of x4/x8.
Yeah, I remember that one... that's more AMD being cheap arses with the 6500 XT, using 4 lanes like an NVMe drive, than something that lionises PCIe 4.0/5.0. Had they used 8 lanes, which is already cut-down cheap videocard shit, this never would have been nearly such an issue even on ancient PCIe 3.0.
But PCIe 4.0 has been around for 4 years now and we have PCIe 5.0; life moves on.
 
I've been watching a lot of PCIe 3 vs PCIe 4 comparisons, and honestly, if I wasn't told which was which, I doubt I could actually tell the difference between the two. I only say this because I built my (initial) $550 machine back in 2020 with a Ryzen 3 3200G + RX 570. I've since upgraded to a newer gen Ryzen 5 5600 + RX 580. I've been kind of paranoid about upgrading my card to something newer (since driver support for the 580 is shutting down) because of my previous-gen platform. From what I've noticed, there's maybe a 5-10 frame drop between generations, and it looks like there are diminishing returns in a lot of modern titles. I don't think it would hurt me much if I upgraded to a RX 6600 or a baseline 4060. I mean, my motherboard is running PCIe 3.0, but it's still going strong, and I'll probably use it until it falls apart.
If it makes you feel any better - even the 4090 only sees like a 10% performance degradation if you run it on PCIe 3 instead of PCIe 4. And we don't even have PCIe 5 graphics cards yet.
 
If it makes you feel any better - even the 4090 only sees like a 10% performance degradation if you run it on PCIe 3 instead of PCIe 4. And we don't even have PCIe 5 graphics cards yet.
The 4090 isn't a great example. Graphics cards use the PCIe slot primarily for two things: receiving instructions and loading textures/AI models. The 4090 has enough VRAM that you can essentially load all the data once and then let the card keep it in its own memory, so other than right at the start of a task it won't use the PCIe slot heavily, because instructions are tiny.

A better example would be a card with much less VRAM, which needs to swap data back and forth between video memory and system memory, such as the RTX 4060 8GB. Dropping down a PCIe generation halves the available bandwidth, so swapping data takes significantly longer, which can manifest as stuttering. That's exacerbated by the card only being x8 wide to begin with, meaning you're actually only getting the equivalent of PCIe 4.0 x4 rather than the PCIe 4.0 x8 a 4090 would have.

An even better example is a card with only an x4-wide connector, such as the RX 6500 XT. Dropping that to PCIe 3.0 means you're only getting the bandwidth of PCIe 4.0 x2, which is a huge performance impact, and a massive problem for a budget card that's more likely to be installed in an outdated motherboard than a higher-end card would be.
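To put rough numbers on that lane/generation reasoning, here's a back-of-the-envelope sketch (the per-lane rates are the published PCIe figures after 128b/130b encoding; the card/lane pairings just mirror the examples above):

```python
# Rough effective PCIe bandwidth: per-lane rate x lane count.
# Gen 3 signals at 8 GT/s, gen 4 at 16 GT/s, gen 5 at 32 GT/s; after
# 128b/130b encoding that is roughly 0.985 / 1.969 / 3.938 GB/s per lane.
PER_LANE_GBS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link, in GB/s."""
    return PER_LANE_GBS[gen] * lanes

for card, lanes in [("RTX 4090 (x16)", 16), ("RTX 4060 (x8)", 8), ("RX 6500 XT (x4)", 4)]:
    print(f"{card}: gen 4 = {link_bandwidth(4, lanes):.1f} GB/s, "
          f"gen 3 = {link_bandwidth(3, lanes):.1f} GB/s")
```

The x8 card on gen 3 lands at ~7.9 GB/s, exactly a gen 4 x4 link, and the x4 card drops to ~3.9 GB/s, i.e. gen 4 x2, matching the equivalences above.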
 
I've been watching a lot of PCIe 3 vs PCIe 4 comparisons, and honestly, if I wasn't told which was which, I doubt I could actually tell the difference between the two.

In-game, the PCIe bus doesn't get hit very hard. Very little data is sent to the GPU on a frame-by-frame basis other than instructions. You might have heavy data streaming in the background for open-world games and the like, but even Gen 5 PCIe is slow relative to frame times at 60+ fps, so you need to start loading that data long, long before the player sees it.

To put it in perspective, a 1024x1024 texture at 24 bits per pixel is about 24 Mbit (roughly 3 MB). PCIe 4.0 is 16 Gbps per lane, so if you've got 4 lanes, at 60 fps that translates into being able to load a grand total of about 45 such textures per frame over the bus. That's basically nothing. Double it, and it's still basically nothing.
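Worked through explicitly (a sketch that assumes uncompressed textures and ignores protocol overhead; it lands around 42 per frame, the same ballpark as the ~45 above):

```python
lanes = 4                      # PCIe 4.0 x4, as in the example
gbit_per_lane = 16             # raw signalling rate, Gbit/s per lane
fps = 60

texture_bits = 1024 * 1024 * 24                  # 1K x 1K texture at 24 bpp
frame_budget_bits = lanes * gbit_per_lane * 1e9 / fps

print(f"texture size: {texture_bits / 8e6:.1f} MB")                # ~3.1 MB
print(f"per-frame bus budget: {frame_budget_bits / 8e6:.0f} MB")   # ~133 MB
print(f"textures loadable per frame: {frame_budget_bits / texture_bits:.0f}")  # ~42
```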
 
I mean, this makes me feel a little better about upgrading from the RX 580 going forward. I guess I just needed some confirmation from other folks that it'd be okay. This was honestly the first gaming rig I put together with my own hands, and I'm always paranoid about tinkering with it. I got my Gigabyte board for a really good price ($74.99), and I didn't realize it was only PCIe 3 until much, much later.
 
I mean, this makes me feel a little better about upgrading from the RX 580
Yea, you really should upgrade from that, mainly because the RX 580 is no longer getting driver updates, but also because it doesn't support mesh shaders.
 
A simple question not worth its own thread:

I've found a case that will do and is cheap enough but it comes with ARGB fans. I fucking hate RGB lighting. Is it possible to disconnect the RGB part and just not have them illuminated? Like does the LED have a separate cable or one I can easily find and cut? I don't care if the operation is destructive - RGB lighting is never something I will want.

(I know I could just replace the fans but I don't want to add more expense to this build)
 
A simple question not worth its own thread:

I've found a case that will do and is cheap enough but it comes with ARGB fans. I fucking hate RGB lighting. Is it possible to disconnect the RGB part and just not have them illuminated? Like does the LED have a separate cable or one I can easily find and cut? I don't care if the operation is destructive - RGB lighting is never something I will want.

(I know I could just replace the fans but I don't want to add more expense to this build)
Yea. ARGB has a connector that looks like this.

[attached image: ARGB connector]

If you plug the normal fan headers in, but leave that connector unconnected to the mobo or fan hub, then you should not have any RGB lighting from the fans.
 
Yea. ARGB has a connector that looks like this.

[attached image: ARGB connector]

If you plug the normal fan headers in, but leave that connector unconnected to the mobo or fan hub, then you should not have any RGB lighting from the fans.
Thanks. New case choices unlocked!

I don't know why anybody likes RGB lighting on their components. It's entirely alien to me. I also don't like how, for a while, all electronics had some actinic LED added for no good reason. My AVR has one, and all it does is fucking shine in your eyes when you're trying to watch a movie. I have a blob of blu-tac over it.

I LIKE THE DARKNESS!
 
I don't know why anybody likes RGB lighting on their components.
A significant factor is that the fan people are looking for things to add to the product as an upsell.

Their theory, to my understanding, is that they're already selling about as many fans as they possibly can, so they have to up the price to increase revenue. In their minds the value of the product sets the maximum price, so if adding RGB even plausibly appears to add value, the customer will accept the price going up by a few bucks. Then revenue goes up (even if profits don't), company value goes up (commonly accepted as annual revenue x 10 years), stock value more or less follows, mission accomplished.
 
A significant factor is that the fan people are looking for things to add to the product as an upsell to up revenue, since they are already selling about as many fans as they possibly can

Their theory, to my understanding, is that the base value of the product sets the maximum acceptable price. If adding RGB even plausibly looks like it adds value, then the customer would accept the price going up by a few bucks, at which point revenue goes up (even if profits don't), then company value goes up (commonly accepted as annual revenue x 10 years), stock value follows, mission accomplished.
Oh sure, it all makes sense from the company's point of view. They're trying to convince me that I want something. It's just such an obnoxious thing, why would I?
 
Thanks. New case choices unlocked!

I don't know why anybody likes RGB lighting on their components. It's entirely alien to me. I also don't like how, for a while, all electronics had some actinic LED added for no good reason. My AVR has one, and all it does is fucking shine in your eyes when you're trying to watch a movie. I have a blob of blu-tac over it.

I LIKE THE DARKNESS!
With ARGB you can easily colour-match with your other decor. Rather than having to specifically buy a red motherboard, a red cooler, red fans etc, you can just set the ARGB to red and get the same effect. If you decide to switch your desk decor to blue, colour matching your computer is as easy as opening OpenRGB, finding a nice blue hue, and hitting save. If you care about how your computer looks, it's very flexible. You can even make it "useful", such as having it turn on a gently pulsing light if you've got unread email or whatever.

Comparing it to the ubiquitous bright blue LEDs of yesterdecade isn't accurate; it's usually rather easy to turn ARGB off, and it specifically is never going to clash with anything else you're doing, since it's adjustable. I still wouldn't pay extra for ARGB (unless the alternative is a fixed-colour light), but it does have good aspects. Such as an off-button.
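If anyone wants to try the recolouring trick, here's roughly what it looks like with the third-party openrgb-python package (my assumption for the client; OpenRGB's SDK server has to be running for any of this to work):

```python
# Minimal sketch using openrgb-python (pip install openrgb-python).
# Assumes OpenRGB is running with its SDK server enabled (default port 6742).
from openrgb import OpenRGBClient
from openrgb.utils import RGBColor

client = OpenRGBClient()                   # connects to localhost:6742
for device in client.devices:              # every controller OpenRGB detected
    device.set_color(RGBColor(0, 0, 255))  # whole build to blue

# Redecorating in red later is the same call with RGBColor(255, 0, 0).
```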
 
With ARGB you can easily colour-match with your other decor. Rather than having to specifically buy a red motherboard, a red cooler, red fans etc, you can just set the ARGB to red and get the same effect.
Why do I get the feeling that you have a blood red colour scheme for your system and type all your posts from a room with Mordor-like lighting?

[attached image: susana.png]

If you decide to switch your desk decor to blue, colour matching your computer is as easy as opening OpenRGB, finding a nice blue hue, and hitting save. If you care about how your computer looks, it's very flexible. You can even make it "useful", such as having it turn on a gently pulsing light if you've got unread email or whatever.
This is both a clever idea and horrifying as yet another way to remind me of the number of things I have to respond to at all times!

Comparing it to the ubiquitous bright blue LEDs of yesterdecade isn't accurate; it's usually rather easy to turn ARGB off, and it specifically is never going to clash with anything else you're doing, since it's adjustable. I still wouldn't pay extra for ARGB (unless the alternative is a fixed-colour light), but it does have good aspects. Such as an off-button.
Okay, well I guess I'll get the case then. I'm not paying extra for the lighting - it's cheaper than the next best case without it. I suppose I can set it to some obnoxious rapid cycle when I want to encourage a guest to leave.
 
Why do I get the feeling that you have a blood red colour scheme for your system and type all your posts from a room with Mordor-like lighting?

[attached image: susana.png]
I don't believe in RGB, but if you must have your computer cave lit at all, it should be by red lights, to enhance health and wellness. Personally I just use a clip-on red LED light plugged into my TV, and adjustable-color LED smart bulbs in desk lamps.
 
Why do I get the feeling that you have a blood red colour scheme for your system and type all your posts from a room with Mordor-like lighting?

[attached image: susana.png]
Sadly no, I've got the boring standard oak sit-stand desk, leather desk mat, black computer, IBM keyboard, 48" OLED TV combo. I do have the ARGB configured to annoyingly flash bright red if the coolant temperature hits 70 degrees, to alert me if there's a cooling failure.
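The alert is just a small polling loop. A sketch of the idea below, again assuming openrgb-python; read_coolant_temp() is a hypothetical stand-in for whatever exposes your sensor (liquidctl, lm-sensors, HWiNFO, ...):

```python
# Hypothetical coolant-temperature alarm via openrgb-python.
import time
from openrgb import OpenRGBClient
from openrgb.utils import RGBColor

ALERT_C = 70
client = OpenRGBClient()  # assumes the OpenRGB SDK server is enabled

def read_coolant_temp() -> float:
    # Hypothetical: wire this to your sensor source (liquidctl, lm-sensors, ...).
    raise NotImplementedError

def set_all(color: RGBColor) -> None:
    for device in client.devices:
        device.set_color(color)

while True:
    if read_coolant_temp() >= ALERT_C:
        # Crude red flash at ~2 Hz until the temperature recovers.
        set_all(RGBColor(255, 0, 0))
        time.sleep(0.25)
        set_all(RGBColor(0, 0, 0))
        time.sleep(0.25)
    else:
        time.sleep(5)
```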

Black and garnet are my go-to colours though.
 