GPUs & CPUs & Enthusiast hardware: Questions, Discussion, and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I've got the latest video driver, chipset driver, and the power profile is set to max performance. I don't see how any of that can be the cause when the problem is 100% exclusive to OpenGL, though.
Is it even running on the GPU? What's the CPU doing at this time, is it 100%?
Looks like there's something called "OpenGL Extensions Viewer" that should show the status of OpenGL rendering and drivers.
 
Is it even running on the GPU? What's the CPU doing at this time, is it 100%?
Looks like there's something called "OpenGL Extensions Viewer" that should show the status of OpenGL rendering and drivers.
No, the CPU is loping along at the moderate usage you'd expect when running an older game.

I don't know for sure what I should be looking for in OpenGL Extensions Viewer, but it says all OpenGL features through version 4.6 are supported by the GPU and that it's running the 1/2024 AMD driver.
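If you want to rule out the OpenGL context silently falling back to software rendering or the wrong GPU, one quick check (just a sketch I'm adding, assuming Python with the glfw and PyOpenGL packages installed) is to create a context yourself and print the renderer strings. If you see something like "GDI Generic" or llvmpipe instead of your Radeon, the AMD OpenGL driver isn't the one actually answering.

```python
# Minimal probe: create a hidden OpenGL context and report which renderer/driver answers.
# Assumes `pip install glfw PyOpenGL`; purely illustrative.
import glfw
from OpenGL.GL import glGetString, GL_RENDERER, GL_VENDOR, GL_VERSION

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

glfw.window_hint(glfw.VISIBLE, glfw.FALSE)             # no visible window needed, just a context
window = glfw.create_window(64, 64, "gl-probe", None, None)
glfw.make_context_current(window)

print("Renderer:", glGetString(GL_RENDERER).decode())  # e.g. the Radeon model, or a software fallback
print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Version: ", glGetString(GL_VERSION).decode())   # includes the driver's reported GL version

glfw.terminate()
```

On a hybrid iGPU/dGPU system this can also show whether OpenGL apps are defaulting to the integrated GPU rather than the discrete card.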
 
What's the thread's consensus on the best GPU in terms of price-to-performance? I recall the 4070 Ti being bandied about for a while, but I can't remember the details, and I really only know AMD's lineup.

IMO, Intel's Arc GPUs tend to be the best for the dollar since the 2022 driver update, but they don't have anything out on par with a 4070 Ti. Rumor is their 2nd gen card won't be out until 2025.
 
What's the thread's consensus on the best GPU in terms of price-to-performance? I recall the 4070 Ti being bandied about for a while, but I can't remember the details, and I really only know AMD's lineup.


You could really regret not getting a 4070 if something Nvidia/CUDA-specific comes along and you need it (VR, AI, etc.).

I grabbed one of the rock-bottom RX 6800 16GB MSI cards last year, and performance-wise it's good enough; most importantly, it avoids obsolescence from limited VRAM.

Some benchmarks say it performs within 10% of a 4070 depending on the task, which at the time was a big deal. Less of a big deal now, with some 4070 refreshes and discounts supposedly coming. The 6950 XT is rare and was really the one to get for price/performance, but it sells out instantly in my neck of the woods. The 6800 XT is also good but carries a disproportionate premium, so you might as well get the 6950 if possible. The newer 7800 XT is a good idea for longevity but not exactly a giant leap forward.
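To make the price-to-performance comparison concrete, here's the back-of-the-envelope math. The prices and performance indices below are placeholders I made up purely to show the arithmetic; plug in your local listings and whatever benchmark aggregate you trust.

```python
# Toy price/performance comparison. All numbers are made-up placeholders,
# not real benchmark data; swap in current street prices and benchmark indices.
cards = {
    # name: (street price in USD, relative performance index)
    "RX 6800":        (380, 100),
    "RX 6950 XT":     (550, 130),
    "RTX 4070":       (550, 110),
    "RTX 4070 Super": (600, 125),
}

# Rank by performance per dollar, highest first.
for name, (price, perf) in sorted(cards.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True):
    print(f"{name:16s} {perf / price:.3f} perf per dollar")
```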
 
Is this a known problem? I know that AMD's OpenGL support has traditionally been poor, but this seems less like a mere efficiency issue and more like a really obvious, severe bug.
OpenGL performance on AMD has always been complete garbage. Linux users claim that their drivers run fine, but I haven't checked it myself.
 
Checking prices recently and it's crazy that the 4080 Super is now going for close to $1500. I guess people really are willing to buy that card for 50% more than MSRP when there are no 4090s. Makes me a bit glad I snatched up a 4080 for $1100 last year.

Moral of the story: never sit around waiting for better deals. I imagine we're going to get another 10 months of seetheposts in the usual places from dudes who were 'waiting for the right time to buy.'

You could really regret not getting a 4070 if something Nvidia/CUDA-specific comes along and you need it (VR, AI, etc.).
I'd also add - the 4070 Super supports all the whizbang Nvidia features like DLSS 3, which actually makes it a fantastic card if you're trying to squeeze out higher framerates, and it's only like $50-$100 more than the comparable AMD offering.

Like honestly, despite what the AMD fanboys say, the current lineup is not really that much more affordable than Nvidia's in the midrange.
 
I'd also add - the 4070 Super supports all the whizbang Nvidia features like DLSS 3, which actually makes it a fantastic card if you're trying to squeeze out higher framerates, and it's only like $50-$100 more than the comparable AMD offering.

Like honestly, despite what the AMD fanboys say, the current lineup is not really that much more affordable than Nvidia's in the midrange.
It's why I'm seriously looking at it now that I know what I'm getting back from taxes. The 4070 Super hits hard. It's a pretty solid all-rounder.
 
The 4070 Super is finally an Nvidia GPU from this generation that I could consider buying. But Blackwell is probably under a year away, so I'm in that perpetual waiting game again.
 
The 4070 Super is finally an Nvidia GPU from this generation that I could consider buying. But Blackwell is probably under a year away, so I'm in that perpetual waiting game again.
If you need a GPU now, don't wait for Blackwell. At that point you're talking about 10+ months without one just for the possibility that Blackwell is going to be good (possibly not) and cheap (almost certainly not).

At a certain point, you have to start weighing the cost of not having a GPU for 10 months into your value proposition.

Of course, if you do already have a GPU that does everything you want, then definitely wait for Blackwell. Or potentially wait for RDNA4, which is supposed to be targeting 4070 Ti/4080 performance at a much lower price.
 
AMD cultists have always been the least helpful people ever.
>hadn't used an AMD card since the late 2000s
>buy old 6600xt during pandemic because it was the only card available
>drivers and display keep fucking up
>ask for assistance
>"its your powersupply"
>"it's your motherboard"
>"it's your RAM"
>"it's your CPU (unless it's an AMD cpu, of course)"
>"it's literally anything but the GPU"
>call the gpu shit
>"DO NOT BE MEAN TO RED COMPANY"

Say what you will about Nvidia, but at least their cards actually work.
Because they're right. In my experience Nvidia cards are more tolerant of an unstable system, but there's a good chance the card eventually breaks for good because of it (muh "it ran fine until it didn't"). Issues like that can point to a problem elsewhere, and you always start troubleshooting from the beginning (hence the "sfc /scannow" copypasta).
Just as an example: an AMD GPU constantly greyscreening and crashing. Hey, swapping in an Nvidia card is gonna fix it, right? It did, right up until that card died. Meanwhile the AMD GPU runs with zero issues in another PC (still does, funny enough). Turns out it was the board, not the GPU.

And just like with Nvidia, it depends on the vendor (Nvidia has its shit brands too). Buy a good one like Sapphire and you will have fewer issues.

Really, some people should do tech support for a bit; the shit people come up with would blow your mind (like having to explain to someone that their prebuilt's PSU won't be able to handle the GPU upgrade under load). The fanboy nonsense comes way later...

I would expect "full dive" VR before anything else; VR goggles with hand controls will be awkward for basically anything not actually designed around sitting in a chair (Elite Dangerous, racing games), and "full immersion" with a harness and a treadmill platform is super impractical. Frankly, brain implants that let you project a virtual reality seem more feasible, even though they're pure sci-fi, than the laughably clumsy excuse for VR we have now.
not gonna happen for decades, hence the impractical workarounds.

For me, the big problem with VR is there's always some major compromise you have to make depending on which headset you choose. Deckard might be the first complete package, but I'm expecting an astronomical price tag.
There's hope Valve learned from their Knuckles "experiment" that no one really needs/wants it (if anything it's more of a professional/industrial feature, not a consumer one), and apparently it's getting inside-out tracking, so no more paying for Lighthouse boxes (in exchange for worse tracking).

I seriously blame journos and the other clowns who early on were adamant you need a vibrating phallus stick for IMMERSHUN, because it totally makes sense that grabbing anything feels like A FUCKING STICK. Remember when Valve thought teleportation was the future and designed the Vive wands that way?
People automatically use their hands for everything; the focus from the start should've been getting proper gloves or hand tracking working. The added benefit is that any force feedback can be separated out: I wouldn't need an expensive and proprietary gun controller, I'd just duct-tape together a bumpstock/trigger to grab and be done with it.
 
A glove would also break really easily. A better solution would be to just have some sort of informal standard for wands, like what exists with joysticks/HOTAS. You could have wands shaped like sticks for things like sword fighting or bat-and-ball games, or ones shaped like pistols for shooters. Clip on a plastic stock and your pistol wand is now a rifle wand. Etc.
 
Because they're right. In my experience Nvidia cards are more tolerant of an unstable system, but there's a good chance the card eventually breaks for good because of it (muh "it ran fine until it didn't"). Issues like that can point to a problem elsewhere, and you always start troubleshooting from the beginning (hence the "sfc /scannow" copypasta).
Just as an example: an AMD GPU constantly greyscreening and crashing. Hey, swapping in an Nvidia card is gonna fix it, right? It did, right up until that card died. Meanwhile the AMD GPU runs with zero issues in another PC (still does, funny enough). Turns out it was the board, not the GPU.
Nah, people just want to bitch and not do proper troubleshooting.

"Da internetz said AMD drivers suck, therefore any issue that pops up must be that!"

"It just works" mentality has made consumers more retarded. Seeing the same stuff happen in 3d printers.
 
Nah, people just want to bitch and not do proper troubleshooting.
GPU drivers from both Nvidia and AMD continue to be shit, and no amount of complaining about users is going to change the fact that the driver most likely to BSOD your system continues to be the one for a product you're spending several hundred dollars on.

Both companies need to get their shit together.
 
The report nobody asked for: keeping my CPU throttled at 65W for the most part had no effect on my experience. There were two notable exceptions. One was that a few games had visibly longer load times, which is expected because decompressing assets happens on the CPU. But the only game that was affected during gameplay was Warhammer: Darktide. I had suspected that the game imposes a massive CPU burden because it is constantly streaming in and decompressing assets on the fly rather than precaching them in RAM, and this seemed to confirm it: my weapon model would often take a second to load after I switched, and there was a lot of stuttering right before a horde would appear. That it's the only exception, and that a modder managed to fix those kinds of issues in the main area with precaching, suggests it's just an exceptionally bad piece of software (but a very fun game, highly recommend).
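If anyone wants to check whether their own stutter lines up with CPU-side asset decompression, a crude way to watch for it (just a sketch, assuming Python with psutil installed; not something from the game or the mod) is to log per-core utilization while playing and see whether pop-in and stutter coincide with cores pegging:

```python
# Log per-core CPU utilization once per second. Run alongside the game and watch
# whether stutter or slow weapon-model pop-in coincides with cores hitting 100%.
# Assumes `pip install psutil`; illustrative only.
import time
import psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        stamp = time.strftime("%H:%M:%S")
        print(f"{stamp}  max core: {max(per_core):5.1f}%  cores: {per_core}")
except KeyboardInterrupt:
    pass
```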
 
If you need a GPU now, don't wait for Blackwell. At that point you're talking about 10+ months without one just for the possibility that Blackwell is going to be good (possibly not) and cheap (almost certainly not).

At a certain point, you have to start weighing the cost of not having a GPU for 10 months into your value proposition.

Of course, if you do already have a GPU that does everything you want, then definitely wait for Blackwell. Or potentially wait for RDNA4, which is supposed to be targeting 4070 Ti/4080 performance at a much lower price.
After looking at the relatively low cost of the 4070 Super and its performance, I tend to agree. It's doing more than what a 3080 used to do, and more efficiently. Waiting for an entire new architecture when the 40 series is still doing extremely well is honestly a waste.
 
After looking at the relatively low cost of the 4070 Super and its performance, I tend to agree. It's doing more than what a 3080 used to do, and more efficiently. Waiting for an entire new architecture when the 40 series is still doing extremely well is honestly a waste.

Just buy whatever the best thing is for whatever your budget is now. If your idea is to wait until there isn't something better 6-8 months away from release, then you will literally never, ever buy a GPU.
 
Just buy whatever the best thing is for whatever your budget is now. If your idea is to wait until there isn't something better 6-8 months away from release, then you will literally never, ever buy a GPU.
By the end of the week, combined with my tax return, it's either a stock 4070 or a 4070 Super. Either one is overkill for my needs.
 
Intel announcing they're splitting the product and foundry business is interesting.

Nvidia was desperate to ditch TSMC, but going with Samsung went terribly, so they had to come crawling back, which is a big part of the reason the 4000 series is so expensive. Rumors are saying that Intel might be able to get ahead of TSMC in the next few nodes, though, so Nvidia switching to them is a real possibility (but I wouldn't put money on Intel not fucking things up royally).

The flip side is that the AI nonsense has so much money behind it that none of the companies really care about anything else. The latest Nvidia report had AI at six times the revenue of everything else they're doing. Intel has announced an all-E-core server CPU, and AMD has Zen 4c and plans to go further with Zen 5c (which is a bigger deal when you think about it, since the whole basis of Zen's success has been making one identical chiplet and just binning it across the whole product range).

I still think AI is going to prove to be a bubble, and all the money governments have poured into getting foundries built is going to cause an enormous oversupply within the next 5 years, but that still doesn't mean we're going to end up with fast and cheap consumer chips.
 