GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Hey guy, I'm a programmer and I've written driver code. Look up "hooking the system service table" sometime to see what I mean. I reverse engineer shit all the time.

AMD is shit for driver code. I used to run AMD's but I don't anymore because I don't want to deal with their shit.
That doesn't really mean anything as far as the usual "my pc sucks, it's the drivers" complaints go.

Driver issues are very much exaggerated. The vast majority of people do not isolate issues.
 
That doesn't really mean anything as far as the usual "my pc sucks, it's the drivers" complaints go.

Driver issues are very much exaggerated. The vast majority of people do not isolate issues.
Yes I know. I've had to deal with that shit in my career. People are dumb. They can't even run Sysinternals or open a command prompt.

Most people are dumb as shit as far as computers go. The driver situation was very bad security-wise back in the day, before Microsoft forced us to go through the signing program.

Most of the bugchecks came from third-party driver code, not Microsoft's.

That's why they're called "bugchecks": KeBugCheck and KeBugCheckEx are the two kernel calls you make to halt the system.

Instead people call it the blue screen of death, but you can change the colors in system.ini.


Download WinDbg (user-mode and kernel-mode debugger, better than SoftICE ever was) and you can read the bugcheck codes from there.
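If you just want to eyeball what a STOP code means without firing up WinDbg, the mapping is a plain code-to-name table. Tiny Python illustration (the five codes below are real documented Microsoft ones; the helper function itself is just something I made up):

```python
# A few common, documented Windows bugcheck (STOP) codes and their
# symbolic names. The decode_bugcheck helper is purely illustrative,
# not a real Windows API.
COMMON_BUGCHECKS = {
    0x0000001E: "KMODE_EXCEPTION_NOT_HANDLED",
    0x0000003B: "SYSTEM_SERVICE_EXCEPTION",
    0x00000050: "PAGE_FAULT_IN_NONPAGED_AREA",
    0x0000007E: "SYSTEM_THREAD_EXCEPTION_NOT_HANDLED",
    0x000000D1: "DRIVER_IRQL_NOT_LESS_OR_EQUAL",
}

def decode_bugcheck(code: int) -> str:
    """Map a STOP code to its symbolic name, if it's one we know."""
    return COMMON_BUGCHECKS.get(code, f"UNKNOWN_BUGCHECK_0x{code:08X}")

print(decode_bugcheck(0xD1))  # DRIVER_IRQL_NOT_LESS_OR_EQUAL
```

WinDbg's `!analyze -v` does this plus a lot more, like walking the stack to point at the offending module.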

You can boot your machine into four different configs... one of them allows you to boot and load unsigned drivers, so we can test our code, but if you reboot it defaults back to signed drivers.

You also have to download the checked version of the OS (the checked version has additional debug code along with the symbols).

I know my shit and AMD still sucks compared to Nvidia's driver writers. Sorry for the truth. I like AMD because Intel and Nvidia need competition.
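For anyone wondering what "hooking the system service table" actually means: the service table is basically an array of function pointers indexed by service number, and a hook swaps one entry for a wrapper that snoops on the call and then forwards it. Toy user-mode Python model of the idea (every name here is invented; the real thing patches kernel memory from a driver):

```python
# Toy model of service-table hooking: a dispatch table maps service
# numbers to handlers; installing a hook swaps one entry for a wrapper
# that logs the call and then forwards to the original handler.
service_table = {
    0x42: lambda path: f"opened {path}",   # pretend "open file" service
}

call_log = []

def install_hook(table, service_no):
    original = table[service_no]
    def hooked(*args):
        call_log.append((service_no, args))   # spy on the call
        return original(*args)                # forward to the real handler
    table[service_no] = hooked
    return original  # keep this so the hook can be unhooked later

install_hook(service_table, 0x42)
result = service_table[0x42]("C:\\secret.txt")
print(result)  # opened C:\secret.txt -- and call_log now records the call
```

Same shape as an SSDT hook, minus everything that makes the real one dangerous: disabling write protection, patching kernel memory, and getting flagged by PatchGuard on modern 64-bit Windows.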
 
I know my shit and AMD still sucks compared to Nvidia's driver writers. Sorry for the truth. I like AMD because Intel and Nvidia need competition.
I was actually going to do an edit, but eh, making breakfast for toddlers popped up first lol. Anyways, I was going to say that, as a coder, I could believe not liking AMD's shit because I totally believe it's not written as well.

Being only an end user, I only really care about actual stability on my PC. Performance, there might be something there. A lot of games skew towards AMD or Nvidia by a wide range of margins, so idk. I wouldn't say that a game performing 20% better on equivalent AMD hardware than Nvidia is because of the Nvidia drivers. As for stability, I think that is completely overblown.

So yeah, I'd probably agree the writing is worse. The actual end result is fine, however. Until very recently I've been pretty much 50/50 and can say for certain I've had neither more nor less stability between Nvidia and AMD.
 
You sure it's drivers and not thermal? I had a 5700 XT do weird stuff like that until I stopped trying to drive it at 4K.
Pretty sure. I was running something super non-intensive, like it doesn't break 50°C with the fans off. Another game it does it in will specifically say "Graphics driver was hung then restarted" when it comes back.
 
Recently built a server with a GeForce RTX 3090. 24 GB of VRAM should allow me to do stuff with the GPU. I'm mostly thinking of writing CUDA physics simulations and training ML models.

I keep seeing all these other GPUs on the market for $insanity. Their specs don't look crazily different from what I managed with this used RTX, though: roughly similar memory (~16 GB or so) for something like a V100 GPU. Maybe the clock is slower? Global memory access is slower? I dunno.
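For a feel of what a CUDA physics sim does, here's the CPU version of the inner loop in plain Python; on the GPU, each particle's update would be one thread. Toy code of my own, not from any real project:

```python
# Toy particle sim: gravity plus semi-implicit Euler integration.
# In CUDA, the body of the inner loop is exactly what one thread
# would do for one particle, in parallel across thousands of them.
def step(particles, dt=0.01, g=-9.81):
    out = []
    for (x, y, z, vx, vy, vz) in particles:
        vz += g * dt                                  # apply gravity to velocity
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt  # integrate position
        out.append((x, y, z, vx, vy, vz))
    return out

particles = [(0.0, 0.0, 100.0, 1.0, 0.0, 0.0)]  # one particle, 100 m up
for _ in range(100):                             # simulate one second
    particles = step(particles)
x, y, z, *_ = particles[0]
print(round(z, 2))   # roughly 95.05: fallen ~5 m after 1 s, as expected
```

The GPU win comes from doing that per-particle work for millions of particles at once; memory bandwidth is usually what you end up fighting, which is why the HBM cards cost $insanity.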
 
Just had my worst ever driver crash. Looked like this for about a minute until it came back.
View attachment 5065631
Been having this issue for about 2 years, usually the screens will just go grey/white for a couple seconds and come back.
I thought NBC's master control center would be more... complicated?

I would say that it's a hardware flaw, but if the card hasn't died yet and doesn't give you a lot of heartache, you might as well ride it out until you feel the need to upgrade.
 
I thought NBC's master control center would be more... complicated?

I would say that it's a hardware flaw, but if the card hasn't died yet and doesn't give you a lot of heartache, you might as well ride it out until you feel the need to upgrade.
Yeah, I think 10 years is pretty good for a GPU life, especially a $350 card. Gonna ride it out for as long as practical. Have a friend who had two SLI'd 970s for a while, but they randomly both died, told me he'd just give me one if they didn't. The need to upgrade is certainly there, but I am a broke nigga lol.
Side note, is SLI even still practical?
 
As said above, SLI is all but dead. Until the 4090s came out, the 1080 Ti in SLI held many of the world records for overclocking.
 
Hey guy, I'm a programmer and I've written driver code. Look up "hooking the system service table" sometime to see what I mean. I reverse engineer shit all the time.

AMD is shit for driver code. I used to run AMD's but I don't anymore because I don't want to deal with their shit.
That's nice, but it means fuck all if the hardware itself is faulty and affecting the GPU.
 
Nope.
Vulkan does theoretically support multi-GPU rendering, but I'm not aware of any games that make use of it. That's explicitly not SLI, though: you don't need a bridge (or even two Nvidia GPUs; theoretically it should work with just about anything).
Some games supported it, like here:
Strange Brigade was in many ways (like DOOM) a showcase for what sort of performance and visuals can be achieved with proper optimizing.
Other games I vaguely remember as able to use mGPU: Ashes of the Singularity, one of the Tomb Raiders, and one of the Deus Ex games (not 100% sure, but Ashes I'm 99% sure).
 
Tomb Raider
Now that you mention it, yeah, Rise of the Tomb Raider did pin both my GPU and my iGPU when I ran it.
Not that I think the iGPU contributed much in this scenario. I imagine in most cases with such a huge disparity, all the extra GPU does is worsen your lows.
 
Given how ancient some of these games are, I think it shows that graphics just don't matter like they did in the old days. Imagine if in 2010, the #1 multiplayer game was still UT '99.
I don't think it's a new thing. @The Mass Shooter Ron Soye said multiplayer games have been doing that for a while, but so do MMOs.

My tactic for building a PC from 2008 onwards was to target Xbox 360 specs since 99% of games were made with consoles in mind, so as long as you could equal (preferably beat) console specs, you could play games on medium settings comfortably for a good price.


On an unrelated note, the hot summer weather is starting in the UK. I'm tempted to underclock the CPU and maybe even the GPU so it stays cool and doesn't act as a space heater. I've not come close to maxing out the CPU or GPU outside of a few specific situations like messing with Half-Life Raytracing mod, but that also means I don't know if underclocking will have an effect at all.
 
My CPU normally idles around fifty degrees, but with a curve optimiser (not even an underclock, just an undervolt) I’m down to thirty-forty degrees. Definitely worth a try.
 
I don't think it's a new thing. @The Mass Shooter Ron Soye said multiplayer games have been doing that for a while, but so do MMOs.

My tactic for building a PC from 2008 onwards was to target Xbox 360 specs since 99% of games were made with consoles in mind, so as long as you could equal (preferably beat) console specs, you could play games on medium settings comfortably for a good price.


On an unrelated note, the hot summer weather is starting in the UK. I'm tempted to underclock the CPU and maybe even the GPU so it stays cool and doesn't act as a space heater. I've not come close to maxing out the CPU or GPU outside of a few specific situations like messing with Half-Life Raytracing mod, but that also means I don't know if underclocking will have an effect at all.
You just need a big cooler like the Scythe Ninja or Fuma or the new Deepcool chonky units. More thermal mass is the answer.
 
R7 260X, RX 590, and now the 6700 XT, and I have never had a problem with AMD drivers.
Same, I've been using AMD shit for years and never had issues, but I'm not exactly a power user and don't make a lot of "under the hood" changes. I will say that the Radeon Adrenalin software is kind of annoying to use and feels like it was programmed and designed by people who don't have to actually use it. Bloated and slow.
 
You just need a big cooler like the Scythe Ninja or Fuma or the new Deepcool chonky units. More thermal mass is the answer.
A bigger cooler will just transfer more heat from the CPU to the room. It’ll let the CPU boost higher without throttling, raising power consumption and heat generation. If he wants his room cool he should address the problem at its root, by undervolting and underclocking. Lowering performance a few percent will lower power consumption significantly more, as most of the power draw is from pushing high clocks. Locking the CPU at a moderate clock will let it run at very low power levels.
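Back-of-envelope on why that works: dynamic power goes roughly as P = C·f·V², and sustaining high clocks needs extra voltage on top, so a small clock cut buys a disproportionately big power cut. Illustrative numbers only, not measurements from any specific chip:

```python
# Rough dynamic-power model: P = C * f * V^2, with C a constant for
# the chip. Scaling clock and voltage by the given factors scales
# power by f_scale * v_scale^2.
def relative_power(f_scale, v_scale):
    return f_scale * v_scale ** 2

# Drop clocks 10%, and the voltage needed to sustain them by 10%:
p = relative_power(0.9, 0.9)
print(f"{p:.3f}")   # 0.729
```

So a ~10% clock cut plus the voltage drop it allows is already ~27% less dynamic power, which is why locking the CPU at a moderate clock is so effective at cooling a room.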
 
A bigger cooler will just transfer more heat from the CPU to the room. It’ll let the CPU boost higher without throttling, raising power consumption and heat generation. If he wants his room cool he should address the problem at its root, by undervolting and underclocking. Lowering performance a few percent will lower power consumption significantly more, as most of the power draw is from pushing high clocks. Locking the CPU at a moderate clock will let it run at very low power levels.
It will also allow the CPU to run cooler without sacrificing performance. The heat it dissipates isn't really that much to feel on your skin. I wish my CPU would heat my room during the winter.
 