GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Can't wait to see how many people are gonna line up to consoom the next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".
Sadly this may be me when the new shit drops. Except I never said I'd never buy Intel again. I just don't want to buy a new AMD-compatible waterblock.

:stress:
 
Can't wait to see how many people are gonna line up to consoom the next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".
I mean, I might consider them next time I'm up for building a PC. But that probably won't be for half a decade or so. And with how things are going with ARM taking off, who knows if I'll even be building an x86 system next time I need to put together a desktop.
 
Can't wait to see how many people are gonna line up to consoom the next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".
Call me a mark, but I'll likely buy Intel again, though that's more because of the socket I'm currently on and my current CPU cooler.

That kind of makes me realise how much better the whole hobby would be if we had a standard CPU socket.
 
Ryzen Master is STIR-FLIED ASS HOE! I tried using AMD's "Curve Optimizer" and some other shit and then I ran Geekbench 6 (screenshot attached).
Top result is no OC. Bottom result is with the Ryzen Master "Auto OC."

In other news, I just installed a 3090 into that same HP Omen. I had to use a hammer to destroy the hard drive bays in order for the 3090 to fit, but it mostly works like a dream. Very powerful for running LM Studio, which is why I got it: almost more winning than I can handle.

HOWEVER, it has now crashed twice during Palworld. My laptop 4070 never crashes during Palworld, so why is my desktop 3090 shittier? I'm running the "automatic tuning" in the GeForce Experience app; will that help?
 
Ryzen Master is STIR-FLIED ASS HOE! I tried using AMD's "Curve Optimizer" and some other shit and then I ran Geekbench 6 (screenshot attached).
Top result is no OC. Bottom result is with the Ryzen Master "Auto OC."

In other news, I just installed a 3090 into that same HP Omen. I had to use a hammer to destroy the hard drive bays in order for the 3090 to fit, but it mostly works like a dream. Very powerful for running LM Studio, which is why I got it: almost more winning than I can handle.

HOWEVER, it has now crashed twice during Palworld. My laptop 4070 never crashes during Palworld, so why is my desktop 3090 shittier? I'm running the "automatic tuning" in the GeForce Experience app; will that help?
Sounds like maybe the PSU can't cope? Can you give the actual model number of the machine? It looks like these typically have 800W PSUs, which technically meets the Nvidia recommendation but is still kinda borderline for such a high-end GPU. A load table would be useful, since if it has a wimpy 12V rail it may be inadequate. Also check the fitment of the 12VHPWR adapter. Everything must fit snug, with no tight bends placing stress on the connectors.

It looks like it also has a 120mm AIO for the CPU, which is retarded too. I'm not a fan of pre-built systems from HP, Dell, etc.; not many "enthusiasts" rate them well, and I would suggest you consider building your next system yourself or getting something from a smaller system builder that just uses off-the-shelf components.

Have you re-installed Windows on the system since you received it? The HP systems come loaded to the nth degree with crapware, and just cleaning that shit out should help. As for OCing, I would check into whether the CPU could benefit from undervolting. The higher-end 5000 series Ryzens (like the 5800X) certainly do, so that might help it boost higher. Look into PBO2.
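
If you want data rather than guesswork on the PSU theory, something like this sketch will log power draw, temperature, and clocks once a second so a crash can be lined up against a transient spike (a 3090 can briefly pull well past its 350W rating). This assumes the nvidia-ml-py package and a single GPU; the log filename is arbitrary.

```python
# Rough sketch: log GPU power/temp/clock every second while gaming so a
# crash can be correlated with a power spike or thermal event.
# Assumes: pip install nvidia-ml-py ; single GPU at index 0.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

with open("gpu_log.csv", "w") as log:  # arbitrary filename
    log.write("time,power_w,temp_c,core_mhz\n")
    try:
        while True:
            power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # mW -> W
            temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
            clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
            log.write(f"{time.time():.0f},{power:.1f},{temp},{clock}\n")
            log.flush()  # keep the data even if the machine hard-crashes
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()
```

If the last readings before each crash show a power spike or clocks falling off a cliff, that points at the PSU/12VHPWR side rather than drivers.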
 
Sounds like maybe the PSU can't cope? Can you give the actual model number of the machine? It looks like these typically have 800W PSUs, which technically meets the Nvidia recommendation but is still kinda borderline for such a high-end GPU. A load table would be useful, since if it has a wimpy 12V rail it may be inadequate. Also check the fitment of the 12VHPWR adapter. Everything must fit snug, with no tight bends placing stress on the connectors.

It looks like it also has a 120mm AIO for the CPU, which is retarded too. I'm not a fan of pre-built systems from HP, Dell, etc.; not many "enthusiasts" rate them well, and I would suggest you consider building your next system yourself or getting something from a smaller system builder that just uses off-the-shelf components.

Have you re-installed Windows on the system since you received it? The HP systems come loaded to the nth degree with crapware, and just cleaning that shit out should help. As for OCing, I would check into whether the CPU could benefit from undervolting. The higher-end 5000 series Ryzens (like the 5800X) certainly do, so that might help it boost higher. Look into PBO2.
I attached a System Information screenshot.

PSU is 800W.

My reason for using this system is that I got it with a 3080 included for only $700. I sold the 3080 and upgraded to a 3090 because my primary use case is running LLMs. It would be nice to have stable Palworld on this computer too! The 3080 never crashed, but the VRAM was inadequate for larger LLMs (rough math at the end of this post).

The only weird thing I noticed in GPU-Z (though I have zero clue what I'm looking at, mostly) is that Resizable BAR is disabled. HP tard-proofed the BIOS, so there's no option anywhere to turn Resizable BAR on. Don't know if that would make any real difference, though?

(GPU-Z screenshot attached.)
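
For what it's worth, both questions can be sanity-checked from software. This is a rough sketch assuming the nvidia-ml-py package; the LLM sizing numbers are ballpark assumptions, not measurements. With Resizable BAR off, BAR1 is a small window (typically 256 MB); with it on, BAR1 spans the whole VRAM.

```python
# Rough sanity check, assuming nvidia-ml-py (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Part 1: is Resizable BAR actually off? A tiny BAR1 (e.g. 256 MiB on a
# 24 GiB card) means no; BAR1 covering the whole VRAM means yes.
bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(gpu)
vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"BAR1: {bar1.bar1Total / 2**20:.0f} MiB of {vram.total / 2**30:.0f} GiB VRAM")
pynvml.nvmlShutdown()

# Part 2: back-of-the-envelope LLM VRAM need: weights at N bits per
# parameter, plus a rough ~2 GB allowance for KV cache and CUDA context.
def vram_gb(params_b: float, bits: int, overhead: float = 2.0) -> float:
    return params_b * bits / 8 + overhead

for name, params in [("13B", 13), ("34B", 34), ("70B", 70)]:
    print(f"{name} @ 4-bit: ~{vram_gb(params, 4):.0f} GB")
# 34B-class models (~19 GB at 4-bit) are where the 3090's 24 GB beats
# the 3080's 10 GB; 70B still won't fit on a single 3090 at 4-bit.
```

ReBAR is usually only worth a few percent in games anyway, so it's unlikely to be the crash culprit.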
 
That kind of makes me realise how much better the whole hobby would be if we had a standard CPU socket.
Honestly, if one group or the other went out of their way to publish specs and pave the way for it, it could happen

Better odds AMD would actually succeed at that imo
 
Honestly, if one group or the other went out of their way to publish specs and pave the way for it, it could happen

Better odds AMD would actually succeed at that imo
It would be hard for Intel or AMD to make the switch, as it's sort of a prisoner's dilemma: if you make the open platform, people might buy your opponent's CPUs to use on your motherboards.

I could possibly see ARM doing so, if they ever make socketed CPUs.
 
Intel would find a way to scum up a standardized socket. Not happening. Why would they want to use a platform that people could easily use a competitor's chip on when they inevitably screw people over?

Instead they can hold people captive long enough to sweep issues under the rug so that by the time the next product comes out, people can go right back to parroting "but Intel just works!".
 
Intel would find a way to scum up a standardized socket. Not happening. Why would they want to use a platform that people could easily use a competitor's chip on when they inevitably screw people over?
If AMD gets a big upper hand in market share, they could throw Intel a lifeline by letting them onto their chipsets, buying control over desktop CPU socket and chipset design for the foreseeable future in the process.

If Intel fixes everything with the next release, probably not. But if they keep shooting themselves in the foot, it seems like that could pretty easily happen.
 
Call me a mark, but I'll likely buy Intel again, though that's more because of the socket I'm currently on and my current CPU cooler.

That kind of makes me realise how much better the whole hobby would be if we had a standard CPU socket.
This is the situation I'm in. My motherboard technically can be upgraded to 14th gen. I'm not planning on ditching my PC soon. If in a few years they fix all the issues with 13th and 14th gen, I could see myself doing a core swap to keep up to date. That's a BIG if though. Might just stay with what I have for my PC's life cycle if nothing gets better.

As for new builds? Ask me in 10 years when I build a new rig. AMD is looking VERY good right now. There's always a chance they do what Intel is doing rn in 10 years. I'm not opposed to AMD; I've built plenty of Ryzen rigs and they've run great. I just need to know when I'm building a new PC, which is not soon.
 
If AMD gets a big upper hand in market share
I don't see this happening. AMD has been providing top-tier gaming performance in a better package for less money ever since the 5800X3D, and the majority of gamers still buy hot, inefficient Intel to this day.

Now some of this is probably due to a lack of AMD choices in OEM systems and laptops, which Intel will probably just buy their way into like they have before.

I'll eat my words if Intel sales actually suffer from this fiasco, but I have my doubts. Intel has a lot of simps, and the general consumer has the memory of a goldfish.

(Unless it's AMD, or more like ATI, drivers. Lmao)
 
Open-Source AMD GPU Implementation Of CUDA "ZLUDA" Has Been Taken Down (archive)

booOOOooo
A lot of the design happens in simulation tools like IC Validator. But since it's all so interconnected, and the laws of physics govern the entire chip (voltage and current limits and the like), yeah, you can't just change one module that easily.
I think this is improving with things like Intel's "tiled" design. For example, they can swap out an iGPU tile for a newer one or keep the old one if the newer one isn't ready. This supposedly happened with Intel choosing to use a less ambitious iGPU tile in Meteor Lake-H (128 EUs max instead of 192 EUs). I might be reading what you mean by "module" wrong since I'm skimming the thread.
Pretty sure AMD found an excuse to wait, to allow Intel to dig its own grave deeper and possibly get 13th/14th gen kicked off some of the chart comparisons for the upcoming launch
As tempting as that idea is, I don't think they are playing any 4D Go with their delays.

What I thought might happen is that the new microcode patch would land, hurt 13th/14th gen performance slightly, and make Zen 5 desktop look better in the charts. I don't think the timing is right for that, because reviewers need days or weeks to test everything. The "final" patch should come mid-August, but it might even be delayed.

I'm not sure who would kick 13th/14th gen off the charts completely for the upcoming reviews. Maybe Gamers Nexus? Would be funny.
I've just learned about the whole Intel fiasco. I got an i7-14700K earlier this year; how fucked am I?
If I had one, I wouldn't use it until the new patch(es) drop. Meaning I would turn the computer off completely and keep it off until Intel releases a microcode patch it's confident in. You do have multiple computers you can use, right? (A quick way to check which microcode revision is actually live is sketched at the end of this post.)
Can't wait to see how many people are gonna line up to consoom next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".
I doubt that the 13th/14th gen issues are going to mean Arrow Lake will be poisoned. Although it's worth waiting a little, since it will be Intel's first chiplet/tile-based CPU for desktop after so many generations of monolithic designs.

My interest in Intel is only in picking up the refurbs/sales years later, so I will let the world beta test them for me.
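
And since "did the fix actually land?" will keep coming up: on Windows, the microcode revision that's live after boot can be read from the registry. Rough sketch only; the "Update Revision" value under this key is how common Windows builds expose it, but the exact byte layout is an assumption.

```python
# Rough sketch (Windows): read the CPU microcode revision that is live
# after boot. The "Update Revision" REG_BINARY value and its layout
# (revision in the high DWORD) are assumptions based on common builds.
import winreg

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"HARDWARE\DESCRIPTION\System\CentralProcessor\0",
)
raw, _ = winreg.QueryValueEx(key, "Update Revision")
winreg.CloseKey(key)

revision = int.from_bytes(raw[4:8], "little")  # high DWORD of 8 bytes
print(f"Microcode revision: 0x{revision:X}")
```

Compare the printed revision against whatever number Intel quotes for the mid-August patch once it ships.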
 
I think this is improving with things like Intel's "tiled" design. For example, they can swap out an iGPU tile for a newer one or keep the old one if the newer one isn't ready. This supposedly happened with Intel choosing to use a less ambitious iGPU tile in Meteor Lake-H (128 EUs max instead of 192 EUs). I might be reading what you mean by "module" wrong since I'm skimming the thread.

More modular designs help, but it's still not trivial. Everything has to be validated. And if you want a feature change, like "what if we increased the L2 cache and removed some stages in the branch predictor," yeah, those decisions are locked in well before production.

I doubt that the 13th/14th gen issues are going to mean Arrow Lake will be poisoned. Although it's worth waiting a little, since it will be Intel's first chiplet/tile-based CPU for desktop after so many generations of monolithic designs.

It's not just the bug; it's the fact that Intel obviously took shortcuts in QA to rush this dogpile out to market. The fact that Alderon Games and Level1Techs could both run their own stress tests and find a massive failure rate means Intel wasn't doing them. That raises further questions about what state Intel's QA is in. It's ironic, since one of Intel's main arguments against AMD to OEMs has been quality, and SPR got delayed to iron out QA issues.
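
The failures weren't subtle, either. Something in the spirit of the sketch below (not the actual methodology Alderon Games or Level1Techs used) is enough: hammer all cores with compress/decompress round-trips and count mismatches, which on healthy hardware should essentially never happen.

```python
# Minimal all-core stability test sketch, in the spirit of the tests that
# exposed the 13th/14th gen instability (not anyone's exact methodology).
import os
import zlib
from concurrent.futures import ProcessPoolExecutor

def worker(_: int) -> int:
    data = os.urandom(1 << 20)  # 1 MiB of random input per worker
    errors = 0
    for _ in range(1000):
        # Any round-trip mismatch counts as a corruption event.
        if zlib.decompress(zlib.compress(data)) != data:
            errors += 1
    return errors

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        total = sum(pool.map(worker, range(os.cpu_count())))
    print(f"corruption events: {total}")
```

Note that a degrading chip may also just crash the worker outright (the "decompression error" failures people reported) rather than return a mismatch.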
 
That's the thing. It's not just "kay, these chips were bad. The next ones will be fine... hopefully". It's what led to this shit happening in the first place, and then the absolutely abysmal response to it. Like... people really want to go forward with that?
 