Just Some Other Guy
kiwifarms.net
- Joined: Mar 25, 2018
Can't wait to see how many people are gonna line up to consoom next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".
> Can't wait to see how many people are gonna line up to consoom next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".

Sadly this may be me when the new shit drops. Except I never said I'd never buy Intel again; I just don't want to buy a new AMD-compatible waterblock.
> Can't wait to see how many people are gonna line up to consoom next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".

I mean, I might consider them next time I'm up for building a PC. But that probably won't be for half a decade or so. And with how things are going with ARM taking off, who knows if I'll even be building an x86 system next time I need to put together a desktop.
> Sadly this may be me when the new shit drops. Except I never said I'd never buy Intel again; I just don't want to buy a new AMD-compatible waterblock.

That's different. I just wish people would stick to their guns for once lol.
> Can't wait to see how many people are gonna line up to consoom next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".

Call me a mark, but I'll likely buy Intel again; that's more because of the socket I'm currently on and my current CPU cooler.
> Ryzen Master is STIR-FLIED ASS HOE! I tried using AMD's "Curve Optimizer" and some other shit and then I ran Geekbench 6: View attachment 6277120
> Top result is no OC. Bottom result is with the Ryzen Master "Auto OC."
> In other news, I just installed a 3090 into that same HP Omen. I had to use a hammer to destroy the hard drive bays in order for the 3090 to fit, but it mostly works like a dream. Very powerful for running LM Studio, which is why I got it: almost more winning than I can handle.
> HOWEVER, it has now crashed twice during Pal World. My laptop 4070 never crashes during Pal World, so why is my desktop 3090 shittier? I'm running the "automatic tuning" in the GeForce Experience app, will that help?
> I attached a System Information screenshot.

Sounds like maybe the PSU can't cope? Can you give the actual model number of the machine? It looks like these typically have 800W PSUs, which technically meets the Nvidia recommendations but is still kinda borderline for such a high-end GPU. A load table would be useful, since if it has a wimpy 12V rail it may be inadequate. Also check the fitment of the 12VHPWR adapter: everything must fit snug, with no tight bends placing stress on the connectors.
It looks like it also has a 120mm AIO for the CPU, which is retarded too. I'm not a fan of pre-built systems from HP, Dell, etc.; not many "enthusiasts" rate them well, and I would suggest you consider building your next system yourself or getting something from a smaller system builder that just uses off-the-shelf components.
Have you re-installed Windows on the system since you received it? The HP systems come loaded to the nth degree with crapware, and just cleaning that shit out should help. As for OCing, I would check into whether the CPU could benefit from undervolting. The higher-end 5000-series Ryzens (like the 5800X) certainly do, so that might help it boost higher. Look into PBO2.
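Quick back-of-the-envelope on that 12V rail, using assumed stock numbers (a 3090 is spec'd around 350W board power and a 5800X around 142W PPT; swap in your real figures from the spec sheet and PSU label):

```python
# Rough 12V-rail sanity check for an 800W prebuilt PSU.
# All wattages below are assumptions from public spec sheets, not
# measurements from this particular Omen -- substitute your own.
GPU_W = 350.0      # RTX 3090 stock board power (transients spike higher)
CPU_W = 142.0      # Ryzen 5800X PPT limit
REST_W = 75.0      # fans, drives, RAM, board -- a guess
RAIL_V = 12.0

total_w = GPU_W + CPU_W + REST_W
amps = total_w / RAIL_V
print(f"Sustained 12V draw: ~{total_w:.0f} W = ~{amps:.0f} A")
# An 800W unit with a healthy single 12V rail has headroom on paper,
# but 3090 transient spikes can roughly double GPU draw for
# milliseconds, which is exactly what trips borderline PSUs mid-game.
```

If the label says the 12V rail tops out much below ~60A, that would line up nicely with the crashes.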
It's so over for Intcels. View attachment 6273512
The CEO of Intel is praying on Twitter.
> That kind of makes me realise how much better the whole hobby would be if we had a standard CPU socket

Honestly, if one group or the other went out of the way to publish specs and pave the way for it, it could happen.
> Honestly, if one group or the other went out of the way to publish specs and pave the way for it, it could happen.

It would be hard for Intel or AMD to make the switch, as it's sort of a prisoner's dilemma problem: if you make the open platform, then people might buy the opponent's CPUs to use on your motherboards.
Better odds AMD would actually succeed at that imo
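To spell out the dilemma with made-up payoff numbers (purely illustrative, not market data):

```python
# Toy prisoner's dilemma for the socket question. Payoffs are invented
# just to show the shape of the incentive, not real market share.
# (open, open): both benefit from a shared ecosystem.
# (open, closed): the opener's boards carry the rival's CPUs -- worst case.
payoffs = {
    ("open", "open"):     (3, 3),
    ("open", "closed"):   (0, 5),
    ("closed", "open"):   (5, 0),
    ("closed", "closed"): (1, 1),
}
for (a, b), (pa, pb) in payoffs.items():
    print(f"AMD {a:6} / Intel {b:6} -> AMD {pa}, Intel {pb}")
# Whatever the rival does, "closed" pays more (5 > 3, 1 > 0), so both
# stay closed even though (open, open) would leave everyone better off.
```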
> Intel would find a way to scum up a standardized socket. Not happening. Why would they want to use a platform that people could easily use a competitor's chip on instead when they inevitably screw people?

If AMD gets a big upper hand on market share, they could throw Intel a lifeline by letting them onto their chipsets, and buy control over desktop CPU socket and chipset design for the foreseeable future in the process.
> Call me a mark, but I'll likely buy Intel again; that's more because of the socket I'm currently on and my current CPU cooler.

This is the situation I'm in. My motherboard technically can be upgraded to 14th gen. I'm not planning on ditching my PC soon. If in a few years they fix all the issues with 13th and 14th gen, I could see myself doing a core swap to keep up to date. That's a BIG if though. Might just stay with what I have for my PC's life cycle if nothing gets better.
That kind of makes me realise how much better the whole hobby would be if we had a standard CPU socket
> If AMD gets a big upper hand on market share

I don't see this happening. AMD has been providing top-tier gaming performance in a better package for less $ ever since the 5800X3D, and gamers to this day still mostly buy hot, inefficient Intel.
> A lot of the design happens in simulation tools like IC Validator. But since it's all so interconnected, and laws of physics govern the entire chip (voltage & current limits and the like), yeah, you can't just change one module that easily.

I think this is improving with things like Intel's "tiled" design. For example, they can swap out an iGPU tile for a newer one or keep the old one if the newer one isn't ready. This supposedly happened with Intel choosing to use a less ambitious iGPU tile in Meteor Lake-H (128 EUs max instead of 192 EUs). I might be reading what you mean by "module" wrong since I'm skimming the thread.
> Pretty sure AMD found an excuse to wait to allow Intel to dig its own grave deeper and possibly get 13th/14th gen kicked off of some of the chart comparisons for the upcoming launch

As tempting as that idea is, I don't think they are playing any 4D Go with their delays.
> I've just learned about the whole Intel fiasco, I got an i7 14700K earlier this year, how fucked am I?

If I had one, I wouldn't use it until the new patch(es) drop. Meaning I would turn off the computer completely and keep it off until Intel releases a microcode patch it's confident in. You do have multiple computers you can use, right?
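And once a fix ships, you can check which microcode you're actually running. Windows keeps the loaded revision in the registry (the path below is the standard location; the byte layout of the blob is my assumption, so verify on your own box):

```python
# Sketch: print the CPU microcode revision Windows loaded at boot.
# Registry path is the usual one; the layout of the 8-byte
# "Update Revision" blob (revision in the high dword, little-endian)
# is assumed -- double-check against your own machine.
import winreg

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"HARDWARE\DESCRIPTION\System\CentralProcessor\0",
)
blob, _ = winreg.QueryValueEx(key, "Update Revision")  # REG_BINARY, 8 bytes
revision = int.from_bytes(blob[4:8], "little")
print(f"Loaded microcode revision: 0x{revision:X}")
```

Compare that against whatever revision Intel says contains the fix before trusting the chip under sustained load again.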
> Can't wait to see how many people are gonna line up to consoom next Intel product after all of this "I'm not going to buy Intel again, fuck em" because "but I've only ever bought Intel".

I doubt that the 13th/14th gen issues are going to mean that Arrow Lake will be poisoned. Although it's worth waiting a little, since it will be Intel's first chiplet/tile-based CPU for desktop after so many generations of monolithic designs.