GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I got an i7-4790 on LGA 1150 about a week ago. Real old at this point, but still plenty good for most tasks and for $25 along with the RAM it seemed like a no-brainer.
That's what I use as my living room media PC hooked up to my TV, in one of those old Dell OptiPlexes. Pretty good CPU still, considering its age.

Recently got an RTX A2000 to put in it. Annoyingly, in the Dell SFF cases the x16 PCIe slot is right next to the PSU and won't fit a dual-slot card. Was considering a riser cable but ended up just putting it in the x4 slot. Honestly, the performance loss is not as bad as I thought it would be, and it works fine.
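If anyone wants to sanity-check what link a card actually negotiated (slots sometimes train below their physical width), nvidia-smi will report it directly. Here's a rough Python sketch wrapping the standard --query-gpu fields; it assumes nvidia-smi is on your PATH.

```python
# Rough sketch: confirm what PCIe link an NVIDIA card actually negotiated.
# Assumes nvidia-smi is on the PATH; the query fields are standard --query-gpu options.
import subprocess

def pcie_link_status() -> None:
    fields = ("name,pcie.link.gen.current,pcie.link.width.current,"
              "pcie.link.gen.max,pcie.link.width.max")
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.strip().splitlines():
        name, gen_cur, width_cur, gen_max, width_max = (s.strip() for s in line.split(","))
        print(f"{name}: running PCIe Gen{gen_cur} x{width_cur} "
              f"(card supports Gen{gen_max} x{width_max})")

if __name__ == "__main__":
    pcie_link_status()
```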
 
As an experiment, I'm disabling hyperthreading and turbo boost on my i9-12900 to see how it goes. This limits the P-cores to 2.4 GHz and the E-cores to 1.8 GHz. I want to see how much being able to run hot actually affects my user experience.
 
Sure, you still have the cores, so multithreaded workloads will be just fine, but what I meant is that interactivity may suffer.
Edit: when web browsing without playing videos, most cores on my CPU are parked at 400 MHz.
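If anyone wants to see that parking behavior for themselves, here's a rough sketch that polls per-core clocks with psutil. Fair warning: per-core readings are most reliable on Linux, and on Windows it may only report one package-wide frequency, so treat it as an illustration rather than a proper tool.

```python
# Rough sketch: poll per-core clock speeds to see which cores are parked/idling low.
# Needs psutil (pip install psutil); per-core readings are most reliable on Linux.
import time
import psutil

def watch_core_clocks(seconds: int = 10, interval: float = 1.0) -> None:
    for _ in range(int(seconds / interval)):
        freqs = psutil.cpu_freq(percpu=True)  # list of (current, min, max) per core
        print("  ".join(f"c{i}:{int(f.current):>4}MHz" for i, f in enumerate(freqs)))
        time.sleep(interval)

if __name__ == "__main__":
    watch_core_clocks()
```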

@Leaded Gasoline And here we are with AMD spanking Intel in gaming with X3D. It didn't look optimistic for AMD before Ryzen.
 
Aren't you basically nerfing yourself to Sandy Bridge-era performance?
AnandTech 6700K review comparing IPC from Sandy Bridge to Skylake
More like a Skylake refresh is my guess.

 
AMD fanboys are much worse and practically inescapable outside of forums exclusive to Intel/NVIDIA owners
AMD cultists have always been the least helpful people ever.
>hadn't used an AMD card since the late 2000s
>buy old 6600xt during pandemic because it was the only card available
>drivers and display keep fucking up
>ask for assistance
>"its your powersupply"
>"it's your motherboard"
>"it's your RAM"
>"it's your CPU (unless it's an AMD cpu, of course)"
>"it's literally anything but the GPU"
>call the gpu shit
>"DO NOT BE MEAN TO RED COMPANY"

Say what you will about nvidia, but at least their cards actually work.
 
AMD cultists have always been the least helpful people ever.
>ask for assistance regarding why my old 6600xt's drivers were crashing and it wouldn't show a display unless I unplugged and plugged back in the HDMI cables
>"its your powersupply"
>"it's your motherboard"
>"it's your RAM"
>"it's your CPU (unless it's an AMD cpu, of course)"
>"it's literally anything but the GPU"
>call the gpu shit
>"DO NOT BE MEAN TO RED COMPANY"

Say what you will about nvidia, but at least their cards actually work.
For years I've been hearing about how much nvidia sucks on Linux, so this time I got a high-end AMD card... I should have gotten NVIDIA. Audio issues, stupid driver weirdness, poor AI support (which I expected, at least).

It does seem to game ok in Windows at least. Not sure what setting I missed though as it's apparently not detecting HDR.
 
For years I've been hearing about how much nvidia sucks on Linux, so this time I got a high-end AMD card... I should have gotten NVIDIA. Audio issues, stupid driver weirdness, poor AI support (which I expected, at least).

It does seem to game ok in Windows at least. Not sure what setting I missed though as it's apparently not detecting HDR.
I was actually using Windows 10 LTSC, and every single time I booted my computer, the card refused to show a display until I reached around, unplugged the HDMI cables, and plugged them back in.
When the pandemic ended and prices started coming down, I sold that shit at a loss and just bought a 3070 instead.

AMD should have just left ATI alone. IIRC they were doing alright before the buyout, and I don't remember my ancient HD 3870s having any of these issues.
 
AMD should have just left ATI alone. To my knowledge they were doing alright before the buyout, and I don't remember my ancient HD 3870s having any of these issues.
Early GCN was stable, had good value, and often matched NVIDIA in gaming; it was only after Polaris that things started to go to shit.

i wonder why
[attached image]
 
AnandTech 6700K review comparing IPC from Sandy Bridge to Skylake
More like a Skylake refresh is my guess.


Alder Lake's Gracemont cores have an IPC comparable to Skylake, while the Golden Cove cores are about 20% higher. Of course, I also turned off HT. Basically I am wondering a few things. One is just how much clock speed really matters - how often is the clock spinning up to do tasks that don't really matter? Another is to what degree increased parallelism compensates for low clock speed.

I've already noticed that there's no noticeable effect on web browsing. This isn't too surprising, since even the lowest-end Alder Lake CPUs are quad-core potatoes and handle browsing fine. One of the most CPU-intensive games I play, Darktide, is definitely affected, though.
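For anyone who wants to put a rough number on the clocks-vs-parallelism question, a toy benchmark like the sketch below is enough: time the same CPU-bound job with one worker and then with a full pool, once with turbo/HT on and once with them off, and compare. It's just an arbitrary hashing loop, not any of the games above, so it only says something about throughput scaling, not interactivity.

```python
# Rough sketch: compare single-worker vs multi-worker throughput on a CPU-bound task.
# Run it with turbo/HT enabled and again with them disabled to see how much extra
# parallelism compensates for lower clocks. The workload is an arbitrary hash loop.
import hashlib
import os
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n: int) -> int:
    h = b"seed"
    for _ in range(n):
        h = hashlib.sha256(h).digest()
    return len(h)

def timed(workers: int, chunks: int = 16, n: int = 2_000_000) -> float:
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(burn, [n] * chunks))
    return time.perf_counter() - start

if __name__ == "__main__":
    for w in (1, max(os.cpu_count() // 2, 1), os.cpu_count()):
        print(f"{w:>2} workers: {timed(w):6.2f} s")
```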
 
AMD cultists have always been the least helpful people ever.
>hadn't used an AMD card since the late 2000s
>buy old 6600xt during pandemic because it was the only card available
>drivers and display keep fucking up
>ask for assistance
>"its your powersupply"
>"it's your motherboard"
>"it's your RAM"
>"it's your CPU (unless it's an AMD cpu, of course)"
>"it's literally anything but the GPU"
>call the gpu shit
>"DO NOT BE MEAN TO RED COMPANY"

Say what you will about nvidia, but at least their cards actually work.
For years I've been hearing about how much nvidia sucks on Linux, so this time I got a high-end AMD card... I should have gotten NVIDIA. Audio issues, stupid driver weirdness, poor AI support (which I expected, at least).

It does seem to game ok in Windows at least. Not sure what setting I missed though as it's apparently not detecting HDR.
Meanwhile I’ve never had a driver problem with AMD cards, but inevitably do with Nvidia cards. Clearly Team Green are incompetent bastards who need to just get out of the consumer GPU market to focus on AI hardware.
 
Alder Lake's Gracemont cores have an IPC comparable to Skylake, while the Golden Cove cores are about 20% higher. Of course, I also turned off HT. Basically I am wondering a few things. One is just how much clock speed really matters - how often is the clock spinning up to do tasks that don't really matter? Another is to what degree increased parallelism compensates for low clock speed.
Evidence mounts that Intel will ditch hyperthreading in upcoming Arrow Lake and Lunar Lake products. May as well get used to it early.

Intel's next-gen Arrow Lake CPUs might come without hyperthreaded cores — leak points to 24 CPU cores, DDR5-6400 support, and a new 800-series chipset
Next-gen Intel Arrow Lake-S CPU spotted with 24-threads and no AVX-512 functionality
Intel Lunar Lake CPU leak shows 8-Core & 8-Thread config, cache sizes, and 2.8 GHz Boost

Also a new space heater called the 14900KS:

Intel Core i9-14900KS CPU With 6.2 GHz Clocks Listed By French Retailer For €768
 
Evidence mounts that Intel will ditch hyperthreading in upcoming Arrow Lake and Lunar Lake products. May as well get used to it early.

HT's benefit is pretty spotty. On some workloads, it actually makes things worse. Perhaps they've decided that using those transistors for more L3 cache or a bigger NPU is a much better use of the die space. Or maybe just cut the chip size and increase yields.
 
HT's benefit is pretty spotty. On some workloads, it actually makes things worse. Perhaps they've decided that using those transistors for more L3 cache or a bigger NPU is a much better use of the die space. Or maybe just cut the chip size and increase yields.
They will supposedly switch to a new technique referred to as "rentable units" that will break workloads down and spread them out over multiple P-cores and E-cores more effectively. But it may not be ready in time for Arrow Lake despite it being planned to debut then. So we should continue to see HT missing from these engineering sample leaks, because it was removed from the design and it's too late to put it back.

Future Intel CPUs May Dump Hyper-Threading For Partitioned Thread Scheduling

Arrow Lake may get a large increase in L2 cache, 3 MiB per P-core rumored, from 2 MiB in Raptor Lake. Possibly even "Adamantine" L4 cache stacked under the CPU tile, although Meteor Lake was supposed to have that and didn't.
 
The ones I've tried are Diablo IV, Darktide, and MWII, and with the highest texture setting, they have sporadic juddering and shuddering. It's pretty clear that 384 GB/s is not enough bandwidth for 12 GB to even be useful.
That's a shame. Is there a way to pre-load textures?

I'm stuck in the never-ending loop of waiting for a newer/better waste of money for a GPU. Was considering the 4080 Super but I'm not super sold, plus it's a fuckin grand.
Spend the money, even if it's on a "budget" GPU. I could be wrong on this as I'm about a year out of date, but there are no "bad" GPUs at the mid-range right now, and given the stagnation I'm guessing a GPU will last quite a while (a friend of mine is still using a 1070 he got pre-pandemic and seems perfectly happy). You'll get your £300 worth long before it's obsolete.

For a high-end GPU like a 4080, I can see the hesitation. That's why I think you should go mid-range. That way, if you "waste" your money, you're only wasting a third or less.


In this thread's humble opinion, is there a point to VR right now or is it just a big toy/gimmick? Every now and then I'm tempted due to like, one or two things, but overall it seems expensive and like you need a fairly good dedicated area to do it versus my cramped home office.
VR is definitely still in the gimmick range now. Gaming is still just these short games you play for 30 minutes and then don't touch again. Porn has a long way to go to really be viable beyond a cool thing to try. Other than driving games, it's just not there yet.
I don't own VR yet. But I second what @ZMOT and others said.

But I want to expand a bit on what he said about expectations. If you were expecting the holodeck, or a completely transformative experience, then you might be disappointed. I don't know what people were expecting when they look at games like Half-Life: Alyx and H3VR and dismiss them as "just FPS games" or get mad and say they should be flatscreen games.

Then there are games that are popular as VR games but would fall flat as pancake games, like Ultrawings and Derail Valley. And it's like, yeah, of course they'll fall flat or fail to capture the same feel. It would be the same as using motion controls to play Mario World.


Then there's the games. Things have moved on since the Crowbcat video in 2016, though VR is still in its infancy and is propped up by indie devs who don't have the refined tools of pancake development.

Again, it's about expectations. If you're looking for big franchise AAA releases, you're going to be SOL. If you're willing to dive into indies and AA, then you get stuff like Into The Radius 1+2, Vertigo 1+2, Compound, The Living Remain, Ultrawings 1+2, VTOL, Derail Valley, plus various mods like most modern Resident Evils and some Unreal engine games that are playable in VR with various degrees of jank.


For me, the big problem with VR is there's always some major compromise you have to make depending on which headset you choose. Deckard might be the first complete package, but I'm expecting an astronomical price tag.
 