GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Was he in the wrong?
No. Pat Gelsinger did exactly what was needed for Intel to survive, which was to stop spending money on bullshit and focus on core technologies.

Unfortunately this process is not fast, cheap, or easy, and shareholders are pussies, so they replaced him with Lip-Bu Tan. And Tan will proceed to... do exactly the same thing as Pat. Except he's sufficiently different from Pat to satisfy shareholders' superstitions.
 
[attached image: 1750254472446.webp]
Was he in the wrong?
He was right about most things, including delaying Sapphire Rapids (which I believe he's holding up in that picture). Thanks to him, Intel is finally competitive again in the server space with Granite Rapids, although the harm done by rehashing Skylake over and over under Krzanich will take some time to fix. Now that 18A is just about production-ready, Lip-Bu's stepping in to take credit for everything.

Pat G made three legitimate mistakes, and I can't say the board was entirely wrong to let him go. One is that, being an engineer, he vastly underestimated how difficult selling a foundry service would be. It took me a long time to learn that having good technology is only 15% of the puzzle, and I wasn't entrusted with billions of dollars to piss away to learn that lesson.

The second is that he clearly did not have eyes on the QA/QC problems at Intel. These problems were mounting before he took over; Intel lost Apple primarily because Krzanich prioritized diversity hiring over making chips that didn't fail. The 13th/14th gen Raptor Lake degradation issue was an entirely unnecessary mistake that, apparently, would have been avoided if somebody had just run some servers under load for a week straight before going to market.

The third is pissing a lot of money away on gaming GPUs that, as we've seen with both the Alchemist and Battlemage series, are reasonably well-designed hardware plagued by embarrassing software issues that a well-designed test suite should have caught. And I say that as somebody who has had a lot of responsibility for testing products.

But on the other hand, Lip-Bu really seems to have no ideas other than firing a lot of people, selling off critical business units, and taking credit for the good things Pat did. Oh, and I'm sure he'll hire a ton more cheap offshore labor, since that's what he did at Cadence.
 
Maybe the neural texture compression thing will strike 8 GB gold.
VRAM-friendly neural texture compression inches closer to reality — enthusiast shows massive compression benefits with Nvidia and Intel demos (archive)
Nvidia's demo shows the benefits of NTC for VRAM usage. Uncompressed, the textures for the flight helmet in this demo occupy 272 MB. Block compression reduces that to 98 MB, but NTC provides a further dramatic reduction to 11.37 MB.
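Back-of-envelope on those numbers (sizes taken straight from the article's demo; the little script is just for illustration):

```python
# Compression ratios from the Nvidia flight-helmet demo figures quoted above.
uncompressed_mb = 272.0   # raw textures
bc_mb = 98.0              # block compression (BCn)
ntc_mb = 11.37            # neural texture compression

print(f"BCn vs raw: {uncompressed_mb / bc_mb:.1f}x smaller")   # ~2.8x
print(f"NTC vs raw: {uncompressed_mb / ntc_mb:.1f}x smaller")  # ~23.9x
print(f"NTC vs BCn: {bc_mb / ntc_mb:.1f}x smaller")            # ~8.6x
```

An ~8.6x cut over plain block compression is why people think this could bail out 8 GB cards.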

OMG AMD, you have to save the 9060 XT 8 GB!

Enthusiast hacks FSR 4 onto RX 7000 series GPU without official AMD support, returns better quality but slightly lower fps than FSR 3.1
 
The B580 seems to be heavily CPU-constrained on Ryzens and pre-Alder Lake Intels. Nobody seems to have figured out what exactly the issue is. Even pretty good 11th gen Intels shit the bed, but the i5-12400 seems to do just fine. So pair it with either the most powerful Ryzen you can afford or a recent Intel.
Isn't it an issue with Resizable BAR, which isn't supported on older systems? I got a B570 hoping to use it as an eGPU, but Thunderbolt doesn't support ReBAR regardless of the BIOS setting, and performance was totally fucked. After I fried my NUC trying to rig up an NVMe connection, I built a Core Ultra 5 245K and it works pretty well.
 
Isn't it an issue with Resizable BAR, which isn't supported on older systems? I got a B570 hoping to use it as an eGPU, but Thunderbolt doesn't support ReBAR regardless of the BIOS setting, and performance was totally fucked. After I fried my NUC trying to rig up an NVMe connection, I built a Core Ultra 5 245K and it works pretty well.
It's not a ReBAR issue alone. It runs like total ass with ReBAR off, but even with it on, the CPU still has a pronounced performance impact. From what I've been able to find online, recent Intel CPUs don't have the issue (or at least it's not as bad), so your Core Ultra is fine. I am not a semiconductor engineer, so my best guess is that there is something unique in how Intel implemented PCIe from 12th gen onward that Battlemage is unintentionally dependent on.
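If anyone wants to double-check what the BIOS toggle actually did, here's a rough way to verify on Linux (the PCI address is a hypothetical placeholder, find yours with lspci; just a sketch):

```python
# Rough ReBAR sanity check on Linux: list the GPU's BAR sizes straight
# from sysfs. With Resizable BAR active, one BAR should roughly match
# VRAM size instead of the legacy 256 MB window.
from pathlib import Path

GPU = "0000:03:00.0"  # hypothetical PCI address; find yours with lspci

resource = Path(f"/sys/bus/pci/devices/{GPU}/resource")
for i, line in enumerate(resource.read_text().splitlines()[:6]):  # 6 BAR slots
    start, end, _flags = (int(x, 16) for x in line.split())
    if end > start:  # skip unused BAR slots (all zeros)
        print(f"BAR {i}: {(end - start + 1) / 2**20:.0f} MiB")
```

On Windows, GPU-Z reports ReBAR status directly.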
 
It's not a ReBAR issue alone. It runs like total ass with ReBAR off, but even with it on, the CPU still has a pronounced performance impact. From what I've been able to find online, recent Intel CPUs don't have the issue (or at least it's not as bad), so your Core Ultra is fine. I am not a semiconductor engineer, so my best guess is that there is something unique in how Intel implemented PCIe from 12th gen onward that Battlemage is unintentionally dependent on.
It may be more of a CPU issue: 12th gen reports different CPU IDs for E-cores and P-cores, and that messes with a lot of older games. ReBAR, however, is not going to give you 50% gains; it's in the 5-10% area on a good day. I doubt that setting is said person's issue.
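For the curious, here's a quick way to see that split on Linux (hybrid 12th-gen+ parts only; just a sketch):

```python
# On hybrid Alder Lake and later parts, the Linux kernel exposes P-cores
# and E-cores as two separate PMU devices. Software that assumes one
# uniform core type across the package can trip over exactly this split.
from pathlib import Path

for kind, label in (("cpu_core", "P-cores"), ("cpu_atom", "E-cores")):
    cpus = Path(f"/sys/devices/{kind}/cpus")
    if cpus.exists():
        print(f"{label}: CPUs {cpus.read_text().strip()}")
```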
 
  • Feels
Reactions: Fcret
It makes me sad how thoroughly dead Dennard scaling is. We were supposed to be at like 1 THz by now. :_(
Since I had already accepted it was all ogre, I've been pleasantly surprised by the rise of 4.5 GHz base clocks and ~6 GHz turbo.
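The one-formula version of why that happened (the textbook Dennard argument, my sketch): with scaling factor \(\kappa\), power density was invariant, so frequency could rise every node essentially for free:

```latex
% Dennard scaling: C, V, and linear dimensions all shrink by 1/k,
% frequency rises by k, area per transistor shrinks by 1/k^2.
\[
\frac{P}{A} = \frac{C V^2 f}{A}
\;\longrightarrow\;
\frac{(C/\kappa)\,(V/\kappa)^2\,(\kappa f)}{A/\kappa^2}
= \frac{C V^2 f}{A}
\]
```

Once voltage stopped scaling in the mid-2000s (threshold and leakage limits), holding power density flat meant frequency had to flatline instead. Hence ~6 GHz today rather than the 1 THz extrapolation.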

TSMC's 2nm N2 process node enters production this year, A16 and N2P arriving next year
[attached chart: gainz.webp]

Unless I'm misreading the chart, we could see ~7 GHz on GAAFET nodes like TSMC N2X (the rumored Zen 6 desktop node). By the way, these "X" nodes like N4X, N3X, and N2X are specifically optimized to squeeze out more performance while tolerating higher voltages. Hopefully that doesn't go wrong.

I'm more worried about transistor density and the transition to 3D everywhere than about clocks. I think TSMC has been putting out conservative estimates and then revising them up later, but the density gains for its first GAAFET transition look minimal. SRAM scaling hasn't completely died since 5nm, but it's slowed enough that stacking L3 cache on an older node, 3D V-Cache style, could soon become cheaper than including it on the leading-edge base die.

I also want to see the transition to 3D DRAM so we can move from $1.25/GB to $0.10/GB.
 
It may be more of a CPU issue: 12th gen reports different CPU IDs for E-cores and P-cores, and that messes with a lot of older games. ReBAR, however, is not going to give you 50% gains; it's in the 5-10% area on a good day. I doubt that setting is said person's issue.
Interesting. My NUC was a 12th gen and performed horribly in a modern game with the B570 and no ReBAR, but worked OK with the A770M. Too bad I fried it before I could investigate more.

I did test the same game on the new PC without ReBAR enabled, and it was fine, so there may be something to that. I haven't done much performance testing on P- vs E-cores; I just trust that Thread Director is doing its job.

In any case, I get the feeling Intel doesn't care much about graphics or gaming performance anymore unless it uses AI.
 
  • Feels
Reactions: Dawdler
Something's way off with those systems. I get like a 1% gain on a good day, and it gets instantly negated by a sporadic crash in Halo caused by Nvidia's shitty drivers with ReBAR on. That difference is almost as if the card were running in 2D mode.
 
In any case, I get the feeling Intel doesn't care much about graphics or gaming performance anymore unless it uses AI.
Intel cares about graphics performance insofar as they want to compete in the APU space with AMD. They don't really care about the tippy-top benchmark charts, but they would like to be able to move volume, and doing that in 2025 requires that your integrated graphics not be completely dogshit. I imagine that's a major reason their dGPU effort is still going on - it's effectively R&D for future APUs, semi-subsidized by dGPU sales to enthusiasts.
 
Something's way off with those systems. I get like a 1% gain on a good day, and it gets instantly negated by a sporadic crash in Halo caused by Nvidia's shitty drivers with ReBAR on. That difference is almost as if the card were running in 2D mode.
We're talking about an issue unique to Intel GPUs.
 