GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

What I want to know is how on earth is ANYONE counterfeiting 14th gen Intel chips? Aren't they produced solely in Intel-owned and controlled foundries? Couldn't this be a sign of a bigger problem where Intel's secrets are getting stolen?
If there are counterfeits then they'll be legitimate Intel processors that have in some way been faked to appear higher-end than they are. In the 90s there were a lot of counterfeit Pentium CPUs where the speed printed on the lid was fake. Since in those days you set your CPU speed by configuring jumpers on the motherboard, users of these fakes were basically overclocking them to the speed they saw on the lid, and sometimes it worked, sometimes it was an unstable mess.

How things have changed! Now even a genuine Intel product can be an unstable mess when running at the listed specs!

Edit: Went to read that Reddit post and they claim the products were "re-marked" (basically the above-mentioned scam?) and that one was a tray processor. What the fuck. A tray processor should have the same warranty anyway, it's the same bloody product in different packaging, and it wasn't even a tray processor in the first place! Fuck Intel, these guys are scumbags. Of course, after the guy gets 5k updoots on Reddit, Intel does a "second review" and approves his warranty claims.
 
Debating on getting this monitor
My monitor appears to be the same one but an older model, the S2721DGF. Here are my thoughts:
  • The monitor stand appears to be the same one they use for the UltraSharp monitors, just with a bit of gamer plastic on it. I haven't tried swapping in one of my UltraSharp stands but the mechanism looks and feels identical. This is a good thing: the UltraSharp monitors have good ergonomics, operate smoothly, and shit all over most gamer monitors.
  • Some dickhead implemented "DisplayPort Deep Sleep", which puts the monitor into such a deep standby when no input is detected that it is no longer considered plugged in. This caused bizarre stability issues for me on Linux, and on Windows 10 (11 is fine) it will move all windows to the primary display. There is no way to turn this off; the only workaround I have is setting a blank screensaver so the monitor is never truly off (there's a rough sketch of another approach at the end of this post). Absolutely retarded.
  • The display itself is clear and sharp, the anti-glare coating is not overdone, and I've had these for nearly a year, literally running 24x7 (thanks, DP Deep Sleep), with no issues or weirdness to speak of.
I bought a similar LG UltraGear 27GP850-B prior to the Dells and ended up selling it to my brother after getting one of these Dells as a sample (ordered 2 more afterwards, all at the sale price). The LG monitor has a flimsy as fuck stand, an external power brick, and the same DP Deep Sleep bullshit going on. The Dell monitor on sale basically shits all over LG's retarded bullshit.
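If you'd rather not rely on a screensaver, here's a rough sketch of the same idea on Windows: ask the OS to keep the display awake via the Win32 SetThreadExecutionState call. The flag values are the documented ones; treat the rest as illustration rather than something I've battle-tested.

```python
import ctypes
import time

# Documented Win32 execution-state flags for SetThreadExecutionState.
ES_CONTINUOUS       = 0x80000000
ES_DISPLAY_REQUIRED = 0x00000002

kernel32 = ctypes.windll.kernel32

# Tell Windows the display is actively required, so it never idles off
# and the monitor never drops into DP Deep Sleep.
kernel32.SetThreadExecutionState(ES_CONTINUOUS | ES_DISPLAY_REQUIRED)

try:
    while True:
        time.sleep(60)  # the request only holds while this process is alive
except KeyboardInterrupt:
    # Restore normal display power management on exit.
    kernel32.SetThreadExecutionState(ES_CONTINUOUS)
```

Same net effect as the blank screensaver: the panel is never allowed to go fully dark, so it draws more power than a real standby would.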
 
  • Informative
Reactions: WelperHelper99
There's no real point in releasing an upgraded product line if your only competitor is effectively out of business in the short term.
There is if you're targeting a market that values having the latest stuff and is willing to upgrade just because it's better than what they have, and the home-builder PC market is exactly that. In such a market you have an incentive to release new products even if you have no competitors. If your local drug dealer was the only one in the area, do you think he'd stop selling drugs?

That aside, there's little good reason to manufacture two separate generations of chips alongside each other beyond a transitional period, so if you're choosing between never progressing and progressing, you're going to progress, for a variety of reasons: the aforementioned upgrade market; the fact that lead time for improvement is long, and if you DO need to become competitive again you can't just decide to be so at a month's notice - it takes years to develop new iterations; the fact that Intel isn't AMD's only competition - they have both Apple and now Snapdragon, which is far more power efficient than AMD's last generation and now viable for a lot of use cases; and the fact that you have to use talent to retain talent. If AMD decided "we're done innovating" they're not going to pay high salaries for their best and brightest to do nothing. Those people would leave for competitors or other careers and they'd be hard to get back.
 
I've heard something from a former AMD engineer, dunno how accurate it is, describing the turnaround time for CPUs.

'If you want something to be in next year's processor, suggest it three years ago.'
And that's not because it's like the government where it's just inefficient. It's because these are mind-bogglingly complex projects with very intricate inter-dependencies between the different parts. You absolutely MUST have your ducks in a row early on. I don't work in chip manufacture but I've led parts of some very complex software projects and even there you have to get things nailed down very early to avoid big delays. With hardware I imagine you can have some tolerances along the lines of "we can probably achieve X performance for the IMC but it might be as low as Y" that other teams can plan around and refine requirements as you go, but it's not like a small software app where you can say "oh, we decided we wanted to add this function, you guys are going to have to change your parts". Not when you're designing silicon and getting a test sample is a major multi-company project in itself.
 
And that's not because it's like the government where it's just inefficient. It's because these are mind-bogglingly complex projects with very intricate inter-dependencies between the different parts.

Three years to go from concept to production for a physical object is really fast. In the auto industry, concept to production is 6-8 years. In aircraft, it's even longer.

but it's not like a small software app where you can say "oh, we decided we wanted to add this function, you guys are going to have to change your parts". Not when you're designing silicon and getting a test sample is a major multi-company project in itself.

A lot of the design happens in simulation tools like IC Validator. But since it's all so interconnected, and the laws of physics govern the entire chip (voltage and current limits and the like), yeah, you can't just change one module that easily.
 
There is if you're targeting a market that values having the latest stuff and is willing to upgrade just because it's better than what they have, and the home-builder PC market is exactly that. In such a market you have an incentive to release new products even if you have no competitors. If your local drug dealer was the only one in the area, do you think he'd stop selling drugs?

That aside, there's little good reason to manufacture two separate generations of chips alongside each other beyond a transitional period, so if you're choosing between never progressing and progressing, you're going to progress, for a variety of reasons: the aforementioned upgrade market; the fact that lead time for improvement is long, and if you DO need to become competitive again you can't just decide to be so at a month's notice - it takes years to develop new iterations; the fact that Intel isn't AMD's only competition - they have both Apple and now Snapdragon, which is far more power efficient than AMD's last generation and now viable for a lot of use cases; and the fact that you have to use talent to retain talent. If AMD decided "we're done innovating" they're not going to pay high salaries for their best and brightest to do nothing. Those people would leave for competitors or other careers and they'd be hard to get back.
Retail and enterprise clients are a small slice in current year. From what I was told, datacenters are buying up entire lots. Those customers aren't going to wait, they need chips in blades next week.

And AMD only needs to pause shipments of new product for a few months, and sell current product that is cheaper to manufacture. I said short term, not years out.
I've heard something from a former AMD engineer, dunno how accurate it is, describing the turnaround time for CPUs.

'If you want something to be in next year's processor, suggest it three years ago.'
Chips take around 3-4 years from design to tape-out [to the fab]. Tape-out also doesn't mean everything's good to go; there's always the possibility of fabrication defects. After the design is sent to the fab, product segmentation, marketing, and sales begin. The total process is usually 5 years.

Building a new fab takes even longer.
 
  • Like
Reactions: Brain Problems
Retail and enterprise clients are a small slice in current year. From what I was told, datacenters are buying up entire lots. Those customers aren't going to wait, they need chips in blades next week.

If you were told that datacenters are buying up all the desktop CPUs, you were told wrong. The "desktop CPU" market is technically both laptop and desktop SKUs, and IIRC that market is around 50-50. Maybe 60-40. Regardless of what the exact split is, laptop CPUs aren't being gobbled up by datacenters. I highly doubt that a large share of the actual desktop CPUs are being bought to power servers in datacenters, either.
 
Retail and enterprise clients are a small slice in current year. From what I was told, datacenters are buying up entire lots. Those customers aren't going to wait, they need chips in blades next week.
Datacentres are enterprise customers, and one of the biggest things they care about is power efficiency - you know what AMD's new generation brings? Large gains in efficiency. I don't really see what you're arguing. Your statement seemed to be that there is no reason for AMD to release new chips right now given the dire state of Intel. You've been given multiple reasons why it is in AMD's interests to do so and it feels like you're just arguing to be right.


And AMD only needs to pause shipments of new product for a few months, and sell current product that is cheaper to manufacture. I said short term, not years out.
What makes AMD need to pause shipments? They're ready to go now, and I guarantee they would not be launching if it didn't make them money. You're arguing they should prolong the current generation because it's "cheaper to manufacture". A lot of the cost is front-loaded: R&D, testing, taping out the new designs. When you've invested a huge amount up front in a new product you want to recoup that investment. If they can charge more for the new line (and they are), if the new line gives people more reasons to buy it (which it does - AVX-512 and a big leap forward in power efficiency, for a start), if there is a competitor that will otherwise gain market share at their expense (Snapdragon has a very significant battery life advantage in laptops and is perfectly fine for a very large portion of the market), or an existing competitor whose chips are markedly superior in some ways (Apple), then those are good reasons not to sit around on their investment.

I'm pretty sure the bean counters at AMD have crunched the numbers very thoroughly and have access to a lot of data you don't. AMD are a for-profit company and they know what they're doing. You can disagree with me if you like but I'll live with it - I'm not going to type out the same arguments a third time. Take it or leave it.
 
Somehow I think AMD will become more common in laptops.
They don't have the volume for it, and I get the impression OEMs would rather go with Qualcomm's utter garbage before they switch over to AMD, even though Zen 5 looks to beat everyone in performance and is behind only Apple in efficiency (and the difference there is getting slim, which is remarkable considering how deeply Apple Silicon has been tailored for efficiency). Apple are barely legacy compatible with themselves, while AMD retain legacy compatibility all the way back to the 8086. In theory you could solder an EPYC 9755 into the original IBM PC and boot it into BASIC.
 
  • Thunk-Provoking
Reactions: Brain Problems
It is a category error to think a microcode bug in a desktop CPU gives AMD a significant additional advantage in the datacenter. That is not in any way to discount how much market share EPYC has taken away from Xeon, or how Instinct is starting to whittle away at NVIDIA; I'm just saying this bug is not terribly relevant to the datacenter business.

AMD's datacenter edge has largely come from Intel being behind TSMC, nothing more and nothing less. They sell out every Zen 2, 3, and 4 chip they can make. Intel might be hitting parity this year, though. Sierra Forest and Granite Rapids are both Intel 3 chips. We'll see how they stack up against Zen 5 soon enough.
 
  • Like
Reactions: Brain Problems
Pretty sure AMD found an excuse to wait, to allow Intel to dig its own grave deeper and possibly get 13th/14th gen kicked off some of the chart comparisons for the upcoming launch.

In fairness they are broken garbage CPUs, and if they wanted to avoid that they shouldn't have sold bad batches from the production issues 'identified in late 2022' or whatever their latest claim was. Completely nuts that they supposedly identified the issue, sold millions of bad chips anyhow, and as far as anyone knows are still doing so.
 
  • Like
Reactions: Brain Problems
If you were told that datacenters are buying up all the desktop CPUs, you were told wrong. The "desktop CPU" market is technically both laptop and desktop SKUs, and IIRC that market is around 50-50. Maybe 60-40. Regardless of what the exact split is, laptop CPUs aren't being gobbled up by datacenters. I highly doubt that a large share of the actual desktop CPUs are being bought to power servers in datacenters, either.
I was told entire lots (I presume he meant crates) of chips were bought by cloud providers, and that smaller datacenters (I presume he meant an independent firm in a single building) have used desktop CPUs in order to fulfill their demand in time. It's always possible he was wrong, but I have reason to trust him.
I don't really see what you're arguing. Your statement seemed to be that there is no reason for AMD to release new chips right now given the dire state of Intel. You've been given multiple reasons why it is in AMD's interests to do so and it feels like you're just arguing to be right.
You can disagree with me if you like but I'll live with it - I'm not going to type out the same arguments a third time. Take it or leave it.
I do not understand the animosity here; I haven't attacked anything you wrote. It looks like the launch is on for this month, so I'm likely to be proven incorrect anyway.
 
  • Like
Reactions: Brain Problems
I was told entire lots (I presume he meant crates) of chips were bought by cloud providers, and that smaller datacenters (I presume he meant an independent firm in a single building) have used desktop CPUs in order to fulfill their demand in time. It's always possible he was wrong, but I have reason to trust him.

A crate is nothing. Intel moves 50 million desktop & notebook CPUs in a quarter. There are around 3 million servers shipped in that amount of time. Even if Xeon & EPYC sales fell to zero, the server market isn't big enough to absorb even 10% of just Intel's desktop shipments.
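For scale, here's the back-of-envelope against the client figure above; sockets per server is my own guess, not a number from anywhere:

```python
# Rough numbers from this post; sockets-per-server is an assumption.
client_cpus = 50_000_000  # Intel desktop + notebook CPUs per quarter
servers     = 3_000_000   # servers shipped per quarter

for sockets in (1, 2):
    server_cpus = servers * sockets
    print(f"{sockets} socket(s)/server: {server_cpus:,} CPUs, "
          f"{server_cpus / client_cpus:.0%} of Intel's client volume")

# 1 socket(s)/server: 3,000,000 CPUs, 6% of Intel's client volume
# 2 socket(s)/server: 6,000,000 CPUs, 12% of Intel's client volume
```

Even at the generous end it's a small fraction of client volume, and that's pretending every server shipped used desktop silicon.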
 
  • Informative
Reactions: Brain Problems
If you notice your temps getting wonky it's good to check on it. Some stuff used to dry out or pump out. Less common nowadays.

If temps are fine, leave it alone.
I think it's fine then. I replaced my GPU's grease for the first time in 8+ years and it's much better, and since I have leftover grease I was thinking whether I should give the CPU a fresh coat. Better not, since I remember having issues with the CPU's backplate. Reminds me to check my temps though, brb.
 