GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

AMD getting absolutely dumpstered in sales. Consumers really want DLSS, especially on these low-end and midrange cards, and AMD got caught with its pants down with FSR not being competitive with DLSS. So Nvidia doesn't need to compete on price as much as PC Gaming YouTube seems to think it does.
At least the chart shows RX 7000 taking them from 12% to 19%.
[Chart: GPU add-in-board market share, 2002 to Q4 2023]
 
At least the chart shows RX 7000 taking them from 12% to 19%.

It shows AMD taking a giant dump ahead of the 40 series arriving, and the 7000 series so far not managing to get back to even close where they were before.

What is interesting here is that the GeForce 40 series, which has been pretty widely panned by the hardware YouTubers, has bounced dGPU sales back to the halcyon days of, uh, well, 2020, moving 7.6 million units in Q4 2023. So dGPUs still haven't broken out of their secular decline, but to hear GamersNexus and that zit-faced kid tell it, NVIDIA is desperate and failing because the 4060's memory bus isn't wide enough or something. In reality, they've had their best quarter since Q1 2022.


Question is, is this bounce back from free-fall temporary or sustainable?

 
but to hear GamersNexus and that zit-faced kid tell it, NVIDIA is desperate and failing because the 4060's memory bus isn't wide enough or something
PC youtube spent half a decade showing off converted office PCs but is shocked that one of the few competent cards of this gen that can fit in one is actually a sleeper success. That certainly tracks.

Vex is young and retarded. He doesn't really have the perspective of some of these other dudes so I can kinda excuse him. The Moore's Law is Dead podcast with him was a bit tiresome to listen to though.

Question is, is this bounce back from free-fall temporary or sustainable?
I lean more towards the latter only because I feel like the console industry is kind of in the shitter right now (even as Steam breaks records). Kids these days don't seem to think consoles are cool and that's a pretty big cultural shift from when I was a kid, so who knows what things are going to look like going forward.
 
It shows AMD taking a giant dump ahead of the 40 series arriving, and the 7000 series so far not managing to get back to even close where they were before.

What is interesting here is that the GeForce 40 series, which has been pretty widely panned by the hardware YouTubers, has bounced dGPU sales back to the halcyon days of, uh, well, 2020, moving 7.6 million units in Q4 2023. So dGPUs still haven't broken out of their secular decline, but to hear GamersNexus and that zit-faced kid tell it, NVIDIA is desperate and failing because the 4060's memory bus isn't wide enough or something. In reality, they've had their best quarter since Q1 2022.


Question is, is this bounce back from free-fall temporary or sustainable?

Well I pulled the trigger on a $500 4060 Ti 16 GB. Yes, the 4070 is ((only)) $40 more but I refuse to spend more on a video card than a fucking console. This is for a new build so I don’t even have an old card to limp around on, but no doubt the prices aren’t going to get any better. The market is going to shift to lower volume, more enthusiast builds, and AI (for profit) which means price hikes.

Literally buying entry-level for the price of a PS5, holy Jesus. What are ‘normal’ people doing? Oh, they’re not buying shit. Only retards like me.
 
Well I pulled the trigger on a $500 4060 Ti 16 GB. Yes, the 4070 is ((only)) $40 more but I refuse to spend more on a video card than a fucking console. This is for a new build so I don’t even have an old card to limp around on, but no doubt the prices aren’t going to get any better. The market is going to shift to lower volume, more enthusiast builds, and AI (for profit) which means price hikes.

Literally buying entry-level for the price of a PS5, holy Jesus. What are ‘normal’ people doing? Oh, they’re not buying shit. Only retards like me.
Most gamers aren't even buying lower-end new GPUs. Just look at the Steam hardware survey: over 50% of it is fairly old 1050/60s, 1650/60s, 2060s, and 3050/60s. If they're buying new at all, they're considering things like sub-$200 3050s.

What is interesting here is that the GeForce 40 series, which has been pretty widely panned by the hardware YouTubers, has bounced dGPU sales back to the halcyon days of, uh, well, 2020, moving 7.6 million units in Q4 2023. So dGPUs still haven't broken out of their secular decline, but to hear GamersNexus and that zit-faced kid tell it, NVIDIA is desperate and failing because the 4060's memory bus isn't wide enough or something. In reality, they've had their best quarter since Q1 2022.

Question is, is this bounce back from free-fall temporary or sustainable?
If their financial reports are anything to go by, they have been flat in gaming revenue since the previous quarter, but they have been extremely good at increasing their margin by over 10% over the year. Not that this sector matters to them anymore since gaming revenue has gone from roughly 1/2 of their data center revenue a year ago to less than 1/5 now.
 
What are ‘normal’ people doing? Oh, they’re not buying shit. Only retards like me.

I go used for just about everything, but haven't bothered to upgrade really anything since covid and even then, I'm rocking AM4 and a 1660 Ti. Game-wise, there's nothing really interesting to warrant an actual upgrade.

Granted, the retard in me is always looking to potentially upgrade because why not, but if I do, it'll be to a 3060 or a 6700xt. Used, of course.
 
Most gamers aren't even buying lower-end new GPUs. Just look at the Steam hardware survey: over 50% of it is fairly old 1050/60s, 1650/60s, 2060s, and 3050/60s. If they're buying new at all, they're considering things like sub-$200 3050s.


God bless those 'Oliver Twist' children of the Pandemic.

The resistance and latent anger over pandemic shortages and cost will keep the minimum specs rock bottom for another generation to come. Shortages or no shortages or fake shortages. "Please sir, may I have another 3060".
 
I go used for just about everything, but haven't bothered to upgrade really anything since covid and even then, I'm rocking AM4 and a 1660 Ti. Game-wise, there's nothing really interesting to warrant an actual upgrade.

Granted, the retard in me is always looking to potentially upgrade because why not, but if I do, it'll be to a 3060 or a 6700xt. Used, of course.
Yep. As Nvidia drags GPU prices ever higher, more people are just gonna go cheap used or just check out of PC gaming.

Thankfully most new games suck anyways.
 
Wendell benchmarking a new 128-core CPU with 12 memory channels. This is a super interesting piece of hardware for servers and the highest-end workstations, so naturally he tests it with… Windows and Counter-Strike. Some interesting flaws in Windows' scheduler are discovered and explained.
 
Wendell benchmarking a new 128-core CPU with 12 memory channels. This is a super interesting piece of hardware for servers and the highest-end workstations, so naturally he tests it with… Windows and Counter-Strike. Some interesting flaws in Windows' scheduler are discovered and explained.
We could discover a futuristic alien supercomputer and these dudes would still try to run Windows on it. I'm routinely shocked by how little knowledge these 'PC gurus' of youtube have about anything that isn't some GUI program for Windows.
 
We could discover a futuristic alien supercomputer and these dudes would still try to run Windows on it. I'm routinely shocked by how little knowledge these 'PC gurus' of youtube have about anything that isn't some GUI program for Windows.
Wendell installs Windows to demonstrate why that's not a good idea with a CPU like this: he gets CPU-bound running Counter-Strike at 50 fps, which is impressively bad when laptops average like 400 fps in that game. Unlike someone like Linus, who would install Windows completely seriously and then be shocked when it turns out server hardware and a consumer operating system are not a great combination, Wendell actually knows what he's talking about and chose this ridiculous configuration to make a point. I can't think of any tech youtubers other than Wendell and Ian Cutress who would even know what NUMA is, let alone be able to speak about it without a prepared script. Intel will be pushing up against Windows' hard limits with their consumer processors shortly, so demonstrating the issue they're about to have with an AMD server CPU makes sense.
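To make the "Windows' hard limits" part concrete, here's a quick sketch (my own, nothing from the video) of how you'd query the processor groups Windows carves a big chip into. Each group tops out at 64 logical processors, and software that never touches the multi-group APIs has historically stayed pinned inside one group, which is exactly where old schedulers and games start to choke:

```cpp
// Rough sketch, not from the video: Windows splits large core counts into
// "processor groups" of at most 64 logical processors each. A 128-core /
// 256-thread part therefore shows up as several groups, and code written
// before the multi-group APIs existed historically ran inside a single group.
#include <windows.h>
#include <cstdio>

int main() {
    WORD groups = GetActiveProcessorGroupCount();
    DWORD total = GetActiveProcessorCount(ALL_PROCESSOR_GROUPS);
    std::printf("processor groups: %u, total logical processors: %lu\n",
                static_cast<unsigned>(groups),
                static_cast<unsigned long>(total));
    for (WORD g = 0; g < groups; ++g) {
        std::printf("  group %u: %lu logical processors\n",
                    static_cast<unsigned>(g),
                    static_cast<unsigned long>(GetActiveProcessorCount(g)));
    }
    return 0;
}
```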
 
Wendell benchmarking a new 128-core CPU with 12 memory channels. This is a super interesting piece of hardware for servers and the highest-end workstations, so naturally he tests it with… Windows and Counter-Strike. Some interesting flaws in Windows' scheduler are discovered and explained.

He mentions Comsol & Ansys at the end, so he does know what he's doing. And yes, I've benchmarked MKL and AOCL on AMD CPUs...and MKL was faster.
 
He should benchmark OpenFOAM on it, I hate everybody and everything forever.
Tell him in the comments. It’s still an unlisted video, there’s a good chance he’ll reply. Or you can ask him to bench your workload on the forums, he’s usually very approachable about that stuff.
 
Tell him in the comments. It’s still an unlisted video, there’s a good chance he’ll reply. Or you can ask him to bench your workload on the forums, he’s usually very approachable about that stuff.

I edited before you posted this, should have read the whole thread and watched the video, but I have the attention span of a gnat.
 
Well I pulled the trigger on a $500 4060 Ti 16 GB. Yes, the 4070 is ((only)) $40 more but I refuse to spend more on a video card than a fucking console. This is for a new build so I don’t even have an old card to limp around on, but no doubt the prices aren’t going to get any better. The market is going to shift to lower volume, more enthusiast builds, and AI (for profit) which means price hikes.

Literally buying entry-level for the price of a PS5, holy Jesus. What are ‘normal’ people doing? Oh, they’re not buying shit. Only retards like me.
I'm still running my 2070 non super that I got for $400 a few years ago. 8gb hits the wall in Blender sometimes when particles are used, but nothing that isn't a show stopper.
 
@The Ugly One what’s your take on single-thread vs. threaded performance? Obviously we have tasks like compilation and rendering that make really good use of threads, but I’m thinking more about media work: Adobe shit and audio applications demand single thread performance. Audio especially seems to really be pinned down to single-thread workloads. I think a lot of productivity stuff like Adobe relies on continued improvements to CPUs. They’ve never been forced to actually develop fast code like what happened to video game developers with the recent consoles.

There is a tremendous amount of terribly written application and driver code that just throws everything on a single thread (or a few, poorly optimized). Often this is specialized stuff bottlenecked by some slowass I/O shit anyway, but I think it shows that, for general-purpose computing, single-thread performance is king.
 
@The Ugly One what’s your take on single-thread vs. threaded performance? Obviously we have tasks like compilation and rendering that make really good use of threads, but I’m thinking more about media work: Adobe shit and audio applications demand single thread performance. Audio especially seems to really be pinned down to single-thread workloads. I think a lot of productivity stuff like Adobe relies on continued improvements to CPUs. They’ve never been forced to actually develop fast code like what happened to video game developers with the recent consoles.

There is a tremendous amount of terribly written application and driver code that just throws everything on a single thread (or a few, poorly optimized). Often this is specialized stuff bottlenecked by some slowass I/O shit anyway, but I think it shows that, for general-purpose computing, single-thread performance is king.

Please no, this is my worst autism trigger.

Okay, so, if my choice is a 2 GHz core or a pair of 1 GHz cores, I always want the 2 GHz core. The reason is simple: it's hard to break up tasks into homogeneous chunks that can be parceled out to workers operating in parallel. This isn't even unique to computers; it's a fundamental difficulty in manufacturing at scale as well.
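To put rough numbers on that (illustrative only, not any particular workload), Amdahl's law says that if only a fraction p of the work parallelizes, N slower cores struggle to beat one fast one:

```cpp
// Back-of-the-envelope Amdahl's law: with a parallelizable fraction p,
// N cores at clock f deliver roughly f / ((1 - p) + p / N) "effective GHz"
// relative to a single core at clock f. The numbers below are made up purely
// to illustrate the 2 GHz vs. two-1 GHz point, nothing more.
#include <cstdio>

double effective_ghz(double clock_ghz, int cores, double parallel_fraction) {
    return clock_ghz / ((1.0 - parallel_fraction) + parallel_fraction / cores);
}

int main() {
    std::printf("1 x 2 GHz, 50%% parallel: %.2f effective GHz\n",
                effective_ghz(2.0, 1, 0.5));   // 2.00
    std::printf("2 x 1 GHz, 50%% parallel: %.2f effective GHz\n",
                effective_ghz(1.0, 2, 0.5));   // 1.33
    std::printf("2 x 1 GHz, 90%% parallel: %.2f effective GHz\n",
                effective_ghz(1.0, 2, 0.9));   // 1.82 -- still loses
    return 0;
}
```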

However, the reality is that, due to the breakdown in Dennard scaling about 18 years ago, single cores aren't getting a lot faster. Furthermore, power consumption increases like the square of clock speed - a 5 GHz core consumes 4x the power of a 2.5 GHz core. Unfortunately, there are a lot of Luddites and outright retards gatekeeping performance-critical code bases (source: my depressing career), and these dinosaurs have done an excellent job of ensuring that consumers are not able to enjoy the performance gains of AVX-2 and many-core architectures.

Intel and AMD have unpaid consultants that are willing to review your code and hold your fucking hand to max out CPU performance, and the answer they get most often from big ISVs is to get the door slammed in their faces. Look, I know a guy who, just a couple years ago, got the go-ahead to fix low-level performance in an old code base, we're talking at a billion dollar company, and he got 10x-100x speedups across the board merely by doing Software 101 shit to their data structures. They're now cleaning up and destroying their competition, but what's fucking sad is they could have been doing this for 15 years. It took getting a younger guy in charge of the division who told all the old programmers that if they didn't like his protege touching their code, he would fire them (he did fire a couple people who bitched). That's how incredibly bad commercial software actually is, and it's 100% driven by principal engineer butthurt when you tell them their 1998 architecture is outdated. I mean I could personally probably dig into whatever software you're bitching about and get a massive speedup, but what would happen in reality is I'd get fired because I hurt the fee-fees of some 55-year-old whose entire self-image is based on telling himself he's the best programmer alive.
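For anyone wondering what "Software 101 shit to their data structures" even looks like, here's a toy illustration of the general pattern (obviously not that company's actual code):

```cpp
// Toy illustration of a "Software 101" data-structure fix, not anyone's real
// codebase: a hot path doing O(n) linear scans over a vector is replaced by
// O(1) average-case lookups in a hash map built once up front. On big inputs
// this single change is the sort of thing that yields order-of-magnitude wins.
#include <cstdio>
#include <string>
#include <unordered_map>
#include <vector>

struct Part { std::string id; double price; };

// Before: every lookup walks the whole vector.
double price_slow(const std::vector<Part>& parts, const std::string& id) {
    for (const auto& p : parts)
        if (p.id == id) return p.price;
    return 0.0;
}

// After: build an index once, then each lookup is a single hash probe.
double price_fast(const std::unordered_map<std::string, double>& index,
                  const std::string& id) {
    auto it = index.find(id);
    return it != index.end() ? it->second : 0.0;
}

int main() {
    std::vector<Part> parts = {{"gpu-4060", 299.0}, {"cpu-7800x3d", 349.0}};
    std::unordered_map<std::string, double> index;
    for (const auto& p : parts) index.emplace(p.id, p.price);

    std::printf("slow: %.2f  fast: %.2f\n",
                price_slow(parts, "cpu-7800x3d"),
                price_fast(index, "cpu-7800x3d"));
    return 0;
}
```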

At a recent job, I left a 2x speedup unmerged because my boss went absolutely apoplectic. Like, red-faced and shouting. He was furious for no reason other than it meant retiring his code and replacing it with a FOSS library that is about 3x faster than his (the part I would have replaced would have been 3x faster, for 2x overall application speedup).

GPUs are forcing the issue, though. A workstation GPU has 2x-10x the performance of the highest-end CPU, depending on workload, and this performance is so obvious that senior developers are no longer able to bullshit their way out of having to do real work (reminder that senior devs are the laziest motherfuckers on the planet, marketing people work harder than they do). See, in the past, management could be convinced with enough impenetrable technofog that multicore, SIMD, etc, weren't really good technologies, and really, our software is so perfect that Intel just can't keep up. But now, everyone's watching NVIDIA, and the order's coming down, get your shit on GPU or work somewhere else, faggot. And guess what, running on GPU means threading your code. In fact, GPU threading is even more restrictive than CPU threading, meaning that your job is even harder if you've been avoiding threads for the past 20 years, but cry me a fuckin' river, better yet, kill yourself.
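If "GPU threading is even more restrictive" sounds abstract, this is roughly the shape your loops have to take. The sketch below uses standard C++ parallel algorithms on the CPU (my own illustration, nothing GPU-vendor specific), but the contract is the same one a GPU imposes: each element's work must be independent, with no locks, no shared mutable state, and no assumptions about ordering.

```cpp
// Illustrative only: std::execution::par_unseq forbids locks and ordering
// dependencies inside the loop body, which is essentially the same contract a
// GPU kernel demands -- independent per-element work plus an associative
// reduction. (On GCC/libstdc++ the parallel policies need TBB, i.e. -ltbb.)
#include <cstdio>
#include <execution>
#include <functional>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> x(1 << 20, 0.5);

    // Independent per-element transform + associative reduction, no shared state.
    double sum_of_squares = std::transform_reduce(
        std::execution::par_unseq, x.begin(), x.end(),
        0.0, std::plus<>{},
        [](double v) { return v * v; });  // pure function of one element

    std::printf("sum of squares: %f\n", sum_of_squares);  // expect 262144.0
    return 0;
}
```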

Right, where was I? So yes, if you want something that has across-the-board great performance, you are going to need both very fast single-core performance and a lot of cores (not to mention fast storage and a lot of memory bandwidth). That's why, unlike server-based EPYCs, workstation-oriented Threadripper PRO clocks up to 5.1 GHz. You're going to have some dogshit software you interact with during your workday that can't profitably make use of 8 cores, let alone 64, so you need that clock speed.
 
Now that I'll be working, I've been thinking about my next computer setup and wanted your input on which direction to go. I sold my main gaming computer (RX 5700 level) for some much-needed cash a while back and have been using an Intel 6th gen Lenovo X1 Yoga that really struggles at times. Right now I have a work iMac on my desk that I put a laptop in front of. So far I've narrowed my options down to four; they may not be 100% what I want, but I think any of them would work:

1. Buy a used mini PC for $50 and swap it for the computer by the TV, an MSI Trident 3 with Nvidia 1660 graphics. This would be the cheapest option, but it would still require some equipment like a KVM and figuring out how I'm going to use the iMac as a display or whether to put a monitor in front of it.

2. A 14" ultraslim laptop - ideally a ThinkPad - that has a 12th gen Intel CPU or equivalent AMD with a very powerful iGPU. This would give me a very flexible system with a similar setup to what I have now, but I wonder if iGPU tech is good enough to run games like Baldur's Gate 3.

3. A 16" gaming laptop - ideally a ThinkPad or Legion or another brand known for quality hardware - that has the specs guaranteed to run the games I want at full speed, with Baldur's Gate 3 being the most recent and power-intensive game I'd like to play. This probably wouldn't be as portable as I'd like.

4. A handheld PC like the Legion Go. This could be set up as a desktop with a dock so I'd need the KVM, but it would be very portable allowing for couch gaming and such on the go.

No matter what, I would like to stay under $1,000. I'm willing to buy used as long as I can be certain that the used computer is in like-new condition and everything works properly.
 