GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Just be mindful that the LGA 1700 motherboards come in two RAM form factor types -- DDR4 or DDR5. They aren't interchangeable, so you're going to have to pick a flavor and stick with it.
Thanks for the tip, I've already picked DDR4. Nothing I do comes even remotely close to using the bandwidth of DDR5.
 
  • Like
Reactions: bifftango
Turns out I was wrong :( AMD won't be taking the efficiency crown this generation, at least not with chiplets. AIBs may end up delivering the "3 GHz" that AMD promised; TPU overclocked an ASUS card above 3 GHz and the results turned out to be interesting:
View attachment 4064166

A deeper pipeline, double the shader engines, shared resources between shaders, "built for 3GHz"...

:thinking:
I'm not surprised to see these results, and you'll probably see other partners pulling shit like this, because AMD is less ass than Nvidia now about what partners can do with their boards. There's a reason EVGA pulled the plug, and I don't blame them. If I were our Taiwanese overlords I'd be shaking hands with EVGA right now, because they asspulled some weird performance tricks out of nowhere.

Wouldn't doubt it. I'm getting pretty tired of this shit. You can just tell they're itching to say "If you ain't got $1600 for a gpu, go fuck yourself".
That's what they get for appealing to crypto miners.
 
I forgot to mention this as well. Being on the supported hardware list is also the thing that pushes Quadro sales.

Apart from that, things like 10-bit color support are also limited to Pro GPUs.
Yeah, driver certification is a big thing.

Nvidia begrudgingly enabled 10bpp a couple of years back because of OLED/HDR so it was no longer locked to Quadro. AMD did the same.
photoshopradeon.JPG
Took this screenshot a couple of seconds ago in the Photoshop settings. I can't take a screenshot of the other menu because the Radeon Pro control panel has aborted itself again, because Windows updated the drivers against my will... Time to uninstall, restart, install, restart, maybe restart again if the sound is gone.
 
  • Informative
Reactions: JoseRaulChupacabra
I got a Lenovo Thinkpad X1 Tablet (Gen 2) because I'm apparently a hoarder who can't stop himself.

A few short thoughts - the famed Thinkpad build quality is really not what it once was, but then again, this still isn't bad at all. Kaby Lake has a big advantage over older Intel systems like this one: support for VP9/H.265 10-bit decoding in hardware. (Even if 10-bit output isn't really a thing in Linux, 10-bit is what you'll encounter more often with H.265 encodes, since it's more efficient.) 3:2 is a really nice format for a screen if you're a terminal dweller, and this is a really nice screen with good contrast and a subjectively very good black level for an IPS.

I really like that you can fully control battery charging with Thinkpads in Linux (a quick sketch of the sysfs knobs is below). This will really help your battery live longer if you often run it connected to AC.

If you read reviews online, the opinions are mixed, which I think is mainly because Windows is no Android, let alone iOS: for a fat OS like Windows, a dual-core Kaby Lake was already kind of weak back then, will peg the CPU, and will suck the battery dry in no time. See it as a Pi-esque computer built into a small (but not that small) high-quality touchscreen and battery, which you can use with whatever keyboard you prefer, and it suddenly looks a lot more useful.

Lenovo, contrary to Fujitsu, didn't bother with RAPL limits, so the CPU will just boost until it runs into temperature limits and the case gets scalding hot. If limited to reasonable values, it consistently runs a few hundred MHz higher than the Fujitsu at roughly the same average temperature. Is winning the burst benchmark really worth delivering a computer to your customer that always feels like it's overheating (a possibly handheld tablet at that)? You decide. Otherwise both computers consume about the same, with the Thinkpad more power-efficient in video decoding and *maybe* GPU tasks.
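For anyone who wants to poke at the charging control themselves, here's a minimal Python sketch using the sysfs attributes the thinkpad_acpi driver exposes. The battery name (BAT0) and the 40/80 values are just illustrative assumptions, and writing the thresholds needs root:

# Minimal sketch: ThinkPad charge thresholds via sysfs (thinkpad_acpi).
# Assumes the battery shows up as BAT0; run as root to write thresholds.
from pathlib import Path

BAT = Path("/sys/class/power_supply/BAT0")

def read(name: str) -> str:
    return (BAT / name).read_text().strip()

def set_thresholds(start: int, stop: int) -> None:
    # Start charging only below `start`%, stop charging at `stop`%.
    (BAT / "charge_control_start_threshold").write_text(str(start))
    (BAT / "charge_control_end_threshold").write_text(str(stop))

print("capacity:", read("capacity"), "%")
print("status:  ", read("status"))
set_thresholds(40, 80)   # example values for a machine that mostly sits on AC

Capping the charge somewhere around 80% is the usual trick for a machine that lives plugged in.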

So, interestingly, this Thinkpad can do something the Fujitsu can't, for reasons I couldn't figure out (probably either firmware or internal USB hub related): enter Pkg-C10. I actually had problems measuring the power consumption accurately at that point because it consumes so incredibly little in that state, but it's about 1-1.5 W if I'm generous, with about 10-19 wakeups a second. Yes, the screen is off and the Thinkpad keyboard (which is very power-hungry for a keyboard, and also incredibly flimsy) is disconnected, but this is still a computer that is "on", with running daemons, reachable via WiFi and a mosh SSH session. There are desktop x86 machines that don't consume that little in S3. For x86 that is very impressive, and there's basically no reason to turn this computer off when it's connected to AC. Since you get modern video decoding, a nice screen, fanless operation and low power consumption, it's a perfect "always-on" terminal/housekeeping computer which you could even stream other desktops in your house to.
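If you want to replicate the measurement: the package counters (Pkg%pc10 and friends) really need turbostat or raw MSR reads, but a rough per-core picture comes straight out of the cpuidle sysfs interface. A hedged Python sketch, assuming the intel_idle driver is naming the states (C1, C6, C8, C10, ...):

# Rough sketch: per-CPU idle-state residency from the cpuidle sysfs interface.
# This shows how much time each core *requested* in deep states; package-level
# Pkg-C10 residency needs turbostat or the MSRs directly.
from pathlib import Path
from collections import defaultdict

totals = defaultdict(int)  # idle-state name -> total residency in microseconds

for state in Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpuidle/state*"):
    name = (state / "name").read_text().strip()
    usec = int((state / "time").read_text())
    totals[name] += usec

for name, usec in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:>8}: {usec / 1e6:12.1f} s summed over all CPUs")

It won't prove the package actually reached C10 (all cores requesting a deep state is necessary but not sufficient), but it's a quick way to spot whether something is keeping the cores awake.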
 
Last edited:
A few short thoughts - the famed Thinkpad build quality is really not what it once was
That's been my experience too. Had an x230 which was old when I got it, and it was an absolute trooper. Got a new T495s a while back and it's been plagued with issues. One of the most annoying is the keyboard not working properly right out of the factory (allegedly one of the things Thinkpads are still supposed to be good at). The Enter and Backspace keys intermittently refused to work, and it seems I'm not the only one, because a lot of people online have the same issue, and the only response from Lenovo seems to be to send their underpaid pajeets into the support forum to tell people to reinstall their drivers or some shit. Thanks for nothing.

My experience of their recent machines has massively soured me on the brand, and I don't think I'll be buying another Lenovo machine in the future.
 
My experience of their recent machines has massively soured me on the brand, and I don't think I'll be buying another Lenovo machine in the future.
A couple generations ago, Thinkpads had a fatal flaw where the Thunderbolt chip would kill itself and permanently brick your USB ports until you replaced the motherboard. It could be prevented by a firmware update, sure, but that was my tipping point where I swore off Thinkpads forever. Oh, how the mighty have fallen.
 
Last edited:
Thanks. You both know a lot more about this than I do.

I feel like this situation must change, because there are so many AMD cards now, and they are so capable in every other way (except ray tracing, in a relative sense), that I feel support for AI/ML work must grow. Ditto 3D rendering software. I'm therefore tempted to gamble on the 7900, as I know it is "good enough" for now and I think it will become more level with Nvidia over time. Though I've been wrong before.

Off to read some reviews and try to ferret out some pricing / availability information.
Keep in mind that when I talk about stuff, my main reference point is use cases that burn more core-hours in a day than you will in a year. That's a context where making the wrong choice in hardware results in millions of dollars wasted.

The reality that I am starting to learn, primarily through this thread, is that for consumer workloads, virtually anything you could think to buy is fine. Home PCs are ridiculous now.
CAD software companies probably drive a lot of Quadro adoption as well. There's one big player I know of where, if you call them for support, their first question is what graphics card you're using. If you say GeForce or Radeon, they immediately hang up on you.

That never stopped the engineering shop I used to work at many years ago; you just had to know how to fix your own problems.

Nicest thing about the Quadros is the memory. CAD models can have asinine amounts of geometry, and the renderers are generally written by retards who would never be able to hack it in the game industry (lots of sub-pixel polygons with no LOD, hand-rolled effects on the CPU that DX9+ can handle with two fucking lines of code, etc.), so put it all together and you need the biggest video card money can buy to get your work done.
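For a sense of scale, some quick napkin math in Python, with made-up but plausible numbers (100 million triangles, a 32-byte vertex, no instancing and no vertex reuse), shows why a big assembly rendered naively blows straight past a gaming card's VRAM:

# Back-of-the-envelope VRAM estimate for a naive CAD tessellation.
# All numbers here are illustrative assumptions, not measurements.
triangles     = 100_000_000   # a large, finely tessellated assembly
verts_per_tri = 3             # worst case: no vertex reuse at all
vertex_bytes  = 32            # position + normal + a bit of padding (assumed)
index_bytes   = 3 * 4         # three 32-bit indices per triangle

vertex_buffer = triangles * verts_per_tri * vertex_bytes
index_buffer  = triangles * index_bytes

gib = 1024 ** 3
print(f"vertex buffer: {vertex_buffer / gib:5.1f} GiB")   # ~8.9 GiB
print(f"index buffer:  {index_buffer / gib:5.1f} GiB")    # ~1.1 GiB
print(f"total:         {(vertex_buffer + index_buffer) / gib:5.1f} GiB")

Even generous vertex reuse only claws back part of that, which is why the big-memory pro cards exist.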

The problem with raytracing is twofold: one, the devs have to actually care to implement it, and two, you're wasting resources on fancy lights that only a handful of rich people (and consoles, but eh) can use. So there's no point whining about muh raytracing when, in a year, only AAA games give enough of a crap to implement it; even Linus claimed it was a non-issue, because even he doesn't use or particularly care about it.

AMD should just be ironing out those drivers and completing their AI thingy implementation, just to shit in Nvidia's bed even more. There are theories floating around that Nvidia will just cut the VRAM of the 4080, ship it as the 4070, and call it a day.

The biggest problem with raytracing is that rasterization engines have gotten so feature-rich that the qualitative difference just isn't that big a deal. Back in the 00s, when all a rasterizer could handle was some Gouraud shading, gamma-space lighting, some low-res shadow maps, and maybe an emboss map here and there, raytraced graphics looked mind-blowing. Now, the difference is more like, "Oh nice, the reflections are perspective-correct."
 
A couple generations ago, Thinkpads had a fatal flaw where the Thunderbolt chip would kill itself and permanently brick your USB ports until you replaced the motherboard. It could be prevented by a firmware update, sure, but that was my tipping point where I swore off Thinkpads forever. Oh, how the mighty have fallen.
Customers sometimes send me laptops to do work on. Had a Thinkpad last year that just stopped doing HDMI over the USB-C/TB port; nothing seemed to fix it. Normal USB-A worked fine. Same customer, newer generation of Thinkpad: it just stopped charging over one of the USB-C ports, but the other one works fine. Garbage.
 
  • Like
Reactions: PC LOAD LETTER
The biggest problem with raytracing is that rasterization engines have gotten so feature-rich that the qualitative difference just isn't that big a deal. Back in the 00s, when all a rasterizer could handle was some Gouraud shading, gamma-space lighting, some low-res shadow maps, and maybe an emboss map here and there, raytraced graphics looked mind-blowing. Now, the difference is more like, "Oh nice, the reflections are perspective-correct."
This is the answer, or rather, one of the answers, among many.

I'll also add that what gets called ray tracing today is like a 5 year old making stick drawings and calling that a painting. I mean sure, the kid may legitimately have potential, and paint on paper is technically a painting, but we're really lowering the bar there.
 
  • Like
Reactions: Cpt. Stud Beefpile
This is the answer, or rather, one of the answers, among many.

I'll also add that what gets called ray tracing today is like a 5 year old making stick drawings and calling that a painting. I mean sure, the kid may legitimately have potential, and paint on paper is technically a painting, but we're really lowering the bar there.
Years ago, I saw some forum posts by a game dev opining that raytracing was not all that, and never would be, because no matter how nice your raytracing is, you can get a lot more visual quality per clock cycle via rasterization.

The Intel Core i3-N305 should be enough for any non-gamer, non-professional. Maybe even the Intel N95 coofputer.

Given that a lot of games are still being made for 10-year-old hardware, it might even be capable of that.

1671225081278.png

I know that my i9-12900, which is 25% slower than the i9-12900K but uses only about half the energy, never goes above about 15% utilization during any game. Makes me wonder if it's really necessary to be blasting as much power through chips as they are these days. Since the PS4/XB1 forced everyone to learn how to write multithreaded code, I wonder if 16 E-cores couldn't be a compelling gaming CPU.
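If anyone wants to sanity-check that kind of number on their own machine, a few lines of Python with the third-party psutil package (pip install psutil) will log per-core load while a game runs; the sample count and interval here are arbitrary:

# Quick-and-dirty logger: per-core CPU utilization while a game is running.
import psutil

SAMPLES  = 60     # arbitrary: about a minute of logging
INTERVAL = 1.0    # seconds between samples

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=INTERVAL, percpu=True)
    overall = sum(per_core) / len(per_core)
    print(f"overall {overall:5.1f}%   busiest core {max(per_core):5.1f}%")

The busiest-core column is the interesting one: a game can be bottlenecked on a single render thread while the overall average still sits in the teens.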
 
Years ago, I saw some forum posts by a game dev opining that raytracing was not all that, and never would be, because no matter how nice your raytracing is, you can get a lot more visual quality per clock cycle via rasterization.



Given that a lot of games are still being made for 10-year-old hardware, it might even be capable of that.

View attachment 4089546

I know that my i9-12900, which is 25% slower than the i9-12900K but uses only about half the energy, never goes above about 15% utilization during any game. Makes me wonder if it's really necessary to be blasting as much power through chips as they are these days. Since the PS4/XB1 forced everyone to learn how to write multithreaded code, I wonder if 16 E-cores couldn't be a compelling gaming CPU.
Rather than the CPU performance of the N305, which should exceed some of the older big quad-cores, I'm thinking of the 32 execution unit Gen 12 iGPU. It should be identical to UHD 770 in your 12900, except it could be clocked lower.


Many newer games will not hit 30 FPS at 720p, but yeah, older or less demanding games could run fine.

Actually... you have everything you need to simulate the N305 in your i9-12900. Just disable all the P-cores, set new clock speed limits for the E-cores and iGPU once those are known, and use only the iGPU.
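On Linux that experiment is just a handful of sysfs writes. A rough sketch, assuming the usual Alder Lake numbering where logical CPUs 0-15 are the P-core threads and 16-23 are the E-cores (check lscpu --extended first, since the layout isn't guaranteed); must be run as root, and the frequency cap is a placeholder:

# Sketch: approximate an E-core-only chip on an i9-12900 under Linux.
# ASSUMED layout: logical CPUs 0-15 = P-core threads, 16-23 = E-cores.
from pathlib import Path

CPU = Path("/sys/devices/system/cpu")

P_THREADS = range(1, 16)      # cpu0 usually can't be taken offline
E_CORES   = range(16, 24)
E_MAX_KHZ = 3_800_000         # placeholder; set to the N305's real limit once known

for n in P_THREADS:           # take the P-core threads offline
    (CPU / f"cpu{n}" / "online").write_text("0")

for n in E_CORES:             # clamp the E-cores' max frequency via cpufreq
    (CPU / f"cpu{n}" / "cpufreq" / "scaling_max_freq").write_text(str(E_MAX_KHZ))

# The iGPU clock is a separate knob (i915 exposes gt_max_freq_mhz under
# /sys/class/drm/card0/) and isn't touched here.

Reverting is just writing "1" back to the online files, or rebooting.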
 
Last edited:
A 12th-gen i3 can get you amazingly far in gaming unless you're into really intensive stuff. My decade-old CPU can handle Elite: Dangerous; it's the GPU and DDR3 that are in dire need of an upgrade.
You may have seen the benchmarks posted earlier: DDR5-4800 provided no better than a 5% improvement over DDR4-2133. Games simply aren't memory-bound. It's like nobody's really figured out anything new to do with games in 10 years other than rendering at 4K.
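To put numbers on why even slow RAM is fine: peak theoretical bandwidth for a dual-channel setup is just transfer rate × 8 bytes per transfer × 2 channels, which a couple of lines of Python make obvious:

# Peak theoretical bandwidth for a dual-channel, 64-bit-per-channel memory setup.
def peak_gbps(mt_per_s: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    # transfers/second * bytes per transfer * channel count, in GB/s
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for label, rate in [("DDR4-2133", 2133), ("DDR4-3200", 3200), ("DDR5-4800", 4800)]:
    print(f"{label}: {peak_gbps(rate):5.1f} GB/s peak")

That's roughly 34 GB/s versus 77 GB/s: more than double the bandwidth on paper, and the gaming benchmarks barely move.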

Rather than the CPU performance of the N305, which should exceed some of the older big quad-cores, I'm thinking of the 32 execution unit Gen 12 iGPU. It should be identical to UHD 770 in your 12900, except it could be clocked lower.


Many newer games will not hit 30 FPS at 720p, but yeah, older or less demanding games could run fine.

Actually... you have everything you need to simulate the N305 in your i9-12900. Just disable all the P-cores, set new clock speed limits for the E-cores and iGPU once those are known, and use only the iGPU.

iGPU still seems to have little purpose beyond ancient games, but what I'm really intrigued by is how capable notebook GPUs like the 3050 are. I just got a gaming laptop with one of those, and it's running Marauders and Deep Rock Galactic at 1080p just fine. I turned some graphics down to keep the fans from going crazy, but frankly, graphics have gotten so good that I can't really tell the difference.
 
Last edited:
  • Informative
Reactions: Brain Problems
iGPU still seems to have little purpose beyond ancient games, but what I'm really intrigued by is how capable notebook GPUs like the 3050 are. I just got a gaming laptop with one of those, and it's running Marauders and Deep Rock Galactic at 1080p just fine. I turned some graphics down to keep the fans from going crazy, but frankly, graphics have gotten so good that I can't really tell the difference.
I wonder if there will ever come a time when one of the CPU chiplets gets replaced by a GPU die. I can imagine some problems would come with that, not the least of which would be power consumption and heat buildup, but maybe I can finally justify my massive Noctua. And of course there's cost, and a sort of "lock-in", though you should still be able to upgrade.
 
  • Like
Reactions: The Ghost of Kviv
iGPU still seems to have little purpose beyond ancient games, but what I'm really intrigued by is how capable notebook GPUs like the 3050 are. I just got a gaming laptop with one of those, and it's running Marauders and Deep Rock Galactic at 1080p just fine. I turned some graphics down to keep the fans from going crazy, but frankly, graphics have gotten so good that I can't really tell the difference.
I wonder if there will ever come a time when one of the CPU chiplets gets replaced by a GPU die. I can imagine some problems would come with that, not the least of which would be power consumption and heat buildup, but maybe I can finally justify my massive Noctua. And of course there's cost, and a sort of "lock-in", though you should still be able to upgrade.
AMD has been trying this ever since hUMA in 2014. It hasn't been as successful as they would like, but there have been distinct gains.

The Steam Deck is a good example of an integrated solution: middling CPU, large GPU, with shared memory. And rumours are AMD wants to push in this direction with future APUs. It also helps that the Series S is so weak, along with upscaling tech like FSR bridging the gap for casual gaming. The M1 nailed it out of the gate, but since it's :wow: Apple, anything beyond Vulkan 1.0 is going to be like flying to Mars.
 