GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

just turn on these settings that may or may not work with the game you want to play

Which games don't support DLSS and can't hit 60 fps on a 4060?

Maybe not as in they prefer that, but there seems to be a growing number of people who don't care if what they're looking at is completely fake.

If you play video games, you don't care that what you're looking at is completely fake.

Isn’t the 4060 actually selling quite well?

It is. "Nobody" in this case means "nobody in the dozens of people bitching on reddit."
 
Last edited:
I'd like to know what the fuck this even means. We're packing more shit on GPUs than ever.
The 4060 is basically the next-gen 3050 die in terms of die size (and is even codenamed as such: AD107, vs. GA107 for the 3050). In spite of being half the die size of the 3060, it's about the same MSRP. In terms of realized performance it's about the same as the 3060, losing to it in some cases due to the 4060's inferior memory bus. It beats the 3060 if you can use the improved DLSS; otherwise it kind of sucks and you'd be better off getting the cheaper, older card.
 
The 4060 is basically the next-gen 3050 die in terms of die size (and is even codenamed as such: AD107, vs. GA107 for the 3050). In spite of being half the die size of the 3060, it's about the same MSRP. In terms of realized performance it's about the same as the 3060, losing to it in some cases due to the 4060's inferior memory bus. It beats the 3060 if you can use the improved DLSS; otherwise it kind of sucks and you'd be better off getting the cheaper, older card.
Yep, pretty much all launch 4000 cards (except the 4090) moved down a tier while keeping the price, using DLSS 3 to make up for it.

It was actually a big deal. Then AMD came in and released an equally meh low-midrange selection that made discounted 6800s look like quite the catch.
 
Imagine setting such a lofty goal of 1080p/60fps for a $300 card.

Well, you were the one whining that DLSS can't be relied on since not all games support it, so I was just wondering which games you have in mind that (a) don't support DLSS and (b) can't run well on a 4060 at 1080p.

It was actually a big deal.

No, it wasn't. Shrieking from the YouTube soyface community did not translate into lost sales. Much like the soyface community's outrage over the existence of DLSS didn't stop the vast majority of gamers from using it, their shrieking about the bus width on NVIDIA's entry-level card also isn't stopping the 4000 series from once again crushing the 7000 series in the marketplace. Turns out 288 GB/s is plenty for 1080p (quick bandwidth math below). And then, after all that begging for AMD to save them...the 7600 also has 288 GB/s.

When senior engineers at competing companies come to the same conclusion, maybe it's them and not you who are right.
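For anyone wondering where a number like 288 GB/s comes from, it's just bus width times per-pin data rate; a minimal sketch, with the bus widths and GDDR6 speeds being commonly quoted specs rather than anything stated in this thread:

```python
# Peak GDDR bandwidth is bus width times per-pin data rate, converted from bits to bytes.
def gddr_bandwidth_gb_s(bus_width_bits: int, effective_gbps_per_pin: float) -> float:
    return bus_width_bits * effective_gbps_per_pin / 8

# Commonly quoted configurations (my assumption, not from this thread):
print(gddr_bandwidth_gb_s(128, 18))  # 288.0 GB/s -> the figure cited above (128-bit at 18 Gbps effective)
print(gddr_bandwidth_gb_s(192, 15))  # 360.0 GB/s -> e.g. a 192-bit bus at 15 Gbps (3060-class)
```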
 
  • Like
Reactions: WelperHelper99
Maybe someone can enlighten me.

As far as I know framegen is always one AI frame every other frame. How about more options like every three or four frames? This would help a lot with artefacts and latency.

Image upscalers have a lot of flexibility to work on a high percentage of original info. 1080p->4K looks smeary? Try 1440 or 1800p internal until you're satisfied.

You can't make that kind of compromise with framegen, it's all or nothing.
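To put rough numbers on the upscaler flexibility mentioned above, here's a minimal sketch of how much of a 4K output is actually rendered at common internal-resolution presets (the preset list is illustrative, not tied to any specific upscaler):

```python
# Fraction of a 3840x2160 (4K) output that is actually rendered at common
# internal-resolution presets. Purely illustrative numbers.
OUTPUT_PIXELS = 3840 * 2160

for name, (w, h) in {
    "1080p internal": (1920, 1080),
    "1440p internal": (2560, 1440),
    "1800p internal": (3200, 1800),
}.items():
    print(f"{name}: {w * h / OUTPUT_PIXELS:.0%} of output pixels rendered")
# 1080p internal: 25%, 1440p internal: 44%, 1800p internal: 69%
```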
 
As far as I know framegen is always one AI frame every other frame. How about more options like every three or four frames? This would help a lot with artefacts and latency.

Because then it would look like constant stutter. You'd be seeing 60 fps 75% of the time and 120 fps 25% of the time. The mistake people made with framegen early on is thinking they could use it to boost a shit frame rate to a good frame rate. What you should be doing is adjusting whatever settings you need to get to 60fps+, then FG for an extra boost. Think of it as an icing-on-the-cake feature, not a foundational feature.
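A rough way to see the pacing problem described above; this is a minimal sketch that prints display intervals assuming a steady 60 fps render rate with a generated frame splitting one out of every N real frame intervals (a simplified model, real frame-gen pacing is more involved):

```python
# Display intervals (ms) when the game renders a steady 60 fps and frame gen
# splits one out of every `n` real frame intervals with a generated frame.
# Simplified pacing model, purely illustrative.
REAL_INTERVAL_MS = 1000 / 60  # ~16.7 ms between real frames

def display_intervals(n: int, cycles: int = 2) -> list[float]:
    out = []
    for _ in range(cycles):
        out += [REAL_INTERVAL_MS] * (n - 1)  # real-to-real intervals shown as-is
        out += [REAL_INTERVAL_MS / 2] * 2    # one interval split in half by a generated frame
    return [round(x, 1) for x in out]

print(display_intervals(1))  # [8.3, 8.3, 8.3, 8.3] -> every interval split: even 120 fps pacing
print(display_intervals(4))  # [16.7, 16.7, 16.7, 8.3, 8.3, ...] -> mostly 60 fps cadence with bursts of 120
```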
 
  • Agree
Reactions: George Lucas
Maybe someone can enlighten me.

As far as I know framegen is always one AI frame every other frame. How about more options like every three or four frames? This would help a lot with artefacts and latency.

Image upscalers have a lot of flexibility to work on a high percentage of original info. 1080p->4K looks smeary? Try 1440 or 1800p internal until you're satisfied.

You can't make that kind of compromise with framegen, it's all or nothing.
Frame gen on a game running at 90fps will produce 180fps, which means if you're using a normal 120Hz monitor, you're losing out on potential frames. But because you're running the GPU at only 66% capacity, it's more likely to be able to overcome sudden spikes in load that otherwise would appear as a stutter. If you only generate an AI frame every third frame, you'd use more of the GPU's total capacity, but the closer you get to 100% utilization, the less buffer you have to overcome those spikes of load that cause the stutter.
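Spelling out the headroom arithmetic from the post above as a minimal sketch; the 90 fps capability and 120 Hz cap are the numbers from the post, while the 1-in-3 generation ratio is a hypothetical:

```python
# Rough GPU utilization when a frame cap plus frame generation means the GPU
# renders fewer real frames than it could. Illustrative arithmetic only.
def utilization(gpu_capable_fps: float, display_hz: float, displayed_per_rendered: float) -> float:
    rendered_needed = display_hz / displayed_per_rendered  # real frames per second required
    return min(1.0, rendered_needed / gpu_capable_fps)

# 2x frame gen on a 120 Hz cap: only 60 real fps needed from a GPU capable of 90
print(utilization(90, 120, 2.0))  # ~0.67 -> roughly a third of the GPU left as headroom
# Hypothetical 1-in-3 generation (3 displayed frames per 2 rendered): 80 real fps needed
print(utilization(90, 120, 1.5))  # ~0.89 -> far less headroom for load spikes
```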
no he's right. if you ask anyone if they'd be willing to pay for the GPU injecting fake frames, most would probably tell you to fuck off.
I did pay for it. Mostly I got the 4090 because I wanted the high performance AI stuff, but I'd be lying if I said DLSS3 wasn't a consideration. FSR has gotten a lot better, but it does add some very noticeable artifacts, which DLSS mostly avoids. As it is, I can basically always tell if FSR is on, but DLSS can go as low as 50% resolution and still produce acceptable output. That's pretty impressive, and because I'm not necessarily going to run the GPU at full power all the time, I get less stutter and heat. Those are genuine selling points.
Do frame-gen created frames contain artifacts? Absolutely. Can you notice in a game running at 120fps? Not even slightly. It's the same with upscaling. Does a game running at 1080p upscaled to 4K look perfect? Of course not, and it never will. But it does look good enough that, when things are actually moving around, you'd be hard pressed to find the artifacts. What I've found it does consistently struggle with is moving water: all the waterfalls in BG3, for instance, look kind of garbage if you examine them up close. But I think that's in large part just because the tech is still relatively new. Upscaling is still basically just tacked on to a game after the fact, because most players are still using GTX 1080s and playing at 1080p anyway, so it's barely needed. It'll continue improving, especially considering just how much money it saves the manufacturer. Consumer GPUs are a waste of silicon when you look at the margins for datacentre stuff.
 
no he's right. if you ask anyone if they'd be willing to pay for the GPU injecting fake frames, most would probably tell you to fuck off.

You don't have to pay extra for DLSS3, any more than you have to pay extra for your GPU to paint fake geometry on flat polygons.

Consumer GPUs are a waste of silicon when you look at the margins for datacentre stuff.

Which is why AMD isn't saving anybody. Everyone hoping that with the 7000 series, AMD would sell you twice as much silicon for half as much money was sorely disappointed when they ended up subject to the exact same economics as NVIDIA.
 
Last edited:
Rumor shit, grain o' salt


Zen 5C = 16 cores on 1 unified core complex (CCX), might have to do with "ladder L3 cache"
Zen 5D variant?
Zen 6 will have 8/16/32 core CCDs
Possibly 256 cores (8 * 32-core) for a Zen 6 Epyc variant, up from 192 Zen 5C cores

An 8-core Zen 6 CCD seems like an excuse to keep on selling 6-cores at the low-end. As for the 32-core CCD, maybe L3 cache is removed entirely and stacked over or under it, allowing for more cores on a relatively small chiplet.

AMD seems to be making things more complex with more products, and multiple types of cores and CCDs beyond just two (8-core Zen 4 and 16-core Zen 4C). For example, the Strix Halo mega APU is supposedly using Zen 5 chiplets, but a different chiplet than the desktop CPUs. What and why?

In other news, AMD "Sound Wave" may be an APU using ARM-based cores instead of x86/Zen 6 as previously rumored.
 
Last edited:

In-depth look at the M4 from Geekerwan. In Chinese, but with English subtitles.

tl;dr, there are some architectural changes, but the CPU cores remain mostly the same, just clocked slightly higher.

They bumped up the CPU/GPU core counts and put it in the iPad Pro, giving that device AV1 decode for the first time. They may have made the NPU faster, or may just be playing a trick with INT8/INT16 (quoting INT8 throughput doubles the headline TOPS figure versus INT16 on the same hardware).

iPad with 8 GB RAM apparently has 12 GB:
M4 iPad Pros with 8GB of RAM may actually have 12GB — teardowns reveal possible Apple hijinks
Apple's goal may be to avoid the unfamiliar "12GB RAM" spec which may alienate some consumers; Cupertino is likely looking to avoid any possible drop in sales after a bad year for iPad in 2023.
lol.
 
I'm assuming it's binned 12GB chips where the other 4GB was disabled for being broken or unstable.
 
I have all my PC parts now. i9 12700k, ASRock motherboard, Thermaltake 1000W power supply, MSI 4070 Super, 3 M.2 drives totaling 4 TB, 1 500GB SSD boot drive, 64GB DDR4-3200 Crucial RAM, a Hitachi DVD drive, an AIO water cooler, and a Cougar case with USB-C ports. Have some monitors and a keyboard scavenged from a thrift store. OS is Windows 11. My reaction:
Assembly starts today. My body is ready.
 
Last edited:
I have all my PC parts now. i9 12700k, ASRock motherboard, Thermaltake 1000W power supply, MSI 4070 Super, 3 M.2 drives totaling 4 TB, 1 500GB SSD boot drive, 64GB DDR4-3200 Crucial RAM, a Hitachi DVD drive, an AIO water cooler, and a Cougar case with USB-C ports. Have some monitors and a keyboard scavenged from a thrift store. OS is Windows 11. My reaction:
Assembly starts today. My body is ready.
Use an M.2 as the boot drive, not the SATA SSD (I'm assuming that's what you meant when you listed them separately). The boot drive is the one where speed and latency matter most.
 
  • Agree
Reactions: SargonF00t
Use an M.2 as the boot drive, not the SATA SSD (I'm assuming that's what you meant when you listed them separately). The boot drive is the one where speed and latency matter most.
The SATA drive is a WD Red. It's legitimately tougher than the M.2s, and it's fast enough. I've already tested this setup on a rig I built for my friend; it works great.
 
  • Lunacy
Reactions: SargonF00t