GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

before I make a mistake, can anyone tell me if I'm stupid for going for a 5600g instead of a 5600x? Reason why is because I want something that can post in case my GPU fails and I don't have to wait days before using my computer again
You should be fine.

I don't know for sure (someone who knows better should correct me) since I've never had an APU, but I think you sacrifice a little performance for onboard graphics. Unless you're doing major CPU-heavy workloads all of the time, you shouldn't really notice the difference.

The main problem with the 5600G is that it uses Vega graphics, i.e. you're looking at low-end 1080p gaming at best. ETAPrime uses them a lot in his small form factor builds, and they often show up in mini PCs.

Provided you know what you're getting into, it shouldn't be a problem. Unless there are driver problems or something else I don't know about.
 
  • Like
Reactions: Smaug's Smokey Hole
Sold my Z170 board, RAM, and i7-6700K, and built a PC for the buyer. Gave them a ton of parts I needed to get rid of as well, and also got rid of a spare PSU and a Corsair 120HD RGB lighting system.

My i5-12600K will be up and running later. Picked up 32 GB of 3600 MHz CL16 DDR4 and an MSI Z690 PRO-A. Couldn't justify spending $100 more for 2 extra P-cores with little actual performance gain when gaming.

I will probably end up delidding the 12600K soon and installing a custom retention socket. I am not really interested in direct die cooling; most of the blocks are from sketchy manufacturers with poor QC. Currently flushing out the water loop and switching to Mayhems X1 Clear. Done with opaque fluids and their maintenance.
 
  • Like
Reactions: Brain Problems
before I make a mistake, can anyone tell me if I'm stupid for going for a 5600g instead of a 5600x? Reason why is because I want something that can post in case my GPU fails and I don't have to wait days before using my computer again
PCIe is nerfed to 3.0, which can be a big problem if you use some crap 4-lane GPU like the 6500 XT. Other than that it's fine.
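For a rough sense of scale, here's a back-of-the-envelope sketch; the per-lane figures are approximate post-encoding numbers, not exact spec quotes:

```python
# Approximate usable PCIe bandwidth per lane in GB/s, after encoding overhead
PER_LANE_GBS = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    return PER_LANE_GBS[gen] * lanes

# A 4-lane card like the 6500 XT loses half its link bandwidth on a PCIe 3.0 platform
print(link_bandwidth_gbs("4.0", 4))  # ~7.9 GB/s
print(link_bandwidth_gbs("3.0", 4))  # ~3.9 GB/s
```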

At least going forward we won't have this problem, since almost all Ryzens should have an iGPU on AM5.
 
before I make a mistake, can anyone tell me if I'm stupid for going for a 5600g instead of a 5600x? Reason why is because I want something that can post in case my GPU fails and I don't have to wait days before using my computer again
Hard to say without knowing more about your needs or the value of money to you. But the general answer: an integrated GPU is an advantage for the reasons you state, and even if you're not a gamer, it lets you skip a graphics card in the first place and save a lot of power. So IF a 5600G meets your processing needs, then go for it.

I will say that all of the upcoming AMD chips will have an iGPU as standard, so if this is a desirable feature for you and you're willing to wait six months, that might be the better move. You'll get the advantages of a newer chipset as well. More money, though, as you'll be springing for the newer processor and DDR5. Still worth mentioning: in the future, all consumer AMD CPUs will be APUs.

Anyway, even if you are a gamer, it's the GPU that nearly always bottlenecks your gaming, so my simplistic answer is that a 5600G should be fine unless you're going for an all-round high-end system.
 
  • Informative
Reactions: Brain Problems
We might be getting some more supply issues. Seeing some articles saying Russia is restricting the sale of some gases needed for microchip production. Don't know how accurate this is, but some articles say Russia supplies something like 30% of the materials needed for chip production. We are never getting pre-coof prices and availability again, are we? :mad:
 
The slow demise of Moore's Law has really been a boon for the longevity of desktop PCs. I used coronabux to upgrade (most of) my machine early last year, but in retrospect I barely noticed a difference in speed from my 2015 machine. A full upgrade is probably something you only need to do every 6-7 years, maybe even less often than that.

We complain a lot on these boards about various tech trends, but this has been a good one (although maybe you could argue innovation is stagnating instead). Younger KFers probably won't remember how, in the 2000-2010 era, you had to upgrade your machine every two years to keep up with what was current in gaming. I don't miss that at all.
 
I have a 4650G Pro, and the integrated GPU shares memory bandwidth with the CPU cores, which is the main reason iGPUs won't ever be quite as fast as dedicated cards with their own VRAM. Every tiny increase in RAM speed shows up immediately as increased graphics performance; they can't get enough of it, and there's no such thing as superfluous bandwidth. I have ECC RAM, and if you want to go the ECC route you need a Pro APU, because they're the only ones that support ECC with the iGPU enabled; the regular ones can do ECC in general, but not with the iGPU. ECC adds latency, for the record.
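To put rough numbers on that shared bandwidth, here's a back-of-the-envelope sketch; the DDR4-3200 speed and the GTX 1650 comparison are illustrative assumptions, not my exact setup:

```python
# Rough peak memory bandwidth: channels * bus width (bytes) * transfer rate (MT/s)
def peak_bandwidth_gbs(channels: int, bus_bits: int, mts: int) -> float:
    return channels * (bus_bits / 8) * mts / 1000  # GB/s

# Dual-channel DDR4-3200 on a desktop APU: 2 channels * 64-bit * 3200 MT/s
print(peak_bandwidth_gbs(2, 64, 3200))    # ~51.2 GB/s, shared by CPU and iGPU

# An entry-level dGPU for comparison (GTX 1650: 128-bit GDDR5 at 8000 MT/s effective)
print(peak_bandwidth_gbs(1, 128, 8000))   # ~128 GB/s, all of it for the GPU
```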

The graphics performance is... actually pretty good, better than I expected. Pretty much all low-intensity indie games and older FPS like Alien: Isolation are no problem, as are games like Stellaris. I even played Euro Truck Simulator and Elite: Dangerous on it. I do have to add the disclaimer that I used to have really power-hungry graphics cards and got into the habit of limiting games to 30 FPS to save on heat and electricity (mostly heat), so I wouldn't be too sure you could get 60 FPS out of this GPU in all cases. If you cap it at half the "usual" framerate of 60, you of course get roughly double the mileage of what reviews claim. The power consumption is also amazing for x86: gaming nominally tops out at 35-50 W, and with idle desktop work the system often doesn't even go past 20 W.

My plan is to eventually get a graphics card, dedicate the iGPU to my normal Linux desktop, use the dedicated card with passthrough for a Windows VM, and leave it off the rest of the time, hopefully without it consuming much power then. I'd like to play some of the more graphics-intensive games the iGPU simply isn't good enough for, but I'm not too sure yet whether I'll stick with it permanently, as I've happily lived without a dedicated graphics card for a while now.
 
I have a 4650G Pro, and the integrated GPU shares memory bandwidth with the CPU cores, which is the main reason iGPUs won't ever be quite as fast as dedicated cards with their own VRAM. Every tiny increase in RAM speed shows up immediately as increased graphics performance; they can't get enough of it, and there's no such thing as superfluous bandwidth.
DDR5, with its overclocking focus, is going to help push the boundaries of integrated graphics, eventually getting above 10,000 MT/s. But it would be nice to see a "super-APU" with a big L3 (3D V-Cache/Infinity Cache) or an L4 cache. They've shown no signs of wanting to do that, and L3 for the APUs has typically been much smaller (Renoir cores could only access 4 MB, vs. 16 MB for desktop Matisse). But with 3D stacking you could have your standard cheaper APUs and then a halo product that stacks more cache on top without needing to make an entirely new die.

If you want to see integrated graphics rival dedicated, just look at the Xbox Series X and PS5.
 
If you want to see integrated graphics rival dedicated, just look at the Xbox Series X and PS5.
Completely out of the loop there, but I'll take your word for it. I could very well imagine that iGPUs are the wave of the future. Practically, the one I already have is *almost* there for me. Not quite, but almost. That's pretty close.

Also, if they end up being a standard feature and not an option, there'll be a big incentive to optimize games and game engines towards them. I wouldn't be surprised if dedicated graphics cards end up being an enthusiast niche/special interest/application-scenario thing.

I've been looking at graphics cards and the GTX 1660 seems interesting to me. I don't plan on gaming a lot or making a bigger investment (especially in this market, where I have to assume that if I ever sell it again it'll be at a considerable loss) just to see whether many games are still for me. I'm mostly interested in playing stuff like RDR2, and even older stuff like Skyrim or FO4, which just doesn't do well on the Ryzen iGPU (I tried). What I like most about it is that it seems very power- and temperature-efficient, which is kind of a must with the Node 202 case. As I mentioned, I also run everything at 30 FPS. The card can be had for ~$180-$210 used with some careful shopping. For people more into the matter, is that a good idea or rather nah? Remember: not a powergamer, 30 FPS, not interested in high framerates or a lot of the more popular games. The last graphics card I owned was an R9 390, and Google tells me the 1660 is faster than that.
 
Corsair has only just released DDR5-6600 at 32-39-39-76 and it will cost an absolute mint. It'll be at least another full year before DDR5 is close to sensible prices.
Yeah, it is way too expensive right now and it will take time to mature. I was considering getting 4000 MHz or so DDR4, but decided against it. Ended up getting Corsair Vengeance RGB PRO 3600 MHz over any other brand; it integrates well with all the other iCUE-compatible stuff that I have. I'll have to see about trying to hit 3800+ and tightening the timings.

When I cleaned my whole loop 2 weeks ago I accidentally put my top rad fans on exhaust. I was wondering why temps were higher, but attributed it to ambient temperature increases. Now I'm back to my usual configuration, which is 7 total intakes and passive exhaust: one Noctua 120 on the case floor and 6x Corsair ML120s split between my two 360 mm radiators.

I'm all up and running with the 12600K. The MSI Z690 is pretty nice; I like how straightforward overclocking is. I had grown partial to the ASUS BIOS setup and features. The only BIOSes I ever hated were on ASRock and shitty Gigabyte boards. So far I haven't really fucked around with OC settings much yet, just set XMP.
 
Hard to say without knowing more about your needs or the value of money to you. But the general answer: an integrated GPU is an advantage for the reasons you state, and even if you're not a gamer, it lets you skip a graphics card in the first place and save a lot of power. So IF a 5600G meets your processing needs, then go for it.

I will say that all of the upcoming AMD chips will have an iGPU as standard, so if this is a desirable feature for you and you're willing to wait six months, that might be the better move. You'll get the advantages of a newer chipset as well. More money, though, as you'll be springing for the newer processor and DDR5. Still worth mentioning: in the future, all consumer AMD CPUs will be APUs.

Anyway, even if you are a gamer, it's the GPU that nearly always bottlenecks your gaming, so my simplistic answer is that a 5600G should be fine unless you're going for an all-round high-end system.
I was planning on going 1080p but I'm considering going to 1440p because the bottleneck will be on the GPU instead of the CPU.
For what it's worth, I'm doing a mid-range build with a 6600xt in mind.
 
If you want to see integrated graphics rival dedicated, just look at the Xbox Series X and PS5.
Oh, but look at their memory layout. That's the difference: it's laid out like a GPU's, meaning a 32-bit channel per chip, multiplied by the number of chips. On PC, dual-channel RAM (let's say 2x 8 GB sticks) means a 128-bit bus. Make it 4x 8 GB, filling all the slots, and it's still a 128-bit bus; it just doubles the amount of memory on each channel without providing a wider bus. That's what's holding APUs back, in my opinion: they have to grow with the available bandwidth, and a 128-bit memory interface is pretty anemic when that bandwidth is always contested by the CPU.
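A quick sketch of that chip-count math, with the console figures taken as ballpark assumptions (8 GDDR6 chips at 14 Gbps for a PS5-style layout) rather than exact specs:

```python
# GDDR bus width scales with the number of memory chips (32 bits per chip),
# unlike PC DIMMs where extra sticks only add capacity on the same 128-bit bus.
def gddr_bus_bits(num_chips: int) -> int:
    return num_chips * 32

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin  # GB/s

console_bus = gddr_bus_bits(8)              # 8 chips -> 256-bit
print(bandwidth_gbs(console_bus, 14))       # ~448 GB/s for the whole APU

# Desktop dual channel stays 128-bit no matter how many sticks you add
for mts in (3200, 6000, 10000):             # DDR4 today, DDR5 now, DDR5 someday
    print(bandwidth_gbs(128, mts / 1000))   # ~51, ~96, ~160 GB/s
```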

But in the future, a couple of generations from now, put a stack or two of HBM on that APU in addition to a large cache... that would change everything.
 

Makes sense this time around to launch the halo product first: give the 30-series more time to sell out, and lead in benchmarks at least until RDNA3 shows up. Last time the RTX 3080 launched just before the RTX 3090.

A 48 GB model is rumored, which could be good for non-gaming workloads. AMD might want to launch a 32 GB version of the 7900/7950 XT.
 

Makes sense this time around to launch the halo product first: give the 30-series more time to sell out, and lead in benchmarks at least until RDNA3 shows up. Last time the RTX 3080 launched just before the RTX 3090.

A 48 GB model is rumored, which could be good for non-gaming workloads. AMD might want to launch a 32 GB version of the 7900/7950 XT.
It also allows them to bin and build up enough chips for the wider launch of more normal cards. And stockpile some memory if that's required.

A 48 GB 4090 seems like a prosumer product, and that makes sense in a way. It would likely mean 4 GB GDDR6(X?) chips, though, and are those even available at this point?
 
It also allows them to bin and build up enough chips for the wider launch of more normal cards. And stockpile some memory if that's required.

A 48 GB 4090 seems like a prosumer product, and that makes sense in a way. It would likely mean 4 GB GDDR6(X?) chips, though, and are those even available at this point?
The 3090/Ti definitely occupies the prosumer "Titan" space. Maybe they won't make another Titan, or maybe they'll use the name again when things get stale.

The 3090 used 24x 1 GB GDDR6X chips placed on both sides of the card, and the 3090 Ti used 12x 2 GB GDDR6X chips on one side. I assume they would use 24x 2 GB for this hypothetical 4090 variant.
 
Holy moly, that's a lot of VRAM. What's that much even for?
Almost nothing related to gaming, outside of 8K in some titles or something tweaked or modded to use a lot. Here's some people trying to come up with uses for the 3090's 24 GB:


It's more useful for stuff like CAD, machine learning, and deepfakes than gaming. In some cases you could tweak your settings to easily use up all of the VRAM (Ctrl+F for VRAM on this MrDeepFakes guide). The professional and content creation angle leads to GPUs like this being labeled "prosumer". It's not a workstation Quadro GPU, but it's encroaching on the same territory.

It's likely that game engines will eventually be able to take advantage of 24-48 GB. Just detect that it's available and use it, even in an inefficient way that caches more stuff and boosts performance a little. But the same engines will still have to scale down to much weaker hardware with less (or zero) VRAM.
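Something like this hypothetical sketch is all the detection side would take; it assumes an Nvidia card with nvidia-smi on the path, and the 2 GB reserve number is made up for illustration (real engines query the graphics API directly):

```python
import subprocess

def total_vram_mib() -> int:
    # Ask the driver how much VRAM the first GPU has (requires nvidia-smi)
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0])

def asset_cache_budget_mib(vram_mib: int, reserve_mib: int = 2048) -> int:
    # Keep a reserve for render targets and the OS, spend the rest caching assets
    return max(0, vram_mib - reserve_mib)

vram = total_vram_mib()
print(f"{vram} MiB of VRAM -> cache budget of {asset_cache_budget_mib(vram)} MiB")
```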
 