GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

hey friendos, im looking to upgrade my computer as my current one cant handle elden ring

but i feel i got hosed with a prebuilt i got ages ago.

anyone able to point me in the right direction to either build one or some prebuilders who arent scammers (im in australia)

cheers mates
Since you've been out of the game for a while, be aware that Intel is currently a Bad Idea™ and you should steer clear of them. Their current desktop CPUs have issues with catastrophic degradation and their support has not been good.

If you're mostly interested in vidya, get a b650 board with a 7800X3D for the CPU. The 7800X3D is the current future-proof gaming champion and it's pretty affordable. For GPU, that's mostly a matter of how much you're willing to suffer but I wouldn't go for the current low-end cards (7600 XT and RTX 4060). You're better off just buying used midrange from last-gen if you're aiming to spend less than 500 USD on a GPU.
 
anyone able to point me in the right direction to either build one or some prebuilders who arent scammers (im in australia)
OzBargain has a section for pre-builts https://www.ozbargain.com.au/tag/desktop-computer
Generally any that are trash get shit on in the comments. Avoid systems from large companies that use proprietary parts (e.g. Dell and HP, probably Lenovo too) so it's easier to upgrade down the track.

Deals on parts seem pretty limited. PC Part Picker works ok but they grab prices from a lot of little shops whose shipping costs ruin the deal so definitely curate the list. staticICE also still works great if you're searching for the lowest price for any given item.
 
OzBargain has a section for pre-builts https://www.ozbargain.com.au/tag/desktop-computer
Generally any that are trash get shit on in the comments. Avoid systems from large companies that use proprietary parts (e.g. Dell and HP, probably Lenovo too) so it's easier to upgrade down the track.

Deals on parts seem pretty limited. PC Part Picker works ok but they grab prices from a lot of little shops whose shipping costs ruin the deal so definitely curate the list. staticICE also still works great if you're searching for the lowest price for any given item.
go to profile - account - merchants and pricing and uncheck the companies you don't want. you can even set a tax rate for each one if that's useful to you
 
probably Lenovo too
Lenovo's Legion Towers use standard form-factor parts and have surprisingly good airflow for a big-name OEM. The only real problem I have with them is the BIOS, which simplifies everything into three settings (eco, balanced, performance), but that's fine if you're not tinkering. Their ThinkStation line is where they put all their cursed nonstandard parts.
 
for regular usage and vidya is it even worth going AM5? I'm rocking a 5600x and a 3080FE, I've got my eye on a 5800x3d but i'd rather not upgrade to AM5 if there isn't a major benefit.
 
for regular usage and vidya is it even worth going AM5? I'm rocking a 5600x and a 3080FE, I've got my eye on a 5800x3d but i'd rather not upgrade to AM5 if there isn't a major benefit.
I guess it would sort of depend on a bunch of different factors, but the 5800x3d is only $30 cheaper than the 7800x3d. You can get a decent AM5 board for like $150. Also, the 9000 series x3d chips will release later this year if you want to save and wait.
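
Rough napkin math on the upgrade cost (the CPU price gap and board price are the figures above; the DDR5 kit price is just my own assumption, since DDR4 from an AM4 build won't carry over):

Python:
# Rough delta for going AM5 vs dropping a 5800X3D into the existing AM4 board.
# The CPU price gap and board price come from the post above; the DDR5 kit
# price is an assumed figure, since DDR4 from an AM4 build can't be reused.
cpu_price_gap = 30    # 7800X3D is roughly $30 more than the 5800X3D
am5_board = 150       # decent B650 board
ddr5_kit = 100        # assumed price for a 32 GB DDR5 kit

extra_cost_for_am5 = cpu_price_gap + am5_board + ddr5_kit
print(f"AM5 path costs roughly ${extra_cost_for_am5} more than a drop-in 5800X3D")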
 
I'm not sure who would kick 13th/14th gen off the charts completely for the upcoming reviews. Maybe Gamers Nexus? Would be funny.
Turns out the winner of this was a complete dark horse: German overclocker der8auer kicked them off the charts completely, arguing that they might see major performance drops once the microcode update is out. He's a more important reviewer than one might imagine; he's the guy behind Thermal Grizzly and seems to have a fundamental understanding of microarchitecture design, unlike most other reviewers.
 
I've complained before that the 12 GB of RAM in the Radeon 6700 XT never seems to be useful. A recent example is No Man's Sky. With everything except textures on Ultra, it stays under 8 GB and runs at 90 fps. Bump up the textures to ultra, and now it's consuming 11.5 GB and running at 30-40 fps inside the hangar of my capital ship.
It's GDDR6 and not GDDR6X, right? You're going to have a drop in memory speed compared to something like a 4070/4070 Super, which also has 12 GB of VRAM but in the GDDR6X format. Related question: how much system RAM do you have? That can affect your textures as well.
 
It's GDDR6 and not GDDR6X, right? You're going to have a drop in memory speed compared to something like a 4070/4070 Super, which also has 12 GB of VRAM but in the GDDR6X format. Related question: how much system RAM do you have? That can affect your textures as well.

The context here is that AMD cards were praised for generally having more memory than NVIDIA cards at a better price. At the time, I think this card was around $400. The same-generation counterpart to the 6700 XT is the 3070 Ti, not the 4070.

6700 XT: 384 GB/s
3060 Ti: 448 GB/s
3070 Ti: 608.3 GB/s

So the 6700 XT's main memory isn't even as fast as the 3060 Ti's. I have found pretty consistently that any game I have which is capable of using more than 8 GB of VRAM runs like shit on this card. Fiddling with settings suggests that indeed, it really is a bandwidth problem. The 12 GB of VRAM is a pretty useless feature. Games built for this generation of hardware are designed to fit inside 8 GB, and if they can be bumped up to use more resources, this card's memory bandwidth can't handle it.
 
The context here is that AMD cards were praised for generally having more memory than NVIDIA cards at a better price. At the time, I think this card was around $400. The same-generation counterpart to the 6700 XT is the 3070 Ti, not the 4070.

6700 XT: 384 GB/s
3060 Ti: 448 GB/s
3070 Ti: 608.3 GB/s

So the 6700 XT's main memory isn't even as fast as the 3060 Ti's. I have found pretty consistently that any game I have which is capable of using more than 8 GB of VRAM runs like shit on this card. Fiddling with settings suggests that indeed, it really is a bandwidth problem. The 12 GB of VRAM is a pretty useless feature. Games built for this generation of hardware are designed to fit inside 8 GB, and if they can be bumped up to use more resources, this card's memory bandwidth can't handle it.
I could only recommend upgrading to a 4070 series card to keep parity with what you have, man. I've been able to play literally anything with my 4070 Super. That bandwidth is atrocious. It explains why you're getting throttled to death.
 
I'm just posting information, not asking for purchasing advice. I'm very aware that if I spend over $500, I can look at nicer textures.
Fair enough lol. But seriously that sucks. I didn't know that GDDR6X and GDDR6 had so big a gap.
 
First Intel fucks up and now AMD:

The 7800X3D must be a blessing AND a curse. Practically the only good AM5 CPU (good luck finding replacements should your PC age). Should have stuck with AM4.

Nah man... Some of these "tech" reviewers are just going for clickbait bullshit.

Is the Zen 5 vs Zen 4 jump as impressive as Zen 1 vs Bulldozer? Nope.

But it delivers solid performance improvements of 10-15% or more while using A LOT less power. In some cases 40% less power.

The new Ryzen basically delivers performance as good as or better than their previous CPUs did, but with a TDP of 65W instead of 105W.

It's hella impressive, even if some of you nerds wanted them to focus more on performance rather than performance per W. I reckon it's because of the server market, where Threadripper already has a 20%+ install base.

(Which I reckon will grow very fast now. If you're running a datacenter, the price of a CPU, or even of thousands of CPUs, is marginal compared to what you'll pay to power them 24 hours a day for 3 or 5 years. Even a 10% reduction is big money, let alone 30% or 40%.)
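
To put rough numbers on the power argument (a back-of-the-envelope sketch; the wattages, electricity price and fleet size are assumptions for illustration, not real datacenter figures):

Python:
# Back-of-the-envelope: what a ~40% power saving is worth over a server CPU's life.
# Every input here is an assumption for illustration, not a measured figure.
cpu_watts_old = 280      # assumed per-CPU draw under load, previous generation
cpu_watts_new = 170      # assumed draw after a ~40% efficiency improvement
price_per_kwh = 0.12     # assumed electricity price in USD
years = 5
hours = years * 365 * 24

def lifetime_energy_cost(watts):
    """Energy cost in USD for one CPU running 24/7 over the whole period."""
    return watts / 1000 * hours * price_per_kwh

saving_per_cpu = lifetime_energy_cost(cpu_watts_old) - lifetime_energy_cost(cpu_watts_new)
print(f"Saving per CPU over {years} years: ${saving_per_cpu:,.0f}")
print(f"Across a fleet of 1,000 CPUs: ${saving_per_cpu * 1000:,.0f}")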
 
Fair enough lol. But seriously that sucks. I didn't know that GDDR6X and GDDR6 had so big a gap.

It's more than that. Each pin in the interface is capable of a certain amount of bandwidth. How many pins you have depends on how big you build the memory interface on the processor. More silicon used for the memory interface means less silicon used for processing units (it also means a more expensive PCB), so it's always a judgment call as to how to balance those demands. AMD seems to have made the wrong call with the 6700 XT.

Card    | Interfaces | Bus width (ifaces x 32b) | Bandwidth/pin | Total bandwidth ((bus width * bw/pin) / 8 b per B)
6600 XT | 4          | 128b                     | 16 Gb/s       | 256 GB/s
6700 XT | 6          | 192b                     | 16 Gb/s       | 384 GB/s
3060 Ti | 8          | 256b                     | 14 Gb/s       | 448 GB/s
3070 Ti | 8          | 256b                     | 19 Gb/s       | 608 GB/s
4060    | 4          | 128b                     | 17 Gb/s       | 272 GB/s
4070    | 6          | 192b                     | 21 Gb/s       | 504 GB/s
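
If you want to sanity-check those numbers yourself, the arithmetic is just bus width times per-pin rate divided by 8; a minimal sketch (the card list mirrors the table above):

Python:
# GPU memory bandwidth = bus width (bits) * per-pin data rate (Gb/s) / 8 bits per byte,
# where bus width = number of 32-bit memory interfaces * 32.
cards = {
    # name: (32-bit interfaces, per-pin rate in Gb/s)
    "6600 XT": (4, 16),
    "6700 XT": (6, 16),
    "3060 Ti": (8, 14),
    "3070 Ti": (8, 19),
    "4060": (4, 17),
    "4070": (6, 21),
}

for name, (interfaces, gbps_per_pin) in cards.items():
    bus_width_bits = interfaces * 32
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8
    print(f"{name:8s} {bus_width_bits:3d}-bit @ {gbps_per_pin} Gb/s -> {bandwidth_gbs:.0f} GB/s")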

What I have found pretty consistently with this card:
  • Struggles at 4K
  • FSR looks like shit when it's at a setting that makes a difference
  • Struggles when textures exceed 8 GB
It does best at 1440p with total VRAM usage kept under 8 GB, which is just disappointing compared to how it was advertised.
 
It's hella impressive, even if some of you nerds wanted them to focus more on performance rather than performance per W. I reckon it's because of the server market, where Threadripper already has a 20%+ install base.
The server version is EPYC, Threadripper is just scaled down EPYC for workstations/gaming PCs for idiots.

Anyway, you can get a huge performance boost in Zen 5 from setting PBO to unlimited. The extra efficiency pays off for performance seekers too, in the form of better overclocking results (though really I wouldn't call it overclocking, just raising the factory limits; it's all automatic and, according to Wendell, doesn't result in any particular instability). In this age, frankly, efficiency is going to be the main way to boost performance. We can see this from 13th/14th gen vs Zen 4. AMD have kept to a fairly efficient path, and their CPUs perform wonderfully and can sip power (a 7950X limited to 105W only loses 5% performance compared to the default 170W; limited to 65W it loses 20%). Intel keep pushing more power into their chips, and the result is they quite literally burn out, while also not performing notably better than the competition.

The reason is simple: AMD are much smaller than Intel. Intel can afford to keep separate server/PC lines, while AMD can only really afford to do one or the other. So AMD made a terrific server chip and implemented it as chiplets. You can get a highly efficient server by putting half a dozen chiplets on one processor, or you can get a quite performant and still fairly efficient PC by putting only one or two chiplets on the processor, and pumping a bit more power in instead so they can reach higher clocks.
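
To illustrate the efficiency point with the 7950X figures above (a rough sketch: performance is normalized to the 170W default and the loss percentages are the ones quoted, not fresh benchmarks):

Python:
# Rough performance-per-watt comparison for the 7950X power limits quoted above.
# Performance is normalized to 1.0 at the stock 170W limit; the loss percentages
# are the figures from the post, not new measurements.
limits = {
    170: 1.00,   # stock power limit
    105: 0.95,   # quoted as losing ~5%
    65: 0.80,    # quoted as losing ~20%
}

stock_perf_per_watt = limits[170] / 170
for watts, relative_perf in limits.items():
    perf_per_watt = relative_perf / watts
    print(f"{watts:3d}W: {relative_perf:.0%} of stock performance, "
          f"{perf_per_watt / stock_perf_per_watt:.2f}x stock perf/W")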
 
Chiplets don't improve power efficiency. If anything, they have a slight efficiency cost due to losses at the interfaces. The biggest benefit of chiplets was that they lowered production and R&D costs. AMD's been able to hit server, desktop, laptop, and workstation with an R&D team about 1/10th the size of Intel's. The end result was that a 36-core Ice Lake CPU (monolithic 10nm die) cost about $1000 more than a 32-core EPYC Rome CPU (7nm compute + 14nm IO), and Rome could go all the way to 64 cores.
 
Chiplets don't improve power efficiency. If anything, they have a slight efficiency cost due to losses at the interfaces. The biggest benefit of chiplets was that they lowered production and R&D costs. AMD's been able to hit server, desktop, laptop, and workstation with an R&D team about 1/10th the size of Intel's. The end result was that a 36-core Ice Lake CPU (monolithic 10nm die) cost about $1000 more than a 32-core EPYC Rome CPU (7nm compute + 14nm IO), and Rome could go all the way to 64 cores.
You're right; I didn't mean to imply that EPYC is efficient because it has chiplets. I meant that the chiplets are individually efficient, and that implementing the processor this way was clever, as it let you use the same microarchitecture across the entire market. Desktop Ryzen is fundamentally the same as server EPYC; the latter simply has more dies.
 