GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Ehhhh, probably $30-50 too much right?
The +8 GB GDDR6 probably costs like $20, so having an $80-100 gap between the two cards would be lunacy.

But on the other hand that gap is going to happen anyway as the 9060 XT 16 GB pricing shoots up and 8 GB stays near MSRP.

LIVESTREAM CRASHED LMAO

I really don’t get why they even make the 8GB variants. They’re great if you only play Minecraft, Valorant, and LoL I guess.
Despite all the hand-wringing, it's arguably fine for many games at 1080p.
Examples:
Cyberpunk 2077 - 1080p Ultra, 1440p High
Forza Motorsport - 1440p High
Last of Us Part 1 - 1080p High
Avatar: Frontiers of Pandora - 1080p Medium
Homeworld 3 - 1080p/1440p Medium
Hogwarts Legacy - 1080p/1440p High
Starfield - Literally anything 1080p/1440p, maybe 4K Medium/High
Horizon Forbidden West - 1080p/1440p High, 4K Medium
Senua's Saga Hellblade 2 - 1080p Medium
Ghost of Tsushima - 1080p/1440p Very High, 4K High
Alan Wake 2 - 1080p High (barely)

Ratchet & Clank needs more than 8 GB or problems happen.

 
They bully each other for being poor/rigless/brown/whatever too, it's pretty funny.
But your average Wccftech poster is poor, rigless, and brown tho

I really don’t get why they even make the 8GB variants. They’re great if you only play Minecraft, Valorant, and LoL I guess.
I mean, there's a huge segment of the population that just wants to play e-sports casual games and/or low settings. The problem is more the price.
 
Yea, I get that. But having that little VRAM also limits the use of the features your card comes with. The new AMD cards boast improved RT performance, but turning on RT and using frame-gen features requires even more VRAM. But maybe $300 for a GPU you can get two solid years out of (before next-gen games built around next-gen console specs become the norm) is good value now.
 
Hopefully this is the last generation where we see overpriced 128-bit 8 GB GPUs, since with 3 GB GDDR7 chips the same bus can carry 12 GB instead.
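For anyone wondering where the 12 GB figure comes from, here's the capacity math as a quick sketch (the 32-bit-per-chip channel width and per-chip densities are standard GDDR figures, not from the posts above):

```python
# GDDR6/GDDR7 chips each sit on a 32-bit channel, so a 128-bit bus
# holds four chips; total capacity scales with per-chip density.
bus_width_bits = 128
chip_width_bits = 32
chips = bus_width_bits // chip_width_bits  # 4 chips on a 128-bit card

print(chips * 2, "GB with 2 GB GDDR6 chips")  # -> 8 GB
print(chips * 3, "GB with 3 GB GDDR7 chips")  # -> 12 GB
```

Same bus width, same board layout, 50% more memory just from denser chips.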

Another thing to point out: the 9060 XT 8 GB will have PCIe 5.0 x16 (confirmed in Gamers Nexus video). That might help it compared to x8 cards from Nvidia with 8 GB, not sure.
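For a rough sense of why lane count matters when an 8 GB card spills into system RAM, here's a back-of-the-envelope bandwidth sketch (link speeds and 128b/130b encoding are the standard PCIe figures; the lane counts are the ones mentioned above):

```python
def pcie_gb_per_s(transfer_rate_gt, lanes):
    # GT/s per lane * lanes * 128/130 encoding efficiency / 8 bits per byte
    return transfer_rate_gt * lanes * (128 / 130) / 8

print(f"PCIe 5.0 x16: {pcie_gb_per_s(32, 16):.1f} GB/s")  # ~63.0
print(f"PCIe 5.0 x8:  {pcie_gb_per_s(32, 8):.1f} GB/s")   # ~31.5
print(f"PCIe 4.0 x8:  {pcie_gb_per_s(16, 8):.1f} GB/s")   # ~15.8
```

When VRAM overflows and assets stream over the link, the x16 gen-5 card has twice the fallback bandwidth of an x8 one, which is plausibly why the 8 GB card keeps the full 16 lanes.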
 
I don't think any 8 GB card should be above $200, and I think the 8 GB variant is just more junk being dumped on the market to take advantage of less savvy consumers.
If 1080p performance ends up near a 7700 XT (when VRAM is not an issue), and the price drops to $250, it could be a good upgrade for some people (such as those with sub-8GB cards and PCIe 3.0 systems). The 7700 XT has ~69% more compute units (54 vs. 32) so it would be a little crazy if either 9060 XT matches it.
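The ~69% figure is just the CU counts from above:

```python
cu_7700xt = 54  # compute units, 7700 XT
cu_9060xt = 32  # compute units, 9060 XT
gap = (cu_7700xt - cu_9060xt) / cu_9060xt
print(f"{gap * 100:.2f}% more CUs")  # -> 68.75% more CUs
```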
 
I really don’t get why they even make the 8GB variants. They’re great if you only play Minecraft, Valorant, and LoL I guess.
Because some people have nigger-tier finances and will do anything to save $30.

Nice breakdown of FSR4. TL;DR: it's almost as good as the older CNN-based DLSS, but not as good as the newer transformer-based DLSS.

GamersNexus has a video showing the extrapolated frames are much cleaner now, with far less ghosting:
 
V-Color puts displays on memory modules
Note that the black bands on the screens are an artifact of the displays' refresh rates interacting with the camera, not a failure.

 
I can't help but notice that only one screen can be seen when the RAM is packed in like that, making the other three useless and an unneeded cost and heat source.
If there were a display that sat on top and clipped into the RAM, it would make more sense. The way it's set up also assumes you leave your computer on your desk.
 
Companies need to stop with the RAM gimmicks.
I only buy Crucial RAM for just this reason. They've got heat spreaders which is meh, but they're good-looking and don't really alter the profile of the DIMMs at all.
 
I'd buy that shit for my main rig in an instant, best case 2025.

Shame that SilverStone has bad product availability here, as well as high prices. I'd love to build a NAS with one of their cases designed for that, with the hot-swap trays and whatnot, but it's basically unobtainium, and when it is available it's so expensive it's ridiculous. Fantastic designs, but they really need a better presence in Europe.
 
I can't help but notice that only one screen can be seen when the ram is packed in like that, making the other three useless and an unneeded cost and heat source
not like 4 sticks of DDR5 are even usually usable unless you're on a HEDT platform
 