GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Are there any new releases maxing out the new GPUs at 1080p? Or is maxing them out decidedly in the realm of 4K?
VRAM is a hot subject atm. The 3070 is getting wrecked in some newer titles at 1080p ultra, which is starting to cast doubt on how well 12 GB will fare in two years if you want longevity.

Important to note that DLSS doesn't reduce VRAM usage much at all.

*Edit* It seems that Nvidia is only interested in selling a "great for right now" product. Like yes, the 4070 can get some big 1440p numbers with upscaling and frame generation... but those don't really reduce VRAM usage. If a title comes out in two years that eats up that 12 GB, it will drop to its knees. But Nvidia doesn't care about that. They want you buying a new card in two years. They are avoiding another 1080 Ti situation at all costs.

*Edit 2* Also of note: all the reviews, which have been pretty meh so far, aren't using these newer, VRAM-heavy titles. They're still using older stock such as CP2077. Like, god, they're still using Tomb Raider and Watch Dogs Legion ffs.
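If anyone wants to sanity-check the "DLSS doesn't reduce VRAM much" claim on an Nvidia card, here's a minimal sketch using the nvidia-ml-py bindings (the pynvml module) to poll the card's memory while you toggle settings in-game. It reports total allocation on the device rather than per-game usage, so background apps count too; treat the numbers as a rough proxy.

Code:
# Rough VRAM poller for Nvidia cards via NVML (pip install nvidia-ml-py).
# Reports total memory in use on the device, so background apps count too.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(2)  # poll every couple of seconds while flipping DLSS on/off
except KeyboardInterrupt:
    pynvml.nvmlShutdown()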
 
VRAM is a hot subject atm. The 3070 is getting wrecked in some newer titles at 1080p ultra, which is starting to cast doubt on how well 12 GB will fare in two years if you want longevity.

"Getting wrecked" as in "frame rates aren't playable," or "still well above my monitor's refresh rate?" FWIW, I'm finding in a fair number of games, I can't even tell the difference between max texture res and 1 level down, so I knock it down just to save the environment.

Considering how few gamers have brand-new hardware, if somebody's making a game in 2025 that won't run on a 2-year-old high-end desktop, that's somebody who's just not going to sell any software.

*Edit 2* Also of note: all the reviews, which have been pretty meh so far, aren't using these newer, VRAM-heavy titles. They're still using older stock such as CP2077. Like, god, they're still using Tomb Raider and Watch Dogs Legion ffs.

Based on Steam statistics, your GPU really only has to be able to handle Source Engine games that are old enough to start applying to college in order to sell.

Do laptops use a different driver than desktop?

No idea, and it gets better - I tried to update the AMD Software to the Adrenalin Edition, and now it's just fucking gone. I mean, it's installed on my system, the directories are full, but the application is now gone from view. Only thing available now is the AMD Bug Report Tool. AMD software is so horrible.
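On the laptop-vs-desktop driver question: OEM laptops sometimes ship vendor-customized driver packages, so a quick first check is to compare what Windows actually reports on each machine. Below is a small sketch (assuming Windows with Python and PowerShell available) that just asks WMI for the video controller's name and driver version.

Code:
# Print the GPU name and driver version Windows reports, via WMI/CIM.
# Assumes Windows with PowerShell on PATH; run on both machines and compare.
import subprocess

cmd = [
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | "
    "Select-Object Name, DriverVersion, DriverDate | Format-List",
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)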
 
"Getting wrecked" as in "frame rates aren't playable," or "still well above my monitor's refresh rate?" FWIW, I'm finding in a fair number of games, I can't even tell the difference between max texture res and 1 level down, so I knock it down just to save the environment.
Yes. As in "frame rates aren't playable".

An RX 6800 is doing ray tracing BETTER than a 3070. In some cases the 3070 is dropping to significantly worse textures to keep fps up. At other times the max fps might be in the same ballpark, but the lows drop off considerably.

8 GB is dead for midrange.

When these cards were hot, a lot of people would pass on 6800s for the 3070 because of RT and whatnot. Well, turns out that was a really bad call.
 
Everspace 2 uses between 7 and 10 GB of VRAM, even with DLSS on. Shit's wild.
I honestly didn't know it was getting this bad. Yeah, the 3070 had some struggles with Hogwarts, but now the can of worms is getting opened. It's just doing badly on new titles, consistently. Bad port or not, higher VRAM is proving to be a good predictor of performance again, and fancy AI upscaling is not going to lower usage by much.

Nvidia is selling inferior hardware backed by good (for now) software. I do not like it.

*Edit* Even on the Nvidia subreddit people are just saying turn off RT / drop down from ultra. One generation of new releases is all it took to take the 3070 from "high fps 1440p bliss" to "just drop to high 1080p, no RT bro", whereas the 6800 is aging much better.
 
Yes. As in "frame rates aren't playable".

I just clicked through to the time stamps, but it looked like all but one game averaged over 60 fps at 1080p, with lows in the 50s. The "anything less than 120 fps at ultra is unplayable" crowd is a rounding error of irrelevance. Ultra in GTA V also isn't the same as ultra textures in Cyberpunk 2077: in some of the older titles being tested, 4K was an afterthought, whereas the newest titles are more likely to have "Ultra" assets that really do make good use of 4K... in which case running Ultra at 1080p is overkill anyway.
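Since averages versus lows keep coming up: the "1% low" numbers in these charts are usually derived from frametimes rather than the fps counter. A common convention is to take roughly the 99th-percentile frametime and convert it back to fps, something like the sketch below (the frametime values are made up for illustration; capture tools export the real thing).

Code:
# Average fps vs "1% low" from a list of frametimes in milliseconds.
# One common convention: ~99th-percentile frametime converted back to fps.
# These frametime values are invented for illustration only.
frametimes_ms = [16.2, 15.9, 16.5, 17.1, 16.0, 31.8, 16.3, 16.1, 24.9, 16.4]

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

ordered = sorted(frametimes_ms)
worst = ordered[min(int(len(ordered) * 0.99), len(ordered) - 1)]  # ~99th percentile
low_1pct_fps = 1000.0 / worst

print(f"average: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")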


An RX 6800 is doing ray tracing BETTER than a 3070. In some cases the 3070 is dropping to significantly worse textures to keep fps up. At other times the max fps might be in the same ballpark, but the lows drop off considerably.

An RX 6800 has 50% more transistors than a 3070 and 14% more memory bandwidth. If it's not better at something, AMD's engineers should quit their jobs.
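For anyone who wants to check that back-of-the-envelope math, the commonly cited spec-sheet figures (GA104 at ~17.4B transistors and 448 GB/s for the 3070, Navi 21 at ~26.8B transistors and 512 GB/s for the 6800) work out to roughly those ratios:

Code:
# Back-of-the-envelope check of the transistor / bandwidth ratios, using
# commonly cited spec-sheet figures (treat them as approximate).
specs = {
    "RTX 3070 (GA104)": {"transistors_bn": 17.4, "mem_bw_gb_s": 448},
    "RX 6800 (Navi 21)": {"transistors_bn": 26.8, "mem_bw_gb_s": 512},
}

nv, amd = specs["RTX 3070 (GA104)"], specs["RX 6800 (Navi 21)"]
print(f"transistors: +{(amd['transistors_bn'] / nv['transistors_bn'] - 1) * 100:.0f}%")
print(f"memory bandwidth: +{(amd['mem_bw_gb_s'] / nv['mem_bw_gb_s'] - 1) * 100:.0f}%")

That prints about +54% and +14%, so "50% more transistors" is if anything slightly understating it.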

8 GB is dead for midrange.

When these cards were hot, a lot of people would pass on 6800s for the 3070 because of RT and whatnot. Well, turns out that was a really bad call.

Paying a price premium for a logo that doesn't deliver additional performance has always been a bad call. Gamers usually seem to have really weird ideas about computer hardware and will drop enormous amounts of money on placebos.

(Source: I bought an overpriced placebo for no real reason other than it is small and has a skull on it.)
 
An RX 6800 has 50% more transistors than a 3070 and 14% more memory bandwidth. If it's not better at something, AMD's engineers should quit their jobs.
You missed the point. The 6800 was always better at rasterization. Now it's better at ray tracing. You know, the thing Nvidia has been banking everything on?

The script has been flipped just one generation later because of VRAM limitations.
 