GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

on one hand nshitia actually appears to be dishing out the goods to some extent... but they're killing support for a 10 year old os and i'm the only one who cares... hummmm
 
$499 -> $699
$699 -> $999
$1499 -> $1999

The OEMs are going to jack up the prices of the cards. Is the 3090 the only one that has that new connector on it?
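
Just to put rough numbers on those jumps (my own quick math, nothing official):

```python
# Rough percent increases for the price jumps listed above (my own math, nothing official).
jumps = [(499, 699), (699, 999), (1499, 1999)]

for old, new in jumps:
    pct = (new - old) / old * 100
    print(f"${old} -> ${new}: +{pct:.0f}%")

# Roughly +40%, +43% and +33%, before any OEM/AIB markup gets stacked on top.
```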
 
I was toying with the idea of getting an Intel NUC Hades Canyon...
Totally wacky build - a hybrid single-board, Intel and AMD.
Price for performance just wasn't there. After adding proper memory and a decent SSD, you're topping $1100. That's when I stopped.
Nothing is jaw-dropping about it, except the price.

For LESS than $1100, I built an awesome full tower. I went AMD, no probs with drivers - but I also have NVIDIA on 2 other PCs. I decided to put down the torches and buy whatever is on sale/best value.

For price, Intel/NVIDIA always costs more, IMHO
 
Oof. Rough for people like me who bought a 2080 Ti 14 months ago. Seems like a big generational jump. They've also priced me out of the top end card - the RRP for the 3090 is something like AU$2,450, whereas the 2080 Ti was closer to AU$1,850. Still, I imagine the better last gen cards will run games at 1440p more than adequately for a while yet...
 
Oof. Rough for people like me who bought a 2080 Ti 14 months ago. Seems like a big generational jump. They've also priced me out of the top end card - the RRP for the 3090 is something like AU$2,450, whereas the 2080 Ti was closer to AU$1,850. Still, I imagine the better last gen cards will run games at 1440p more than adequately for a while yet...
>TFW you got a 2060 super in January
 
Holy shit was I wrong, can't believe it's a legit old school improvement between generations - the least expensive GPU would still more than double what I've got. Fuck, the 3090 would be like quadruple; I might legitimately need to get more RAM or a new motherboard or something to avoid bottlenecks somewhere. Imagine showing the Pixar animators that 25 years later someone could buy better off-the-shelf processors and make their own Toy Story. Animating 24 frames at 4K could literally be done in real time on this.
 
Imagine showing the Pixar animators that 25 years later someone could buy better off-the-shelf processors and make their own Toy Story. Animating 24 frames at 4K could literally be done in real time on this.
You're right about the magnitude of the performance bump, but even an animation of quality similar to the original Toy Story probably couldn't be rendered in real-time using one of these new GPUs. The real-time ray tracing we've seen hitherto is typically limited to one or two bounces (imagine a ray extends from the light source, bounces off the first surface it hits, then subsequent rays are drawn from the site of that bounce, etc.). The main reason Toy Story still looks so good isn't just high-resolution textures or physics simulation, but that the number of bounces is really, really high -- like 20 bounces (this is just my guess), and each bounce is more demanding in terms of performance than the preceding one.

You could certainly render something significantly better-looking than like, the Jimmy Neutron TV show, though.
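
To put a toy number on why the bounce count matters so much - a completely made-up sketch, not how any real renderer samples rays, just showing how the ray count explodes with depth:

```python
# Toy sketch (made-up branching factor, not a real renderer): count how many rays
# get traced for a single primary ray when every hit spawns a few secondary rays.
def rays_traced(bounces, secondary_rays_per_hit=4):
    if bounces == 0:
        return 1
    # one ray at this level, plus everything spawned from the next bounce
    return 1 + secondary_rays_per_hit * rays_traced(bounces - 1, secondary_rays_per_hit)

for bounces in (1, 2, 8, 20):
    print(bounces, rays_traced(bounces))

# 1-2 bounces stays cheap; 20 bounces with branching is astronomically more work,
# which is why real-time ray tracing keeps the bounce count so low.
```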
 
>TFW you got a 2060 super in January
I got a 2070 Super about 4 months ago, only because I just needed to retire my 5 year old jet engine called a Titan X, so I know how you feel. I haven't even broken in the card yet either.

Hopefully by the time I'm ready to move to this series, Nvidia releases the 3080ti cause the gap between the 3080 and 3090 tells me there will 100% be one.
 
I got a 2070 Super about 4 months ago, only because I just needed to retire my 5 year old jet engine called a Titan X, so I know how you feel. I haven't even broken in the card yet either.

Hopefully by the time I'm ready to move to this series, Nvidia releases the 3080ti cause the gap between the 3080 and 3090 tells me there will 100% be one.
3070, 3070 Super, 3080, 3080 Super, 3080 Ti, 3090, 3090 Super, 3090 Ti.

They've left enough gaps for AMD to try and market into those spots in between, and then Nvidia can try and bury them after AMD's release.
 
I have a 1080 and a 7700K, will a 3080 bottleneck me or should I be fine?

Or is this going to be something where I have to wait and see when other people do the guinea pig work?

It scales two ways: your current CPU will be a bottleneck for frame rate (it won't run at 300fps or whatever Nvidia said), but you can compensate for that by piling on more work for the GPU until they both struggle to go higher than X fps. A 7700K and a 3080 (if the performance claims are true) means you can just go nuts with resolution and graphics options. Fuck it, go into the Nvidia control panel and force a bunch more bullshit to run.
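
A rough way to picture it (made-up frame times, just illustrating that the slower of the two sets the frame rate):

```python
# Back-of-the-envelope bottleneck model with made-up numbers: whichever of the CPU or
# GPU takes longer per frame caps the frame rate.
def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 7.0  # hypothetical 7700K ceiling of ~143 fps worth of CPU work per frame
for label, gpu_ms in [("1080p", 3.0), ("1440p maxed", 7.0), ("4K + forced extras", 14.0)]:
    print(label, round(effective_fps(cpu_ms, gpu_ms)), "fps")

# At low settings you're CPU-bound; crank resolution/settings and the GPU becomes the
# limit instead, so the CPU "bottleneck" stops being the thing you notice.
```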

If nothing else you will likely have a stellar card until 2025, with the exception of VRAM; it's about time for that to go up now that the new consoles are arriving. PC ports usually add options to run at higher graphical settings, with consoles often being the equivalent of medium-high, so if a new PS5/Xbox game uses 11GB of RAM for graphics, what would be required of a PC graphics card?

Just speculation, but a truly new hardware cycle is about to begin.

You're right about the magnitude of the performance bump, but even an animation of quality similar to the original Toy Story probably couldn't be rendered in real-time using one of these new GPUs. The real-time ray tracing we've seen hitherto is typically limited to one or two bounces (imagine a ray extends from the light source, bounces off the first surface it hits, then subsequent rays are drawn from the site of that bounce, etc.). The main reason Toy Story still looks so good isn't just high-resolution textures or physics simulation, but that the number of bounces is really, really high -- like 20 bounces (this is just my guess), and each bounce is more demanding in terms of performance than the preceding one.

You could certainly render something significantly better-looking than like, the Jimmy Neutron TV show, though.

Toy Story wasn't raytraced though. IIRC Pixar relied heavily on raycasting at the time; Renderman really wasn't what it is today. I can't produce any evidence for this, just going off old memories, but in my opinion the lighting of the characters, especially in the street scene, shows a pretty obvious three-light setup that was often used with raycasting to compensate for not having all those ray bounces and everything they enable when it comes to realistic light.
[Attached image: toy-story-1024x603.jpg]
Where does the strong purple fill come from? And look at the stems of those trees - that's OG Xbox stuff - plus they don't cast shadows like you would expect them to.

If someone wanted the best quality raytracing, Mental Ray was used. Coincidentally, Nvidia ended up owning Mental Ray, hm...
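
Roughly what I mean by a three-light setup, sketched out with made-up numbers - just summing direct key/fill/rim contributions with no bounces at all, which is how a strong purple fill can show up with no visible source in the scene:

```python
# Made-up three-light sketch (not RenderMan, no bounces): the final shade is just the
# sum of direct Lambert contributions from a key, fill and rim light.
def lambert(normal, light_dir, light_color, intensity):
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * intensity * n_dot_l for c in light_color)

# (name, roughly normalized direction, color, intensity) - all numbers invented
lights = [
    ("key",  (0.0, 0.7, 0.7),   (1.0, 0.95, 0.8), 1.0),  # warm main light
    ("fill", (-0.7, 0.0, 0.7),  (0.6, 0.4, 0.9),  0.4),  # purple-ish fill, no in-scene source
    ("rim",  (0.0, -0.5, -0.9), (0.9, 0.9, 1.0),  0.3),  # back/rim light
]

normal = (0.0, 0.0, 1.0)  # surface facing the camera
shade = (0.0, 0.0, 0.0)
for _, direction, color, intensity in lights:
    shade = tuple(s + c for s, c in zip(shade, lambert(normal, direction, color, intensity)))
print(shade)
```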
 
It scales two ways: your current CPU will be a bottleneck for frame rate (it won't run at 300fps or whatever Nvidia said), but you can compensate for that by piling on more work for the GPU until they both struggle to go higher than X fps. A 7700K and a 3080 (if the performance claims are true) means you can just go nuts with resolution and graphics options. Fuck it, go into the Nvidia control panel and force a bunch more bullshit to run.
I don't care so much about fps past 144 as I don't even have a monitor higher than 60hz right now, so bottlenecking on fps is completely fine with me. I think I'll get the new monitor and then when benchmarks of different configurations get tested I'll be able to see if I need a full upgrade or not. I do very much appreciate the prices that I've been seeing.
 
Am I the only one still ass blasted that pricing for GPUs has gone up in general? Years ago, a 70-tier card could be had for the price of a 50 Ti. I guess masks aren't the only part of this new normal. Every tech tuber is celebrating the pricing.

Toy Story wasn't raytraced though. IIRC Pixar relied heavily on raycasting at the time; Renderman really wasn't what it is today. I can't produce any evidence for this, just going off old memories, but in my opinion the lighting of the characters, especially in the street scene, shows a pretty obvious three-light setup that was often used with raycasting to compensate for not having all those ray bounces and everything they enable when it comes to realistic light.
A few years back, IIRC, Renderman sunset its REYES rendering, which is different from ray tracing. It's now, AFAIK, similar to Arnold in that it's a full-on brute force ray tracer with no GI hacks.
 
Am I the only one still ass blasted that pricing for GPUs has gone up in general? Years ago, a 70-tier card could be had for the price of a 50 Ti. I guess masks aren't the only part of this new normal. Every tech tuber is celebrating the pricing.

It really boggles the mind. The Voodoo 2 was something no one could compete with and it launched at $249 for the 8MB version; buy another one on the cheap a while later and their SLI implementation actually doubled performance. A couple of years later the GeForce 256 DDR, the first T&L card on the market, launched at "a steep asking price of $300". Like you say, what do you get for $300 today and how does it compare to the high-end?

CPUs have gone the other way with their pricing; a Ryzen 7 3700 can be had for $300 and that's a lot of bang for your buck.
 
It really boggles the mind. The Voodoo 2 was something no one could compete with and it launched at $249 for the 8MB version; buy another one on the cheap a while later and their SLI implementation actually doubled performance. A couple of years later the GeForce 256 DDR, the first T&L card on the market, launched at "a steep asking price of $300". Like you say, what do you get for $300 today and how does it compare to the high-end?

CPUs have gone the other way with their pricing; a Ryzen 7 3700 can be had for $300 and that's a lot of bang for your buck.

I remember when I bought my 970 five years ago, a "mid tier" card for $300... I thought that was way too much for what you get, and it turns out nVidia lied about the 4GB of video RAM besides.
 
Here's a hot take: None of you fuckers have any real use for the cards that are coming out, most new games are garbage anyway, the GTX 780 Ti is still good enough for most things, 4K is useless, hell I'm still more than happy with 720p. Literally the only people who need these cards are people with VR and people working on machine learning.
 
Here's a hot take: None of you fuckers have any real use for the cards that are coming out, most new games are garbage anyway, the GTX 780 Ti is still good enough for most things, 4K is useless, hell I'm still more than happy with 720p. Literally the only people who need these cards are people with VR and people working on machine learning.
While you're not wrong, there is the mentality of "if it's affordable why not get it?", as well as future-proofing against the ever increasing graphical requirements and advancements. I do agree that 4K is a bit of a meme, but 1080p/144Hz is increasingly achievable.
 
While you're not wrong, there is the mentality of "if it's affordable why not get it?", as well as future-proofing against the ever increasing graphical requirements and advancements. I do agree that 4K is a bit of a meme, but 1080p/144Hz is increasingly achievable.
'If it's affordable why not get it' is just how you end up spending money on things you don't need and may never need. Instead of buying the affordable thing now, it'd be better to wait until the future comes and then buy exactly what you need, when you need it.
 