on one hand nshitia actually appears to be dishing out the goods to some extent... but they're killing support for a 10 year old os and i'm the only one who cares... hummmm
>TFW you got a 2060 super in January

Oof. Rough for people like me who bought a 2080 Ti 14 months ago. Seems like a big generational jump. They've also priced me out of the top-end card - the RRP for the 3090 is something like AU$2,450, whereas the 2080 Ti was closer to AU$1,850. Still, I imagine the better last-gen cards will run games at 1440p more than adequately for a while yet...
Imagine showing the Pixar animators that 25 years later someone could buy better processors off the shelf and make their own Toy Story. Animating 24 frames at 4K could literally be done in real time on this.
>TFW you got a 2060 super in January

I got a 2070 Super about 4 months ago, only because I just needed to retire my 5 year old jet engine called a Titan X, so I know how you feel. I haven't even broken in the card yet either.
>I got a 2070 Super about 4 months ago, only because I just needed to retire my 5 year old jet engine called a Titan X, so I know how you feel. I haven't even broken in the card yet either.

3070, 3070 Super, 3080, 3080 Super, 3080 Ti, 3090, 3090 Super, 3090 Ti.
Hopefully by the time I'm ready to move to this series, Nvidia will have released the 3080 Ti, because the gap between the 3080 and 3090 tells me there will 100% be one.
I have a 1080 and a 7700K, will a 3080 bottleneck me or should I be fine?
Or is this going to be something where I have to wait and see when other people do the guinea pig work?
You're right about the magnitude of the performance bump, but even an animation of quality similar to the original Toy Story probably couldn't be rendered in real time using one of these new GPUs. The real-time ray tracing we've seen so far is typically limited to one or two bounces (imagine a ray extends from the light source, bounces off the first surface it hits, then subsequent rays are drawn from the site of that bounce, and so on). The main reason Toy Story still looks so good isn't just high-resolution textures or physics simulation, but that the number of bounces is really, really high -- like 20 bounces (this is just my guess) -- and each bounce is more demanding in terms of performance than the preceding bounce.
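To put some intuition behind "each bounce is more demanding than the last": in a branching (Whitted-style) tracer, every hit can spawn several secondary rays, so the ray count grows geometrically with depth. Here's a toy sketch with invented numbers -- production path tracers avoid this blow-up with one-path-per-sample tricks and Russian roulette, so treat it as illustration only:

```python
# Toy illustration, not any real renderer's algorithm: if every hit spawns
# several secondary rays, the total number of rays traced grows geometrically
# with bounce depth. All numbers here are invented for the example.

RAYS_PER_BOUNCE = 8  # hypothetical secondary rays spawned at each hit

def rays_traced(depth: int) -> int:
    """Total rays traced when one primary ray is followed for `depth` bounces."""
    total = 0
    live = 1  # start with a single primary ray
    for _ in range(depth):
        total += live
        live *= RAYS_PER_BOUNCE  # each live ray spawns more at the next bounce
    return total + live

for depth in (2, 5, 20):
    print(f"{depth:>2} bounces -> {rays_traced(depth):,} rays")
```

Two bounces stays in the tens of rays; twenty bounces is on the order of 10^18, which is why the jump from "RTX on" demos to Toy Story-grade lighting is so much bigger than it sounds.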
You could certainly render something significantly better-looking than like, the Jimmy Neutron TV show, though.
>It scales two ways: your current CPU will be a bottleneck for frame rate (won't run at 300fps or whatever Nvidia said), but you can compensate for that by piling on more work for the GPU until they both struggle to go higher than X fps. A 7700K and a 3080 (if performance claims are true) means that you can just go nuts with resolution and graphics options. Fuck it, go into the Nvidia control panel and force a bunch more bullshit to run.

I don't care so much about fps past 144 as I don't even have a monitor higher than 60Hz right now, so bottlenecking on fps is completely fine with me. I think I'll get the new monitor, and then when benchmarks of different configurations get tested I'll be able to see if I need a full upgrade or not. I do very much appreciate the prices that I've been seeing.
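A toy model of the quoted point, with invented numbers (not real 7700K or 3080 figures): the frame rate you get is roughly the slower of what the CPU can feed and what the GPU can draw at the chosen resolution, so cranking resolution only pushes on the GPU side until it becomes the limiter:

```python
# Toy model of the CPU/GPU bottleneck trade-off described above. The numbers
# are invented, not benchmarks of any real 7700K or RTX 3080; the point is
# just that the fps you see is roughly min(CPU-bound fps, GPU-bound fps),
# and only the GPU side moves when you raise the resolution.

CPU_FPS_CAP = 160        # hypothetical max fps the CPU can feed the GPU
GPU_MPIX_PER_SEC = 900   # hypothetical GPU throughput in megapixels/second

def effective_fps(width: int, height: int) -> float:
    gpu_fps = GPU_MPIX_PER_SEC * 1e6 / (width * height)
    return min(CPU_FPS_CAP, gpu_fps)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    bound = "CPU" if effective_fps(w, h) == CPU_FPS_CAP else "GPU"
    print(f"{w}x{h}: ~{effective_fps(w, h):.0f} fps ({bound}-bound)")
```

With these made-up numbers the CPU caps you at 160 fps at 1080p and 1440p, and only at 4K does the GPU become the limiter -- which is the "pile on resolution until they both struggle" idea in the quote.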
>Toy Story wasn't raytraced though. iirc Pixar relied heavily on raycasting at the time, Renderman really wasn't what it is today. I can't produce any evidence for this, just going off old memories, but in my opinion the lighting of the characters, especially in the street scene, shows a pretty obvious three-light setup that was often used in raycasting to compensate for not having all those ray bounces and everything they enable when it comes to realistic light.

A few years back, IIRC, Renderman sunset its REYES rendering, which is different from ray tracing. It's now, AFAIK, similar to Arnold in that it's a full-on brute force ray tracer with no GI hacks.
Am I the only one still ass blasted that pricing for GPUs has gone up in general? Years ago, a 70-tier card could be had for the price of a 50 Ti. I guess masks aren't the only part of this new normal. Every tech tuber is celebrating the pricing.
It really boggles the mind. The Voodoo 2 was something no one could compete with, and it launched at $249 for the 8MB version; buy another one on the cheap a while later and their SLI implementation actually doubled performance. A couple of years later the GeForce 256 DDR, the first T&L card on the market, launched at "a steep asking price of $300". Like you say, what do you get for $300 today, and how does it compare to the high-end?
CPUs have gone the other way with their pricing; a Ryzen 7 3700X can be had for $300, and that's a lot of bang for your buck.
>Here's a hot take: None of you fuckers have any real use for the cards that are coming out, most new games are garbage anyway, the GTX 780 Ti is still good enough for most things, 4K is useless, hell I'm still more than happy with 720p. Literally the only people who need these cards are people with VR and people working on machine learning.

While you're not wrong, there is the mentality of "if it's affordable, why not get it?" as well as future-proofing against the ever-increasing graphical requirements and advancements. I do agree that 4K is a bit of a meme, but 1080p 144Hz is increasingly achievable.
>While you're not wrong, there is the mentality of "if it's affordable, why not get it?" as well as future-proofing against the ever-increasing graphical requirements and advancements. I do agree that 4K is a bit of a meme, but 1080p 144Hz is increasingly achievable.

"If it's affordable, why not get it" is just how you end up spending money on things you don't need and may never need. Instead of buying the affordable thing now, it'd be better to wait until the future comes and then buy exactly what you need when you need it.