Nvidia RTX series

Isn't ray tracing how Wolfenstein did it?

I can't remember what the docs say specifically, but ray tracing has been a known technique for a long time; it's just that we never had the right processor designs to support it outside the CPU (so we couldn't do anything impressive with it in real time because it was too slow). These new designs accelerate ray tracing natively and take full advantage of the technique.

Wolfenstein used a sweeping technique that was essentially a simplification of ray tracing. Modern ray tracing sends rays out from the camera through a 2D grid of pixels into a 3D scene, while Wolf 3D's renderer used what was known as ray casting: it sent rays out from the camera along a 1D line across a 2D map in order to render vertical slices of the environment with faked perspective. Wolf 3D is essentially a 2D game, just as DOOM is; clever math and camera tricks make it seem as though it isn't.

Here's a gif demonstrating it, courtesy of Wikipedia:


And here's a video demoing it in action.
Note that the sprites are rendered with the painter's algorithm instead, as the environment is probably stored in an acceleration structure that's awkward to modify, and it shouldn't be mixed up with things that move around a lot, like soldiers.
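
In case "painter's algorithm" is unfamiliar, here's a rough Python sketch of the idea; the sprite list and distances are made up, and real Wolf 3D blits scaled sprite columns rather than printing anything:

```python
# Painter's algorithm in miniature: draw sprites farthest-first so that
# nearer ones overwrite farther ones. No depth buffer needed.
# The sprite list is hypothetical; each entry is (distance from camera, name).
sprites = [(7.0, "dog"), (2.5, "guard"), (4.2, "barrel")]

for distance, name in sorted(sprites, reverse=True):
    # stand-in for blitting a scaled sprite into the framebuffer
    print(f"draw {name} at distance {distance}")
```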

The scope of ray tracing in Wolf 3D was vastly constrained in order to make it practical to run on the computers of the time, but it's definitely similar to ray tracing. Full ray tracing typically involves a lot more complexity, though: rays that bounce, refract, or spawn additional rays are common. These days, if you wanted to render something like Wolfenstein you'd just use rasterization, as ray tracing is overkill for something so simple, and constraining it à la Wolfenstein makes your renderer efficient but inflexible.
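
To make the "rays spawning rays" point concrete, here's a minimal Python sketch; the single mirror sphere, the flat sky color, and the 0.9 dimming factor are all made up for illustration, and real renderers are structured nothing like this:

```python
import math

# Hypothetical scene: one mirror sphere hanging in front of the camera.
CENTER, RADIUS = (0.0, 0.0, -3.0), 1.0

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction):
    """Closed-form ray/sphere test: returns distance t, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, CENTER))
    b = 2.0 * dot(direction, oc)            # direction is assumed unit length
    disc = b * b - 4.0 * (dot(oc, oc) - RADIUS * RADIUS)
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0        # nearest intersection
    return t if t > 1e-4 else None          # ignore self-hits after a bounce

def trace(origin, direction, depth=0):
    """A hit spawns a reflection ray: the recursion ray casting never needs."""
    t = hit_sphere(origin, direction)
    if t is None or depth > 3:              # missed, or bounce budget spent
        return (0.4, 0.6, 1.0)              # flat sky color
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((h - c) / RADIUS for h, c in zip(hit, CENTER))
    k = 2.0 * dot(direction, normal)
    reflected = tuple(d - k * n for d, n in zip(direction, normal))
    # dim each bounce a little, then follow the spawned ray
    return tuple(0.9 * c for c in trace(hit, reflected, depth + 1))

print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))   # one camera ray
```

The point is just the recursion in trace(): Wolf 3D's caster stops at the first wall it hits, while a tracer may keep following new rays.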


-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

I'm probably going to get a high-end RTX within the next couple of years once I have some more income. Definitely going to wait on it, though, until the prices come down and the kinks are worked out.
I don't get why people preorder these things. I get that it's exciting, and I am quite excited, but it's probably going to have bad support at launch, and there'll only be like 3 games that utilize it; why pay extra for that unless you're a developer or have cash to burn?
 
You wouldn't get the new RTX cards just for a few shiny add-ons; that's not what people picked up 980s or 1080s for (there are companies out there that do hair better than those cards could). You get one for the performance boost. Only a few games are going to make ray tracing look good: the ones with a lot of lighting effects and materials on screen that they want to look good.
 
I'm gradually buying upgrade components, and I'm wondering if I should use this opportunity to wait and hope the 1080 goes down in price, though everyone seems to be thinking that, so there'll probably be a price spike if anything.

I bought an i5-8600K (god please don't tell me I fucked up), but other than that I'm trying to be flexible with components.

It's a pity that everything PC- or gaming-related is ridiculously expensive in the UK.
 
I'm wondering if I should use this opportunity to wait and hope the 1080 goes down in price,
I'd say go for it, but buy new if possible. I wouldn't trust used cards after the Bitcoin mining craze that went on.

I'm rocking a 1080 Ti with an i5-6600K and I'm still getting 130-144 fps in games like Witcher 3 at ultra settings at 1080p (I'm getting a 4K monitor later this year, pls no bully my pleb resolution). You'll be fine.
 
I mean, a lot of the increase is down to crypto mining, right?

They invented those thingamajigs (ASICs) that are better for dedicated mining, but any impact remains to be seen.

I reckon you'll have lots of people who just have to have the newest thing, and they'll totally buy these new cards now, but until there's a massive change in how good it is, and that change becomes cheap, most people will probably wait a while.

(I'm getting a 4K monitor later this year, pls no bully my pleb resolution). You'll be fine.

My monitor is so bad that there's almost no point in actually upgrading, to be honest.

The only component choice I'm still wondering about is the mobo, as the CPU isn't compatible with my current one. I mean, mostly I don't care; does anyone have any recommendations?

I'd love to swap the case out because it weighs A GRAND NUMBER OF POUNDS, but it's also very good at sound dampening, so I don't think I'm going to bother with that.
 
I'm gradually buying upgrade components, and I'm wondering if I should use this opportunity to wait and hope the 1080 goes down in price, though everyone seems to be thinking that, so there'll probably be a price spike if anything.

It's a pity that everything PC- or gaming-related is ridiculously expensive in the UK.
They have been going down in price, at least in Canada. The UK and Europe are great for PC parts... if you're a builder. I've seen cheap systems sell for thousands there before you even liquid-cool them.
 
I'm actually kinda pumped for AMD's 7nm cards.

They'll probably be cheaper and comparable in performance, if not better, according to this article.
 
Is This the Beginning of the End for Crypto GPU Price Hiking? - Gigabyte AORUS RX 580 for $209 on Amazon
by Raevenlord Thursday, September 6th 2018 16:37

This news post really does serve as a symbolic gesture for the state of the GPU market in recent months (going on a year now, really). Mainly due to shortages of GPUs and increased demand from crypto users looking to farm and mine their way to financial security, it hasn't been the best time to buy a new or even second-hand latest-generation graphics card. Now, however, we're seeing something that hasn't been seen for a long time: actually decent pricing on graphics cards. It's still a process, but in that process, this Gigabyte AORUS RX 580 deal at Amazon is the proverbial needle in a haystack.

The Gigabyte AORUS Radeon RX 580 8GB (GV-RX580AORUS-8GD) is available now from Amazon for $209.99 (after a $20 mail-in rebate). If you're in the States and looking for a graphics card that guarantees 1080p performance at max settings, or a decent 1440p experience with somewhat reduced IQ, this could be the graphics card for you; judging by its Best Seller status, it may very well be the same for many other customers. And for now, there's even stock; how about that. If you're shopping from outside the States, you may still find this deal mesmerizing (here in Portugal a similar graphics card would go for $410, so... yeah, take the deal). Here's hoping this is a sign of the times to come, and not just a freak event.

 
Don't count on that card being for consumers. It lines up with Vega 20, which is a pro-only card. I don't expect any new consumer GPUs from AMD until mid-to-late 2019.

Ryzen is awesome. Its gaming performance is decent (lower maximum FPS, but better minimum FPS and lower latencies), and its productivity performance embarrasses Intel. If Ryzen 2 comes out with another 5-10% IPC increase to match Coffee Lake, I'll be building a Ryzen 2 rig to replace my wheezy old Ivy Bridge build.

Either that, or they get overclocking down pat; a current Ryzen 7 2700 at 5 GHz would be a beast. Either way, watching Intel get BTFO'd with no products to compete is hilarious.
 
Don't count on that card being for consumers. It lines up with Vega 20, which is a pro-only card. I don't expect any new consumer GPUs from AMD until mid-to-late 2019.
Works for me; I need time to save up some money. Otherwise I'm gonna need to part with a kidney...
 
All the hardware sites are fapping themselves silly over the RTX series, but pay little attention to the fact that the cards, despite being at best 40% faster than the equivalent GTX 10 series, cost almost twice as much.
 
Hardware sites like to forget that most people have budgets and don't auto-buy the most expensive card they can.

But there are also the "whales" who will buy the top-end RTX 2080 Ti when it arrives because they insist on having the most powerful system they possibly can. In that way, things like a top-end graphics card bought new, or a brand-new car (buying a new car is for cucks; as soon as you get one off the forecourt it loses half its value), justify their price by the cockwaving value you can ascribe to them. Economists call these things Veblen goods. I call it the sucker tax myself.
 
HardOCP is like that. Then again, they do advertise themselves as being a hardcore enthusiast site. I mostly go there for the reviews.
 
Wolfenstein used a sweeping technique that was essentially a simplification of ray tracing. Modern ray tracing sends rays out from the camera through a 2D grid of pixels into a 3D scene, while Wolf 3D's renderer used what was known as ray casting: it sent rays out from the camera along a 1D line across a 2D map in order to render vertical slices of the environment with faked perspective. Wolf 3D is essentially a 2D game, just as DOOM is; clever math and camera tricks make it seem as though it isn't.

Wolf3D was a very limited ray caster. Using the gif you posted as an example (and this is mostly what little I can remember from writing a similar renderer a long time ago), it cast one ray per pixel of horizontal resolution to determine the height of that column of pixels through a LUT: take the distance from the camera to the intersection point, look it up, and see how many pixels should be drawn. It used that to set up how the frame would be drawn, because drawing vertical lines is a no-no. That's it, and that's why it is technically a ray caster while not bearing much resemblance to the traditional offline kind.
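
Here's a tiny Python sketch of that one-ray-per-column idea, in case it helps; the map is made up, and real Wolf 3D used fixed-point math, lookup tables, and a smarter grid-stepping walk rather than the brute-force marching below:

```python
import math

# Hypothetical map: '#' is wall, '.' is empty floor.
MAP = ["#########",
       "#.......#",
       "#..##...#",
       "#.......#",
       "#########"]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March one ray from (px, py) until it enters a wall cell."""
    dist = 0.0
    while dist < max_dist:
        dist += step
        x = px + math.cos(angle) * dist
        y = py + math.sin(angle) * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
    return max_dist

def column_heights(px=2.5, py=2.5, facing=0.0, fov=math.pi / 3,
                   width=20, screen_h=64):
    """One ray per screen column; wall height shrinks with distance."""
    heights = []
    for col in range(width):
        ray = facing - fov / 2 + fov * col / (width - 1)
        dist = cast_ray(px, py, ray) * math.cos(ray - facing)  # fix fisheye
        heights.append(min(screen_h, int(screen_h / dist)))    # Wolf used a LUT
    return heights

print(column_heights())
```

Each entry in the result is the height of one vertical wall slice; drawing those slices column by column is the whole frame.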

Ray casting in itself is completely different from ray tracing and both are different from rasterization.

Real-time ray tracing is something a lot of people have messed around with for a long time, and in the late '90s there were a bunch of games in development that claimed to use ray tracing or other fanciful effects. Not much came of it, though; a couple of demos were released, like this one from 2003:
 
Real-time ray tracing is something a lot of people have messed around with for a long time, and in the late '90s there were a bunch of games in development that claimed to use ray tracing or other fanciful effects. Not much came of it, though; a couple of demos were released, like this one from 2003:

That's quite impressive. Shame it never got made into a fully fledged product, though I expect if the engine had to deal with lots of moving geometry it would complicate things significantly. Perhaps that's why it remained a demo.
 
That's quite impressive. Shame it never got made into a fully fledged product, though I expect if the engine had to deal with lots of moving geometry it would complicate things significantly. Perhaps that's why it remained a demo.

It was a competition/compo demo, the second of the "nature" demos; I think there were three in total. Click "show more" on YouTube and you get the link to Pouët, where you can download it if you want to run it yourself. There had been others before it; a sort of running joke stretching back to Amiga demos, maybe even C64 demos, was "and now: real-time ray tracing!"

:autistic:
Back in ~2003, a Russian guy was actually making a first-person shooter that was ray traced; it was free and he was making it in his spare time. It was pretty cool to see how it was constructed. Everything was built from perfect spheres, because spheres look exactly the same from every angle and ray tracers aren't bothered by non-polygonal graphics, which allowed him to make assumptions that really sped things up. When I say that everything was made of spheres I mean everything, even the ground, so levels were planetoids.

The basic idea was similar to things like Ecstatica's ellipsoid technique or the terrible game Ballz on the SNES/Genesis. I also have the demo of an unreleased game called Seed somewhere; it did some really cool things for something that came out in '98 or so. It plays like complete shit, but it was a tech demo meant to attract a publisher.
If you're interested I can try to dig it out and upload it somewhere; it's only 4 megabytes or something like that.
:autistic:

So, real-time ray tracing has been the holy grail for a long time, a prestige thing, but it has been a fool's errand for a lot of reasons. It's neat that it's happening now, but I don't expect too much from it. I don't know how flexible it is, haven't really looked into it, but maybe it can be used for sound reflections? That would make me interested in getting a card like that.
 
So... still running an Intel 2500K with a GTX 970. Is there any reason for me to upgrade just the card?
 