AMD has somewhat better drivers for Linux, so I guess that's something.
Who'd have thought that if you release specs to the open source community, they'd make your proprietary drivers look like a half-assed joke in about two years flat.
I had a 5850 a few years ago, about the time their open source drivers started getting good. I honestly hope AMD was embarrassed: the proprietary fglrx driver only supported OpenGL 2 and couldn't run BioShock Infinite. Fast forward a couple of months, and the open source driver actually ran BioShock Infinite, with a big performance lead over the stuff that did run on the proprietary driver.
Even the Windows drivers lately seem to be a lot better than they were in the past.
AMD hasn't been competitive with Nvidia in price/performance, raw performance, or driver quality since about 2015, unless you go with junk sub-$100 GPUs. The only time in recent memory I went with AMD was during the bitcoin mining craze, when I needed a GPU and even shit cards were ridiculously expensive, and I regretted it. I really wish AMD were more competitive; I remember having an Athlon 64 X2 and a Radeon 9800 XT back in my college days, and it kicked total ass, but as a boring adult I always go Intel and Nvidia.
Eh, I think the Nvidia/AMD issue is a little more complex than that.
In the past, ATI/AMD's solution seemed to be to throw hardware at the problem. Of course Nvidia does that too, but they understand software is also important; a GPU (or any computer hardware) is fucking useless without software.
The old 5850 I used to have, despite being somewhat older, was still fairly valuable because it was really fucking good at crypto mining. It had assloads of raw compute power for the time compared to Nvidia.
Nvidia spends a lot of money on developers, pushing their "features" and giving them cash to ensure they support HairWorks/PhysX/RTX/etc. They were smart back around 2012-2015, partnering up with Unreal and getting a lot of their tech baked into UE3, as a lot of the big titles around that time (BioShock, Borderlands, etc.) used Unreal Engine.
I think AMD is getting competitive (definitely in CPUs, possibly in GPUs), but more importantly, right now Nvidia is fucking up.
They currently sell three goddamned variants of the 1660: the regular 1660, the 1660 Super, and the 1660 Ti. They're binning the chips and carving the same price bracket into slices across their own models, and they're going to wind up competing with themselves.
They missed out on being the GPU in the current-gen PlayStation and Xbox (both AMD), though they're in the Switch.
They're getting comfortable, like Intel, on the whole "fastest" thing, and they bet a little too hard on the RTX thing.
Yeah, realtime raytracing has been a dream for years, but it's not fully baked. Another generation or two and it may be amazing.
* I'm not 100% positive on all of this. I know a bit about graphics stuff but am by no means an authority; this could be entirely wrong, as it's based on my very basic understanding of graphics tech.
In modern games it's almost like FXAA: it's a fast approximation of raytracing for the most part (a small number of rays, plus a noise filter to fill in the gaps and average things into something that looks fairly good without tanking performance), with traditional rendering tech doing the vast majority of the work and a raytracing pass on top.
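A rough toy sketch of that "few rays plus a filter" idea, purely illustrative Python with nothing to do with how any real RTX or denoiser pipeline is actually implemented: shade only a sparse subset of pixels with an "expensive" ray evaluation, then average nearby samples to fill the gaps.

```python
import random

WIDTH, HEIGHT = 16, 8
SAMPLE_EVERY = 4  # only 1 in 4 pixels gets a real "ray"

def expensive_ray_shade(x, y):
    # stand-in for a real raytraced shading result
    return random.random()

# Pass 1: sparse, noisy ray pass (most pixels left empty)
noisy = [[None] * WIDTH for _ in range(HEIGHT)]
for y in range(HEIGHT):
    for x in range(y % SAMPLE_EVERY, WIDTH, SAMPLE_EVERY):
        noisy[y][x] = expensive_ray_shade(x, y)

# Pass 2: "denoise" by averaging whatever samples exist in a small window
def denoise(img, x, y, radius=2):
    samples = [img[j][i]
               for j in range(max(0, y - radius), min(HEIGHT, y + radius + 1))
               for i in range(max(0, x - radius), min(WIDTH, x + radius + 1))
               if img[j][i] is not None]
    return sum(samples) / len(samples) if samples else 0.0

filtered = [[denoise(noisy, x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
print(filtered[0][:4])
```

Real denoisers are obviously way smarter (temporal accumulation, edge-aware filters), but the shape of the trick is the same: pay for few rays, smear the result.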
The reflections (particularly the off-screen, non-screen-space reflections) hit hard, because you essentially have to factor in and at least partially draw shit that would normally be culled for not being visible.
From Nvidia's marketing:
"Rays are cast towards reflective surfaces visible from the player’s camera, rather than from the player’s camera across the entire scene. This greatly reduces the number of rays that have to be cast, which increases performance."
This isn't really raytracing, but raycasting plus PBR.
Raycasting is a performance hack that sort of works backwards compared to raytracing. Raytracing is essentially simulating light: starting at a light source, hitting shit in the scene, illuminating it and bouncing around, possibly bouncing away from the screen and ultimately doing a lot of work that's never seen.
Raycasting starts at the camera/eye (each pixel on the screen) and travels outward into the scene, so every bit of work is actually going to be put to use.
Raycasting scales, I think, roughly linearly with screen resolution in terms of compute; raytracing scales with (in addition to resolution) scene complexity, both the geometry and the properties of the stuff in the scene (reflective, semitransparent (subsurface scattering), etc.).
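A minimal raycasting sketch, assuming a toy scene with a single sphere and no shading at all, just to show the "start at the camera, one primary ray per pixel" direction:

```python
WIDTH, HEIGHT = 32, 16
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0

def hit_sphere(origin, direction):
    # Standard ray/sphere test: solve |o + t*d - c|^2 = r^2 for t and
    # check whether a real solution exists.
    oc = [origin[i] - SPHERE_CENTER[i] for i in range(3)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_RADIUS ** 2
    return b * b - 4 * a * c >= 0

image = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # One primary ray per pixel, fired from the camera into the scene:
        # the "start at the eye" direction described above.
        u = (x + 0.5) / WIDTH * 2 - 1
        v = (y + 0.5) / HEIGHT * 2 - 1
        row += "#" if hit_sphere((0.0, 0.0, 0.0), (u, v, 1.0)) else "."
    image.append(row)

print("\n".join(image))
```

The primary-ray count here is just WIDTH * HEIGHT, which is the resolution scaling; what each ray costs depends on how much scene it has to be tested against, which is where geometry and material complexity come back in.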
From the above, it sounds like they're going through the surfaces in the scene; for anything that's reflective, it fires rays (more or fewer depending on how reflective the surface is) from the camera toward the surface and back out into the scene, copying what they hit as the reflection.
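A hedged sketch of that reflection idea: bounce the view ray off the surface normal and see what the secondary ray hits. trace_scene() here is a hypothetical stand-in, not any real engine API.

```python
def reflect(view_dir, normal):
    # r = v - 2 * (v . n) * n   (both vectors assumed normalized)
    dot = sum(v * n for v, n in zip(view_dir, normal))
    return tuple(v - 2.0 * dot * n for v, n in zip(view_dir, normal))

def reflection_color(hit_point, view_dir, normal, trace_scene):
    bounced = reflect(view_dir, normal)
    # This secondary ray is why off-screen geometry still costs you:
    # it can point at stuff rasterization would have culled.
    return trace_scene(hit_point, bounced)

# Toy usage: a "scene" that just returns sky vs. floor based on ray direction.
def trace_scene(origin, direction):
    return (0.5, 0.7, 1.0) if direction[1] > 0 else (0.2, 0.2, 0.2)

print(reflection_color((0, 0, 0), (0.0, -0.707, 0.707), (0.0, 1.0, 0.0), trace_scene))
```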
I remember reading bits about Doom 2016; one of the engine devs mentioned using a lot of clever caching tricks to speed things up. I'm speculating, but it could be something like keeping lower-LOD information about the scene cached for quick reference, for stuff like reflections, without having to fully redraw.
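To make that speculation concrete (all names here are hypothetical, this is not id Tech's actual approach): a cache that rebuilds a cheaper, lower-LOD copy of the scene every few frames and lets reflection-style lookups hit that instead of the full-detail geometry might look something like this.

```python
class LodCache:
    def __init__(self, build_low_lod):
        self._build = build_low_lod
        self._cached = None
        self._frame_built = -1

    def get(self, frame, scene):
        # Rebuild only every few frames; reflections rarely need
        # per-frame accuracy at full detail.
        if self._cached is None or frame - self._frame_built >= 4:
            self._cached = self._build(scene)
            self._frame_built = frame
        return self._cached

# Toy usage: the "low LOD build" just keeps every 10th triangle.
cache = LodCache(lambda scene: scene[::10])
full_scene = list(range(1000))      # pretend these are triangles
for frame in range(8):
    low_lod = cache.get(frame, full_scene)
print(len(low_lod))                  # 100 "triangles" instead of 1000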
One thing I learned from playing UT3, Doom 2016, and to a lesser degree Rage (in a bad way compared to the other two): if your game is entertaining and fairly fast paced, you can hide texture pop-in/LOD streaming well enough that the user won't notice or care much, if you're smart about it.
I think Quake II RTX is essentially fully raytraced, but it's still using the noise filter to fill in the gaps; the game is low-poly enough that it's doable with a fairly high number of rays while still running and looking good.
I could be wrong, but I think the concurrent int/float execution in the RTX-series cards may wind up actually being more useful for devs, though that's not nearly as shiny as raytracing.
I know most games are going to be doing mostly floating-point calculations, but being able to do floating-point and integer work in the same cycle, if used cleverly, can be leveraged to do a lot.
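Python obviously can't demonstrate the hardware scheduling, but as an illustration of why that matters: a typical shading loop really is two interleaved streams of work, integer address/index math and floating-point lighting math, and those streams are largely independent, which is exactly the kind of thing concurrent INT/FP execution can overlap.

```python
def shade_pixel(x, y, texture, width, lights):
    # --- integer side: texture addressing / index math ---
    texel_index = (y % width) * width + (x % width)   # int mul/add/mod
    albedo = texture[texel_index]                      # int-indexed fetch

    # --- float side: lighting math ---
    color = 0.0
    for intensity, attenuation in lights:              # float mul/add
        color += albedo * intensity * attenuation
    return color

# Toy usage with a fake 4x4 texture and two lights.
texture = [0.5] * 16
print(shade_pixel(5, 3, texture, 4, [(1.0, 0.8), (0.5, 0.4)]))
```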
It's cool that AMD is contributing to open standards (FreeSync, OpenCL, etc.) instead of Nvidia-branded shit tied to licensing fees, but they probably need to start putting together some sort of initiative with game devs. Given that console games are ultimately the bread and butter of most big game companies and eventually get ported to PC, AMD should have a good working relationship with the devs (especially since the consoles are mostly running AMD hardware), but it still seems like, in the PC market, Nvidia is targeting developers better.
I know Nvidia has done shit with the demoscene and regularly gives presentations/talks on new graphics and rendering techniques. I'm sure AMD has done some of this as well, but in my eyes they've not been as visible about it.
AMD seems to just be making graphics hardware; Nvidia makes graphics hardware but seems to put just as much into pushing graphics software forward.