GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The only other games I have that use ray tracing are Shadow of the Tomb Raider and Amid Evil (an EXCELLENT boomer shooter btw), and in the former it doesn't make all that big a difference, but in the latter it really does pop with the low-poly art style.
It's funny how that works. In my experience, slightly more realistic lighting and reflections do very little for scenes in more realistic art styles, probably because they are not enough to compensate for all the other flaws and imperfections that make your brain cry: "This is not real!" But they have a huge effect in scenes with simple, more abstract art styles and can make them seem much more real and believable.
 
Epic recently rolled out native support for DLSS in UE, and Unity will probably do the same. Developers that build their own engine from scratch know enough to implement it themselves; they have skilled people, since making an engine is hard.
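As a rough illustration of what implementing it themselves tends to involve, here's a minimal sketch of an engine hiding the temporal upscaler behind its own interface so a DLSS-backed path can sit next to a generic fallback. Every type and function name below is hypothetical, not NVIDIA's or any real engine's API, and the actual SDK calls are omitted:

#include <memory>

// Hypothetical engine-side abstraction: the renderer only talks to this
// interface, so a vendor-specific upscaler can be swapped in per GPU.
struct UpscaleInput {
    int   renderWidth,  renderHeight;   // internal (lower) resolution
    int   outputWidth,  outputHeight;   // display resolution
    void* colorTexture;                 // placeholder GPU resource handles
    void* motionVectors;
    void* depthBuffer;
};

class ITemporalUpscaler {
public:
    virtual ~ITemporalUpscaler() = default;
    virtual void Evaluate(const UpscaleInput& input, void* outputTexture) = 0;
};

// A DLSS-backed implementation would forward the inputs to the vendor SDK;
// here it is just a stub so the sketch stays self-contained.
class DlssUpscaler final : public ITemporalUpscaler {
public:
    void Evaluate(const UpscaleInput& input, void* outputTexture) override {
        (void)input; (void)outputTexture;   // vendor SDK calls omitted
    }
};

// The render loop picks an implementation once, based on detected hardware.
std::unique_ptr<ITemporalUpscaler> CreateUpscaler(bool hasDlssCapableGpu) {
    if (hasDlssCapableGpu)
        return std::make_unique<DlssUpscaler>();
    return nullptr;   // or a generic spatial upscaler for everything else
}

The fiddly part in practice is feeding it good motion vectors, depth, and per-frame jitter, which is exactly the plumbing that Epic shipping it as a built-in saves smaller teams from doing.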

Welp, if true, AMD really need to stop fucking about and get the ball into the open goal Nvidia have left for them. Why they can't just delay the Radeon 6700 XT to build up stock levels, given that Nvidia is having difficulty with its yields, I don't know. That, and they need to get their answer to DLSS, FidelityFX Super Resolution, into the next round of driver updates.
 
Shit, I was going to post that.
The editorial department was able to confirm, through contact with related parties, that the restriction can be circumvented. The specific method will not be described here, but it does not require modifying the driver or BIOS, and anyone can do it easily with a little effort.

— PC Watch
Jesus christ, that's a massive oopsie. It almost feels like those old "this is for homebrew, not piracy" flash carts. Or advanced tobacco smoking rigs (bongs), or OCB papers for the people who like to roll long cigarettes with rolled-up cardboard filters.
 
Welp, if true, AMD really need to stop fucking about and get the ball into the open goal Nvidia have left for them. Why they can't just delay the Radeon 6700 XT to build up stock levels, given that Nvidia is having difficulty with its yields, I don't know. That, and they need to get their answer to DLSS, FidelityFX Super Resolution, into the next round of driver updates.
Honest question: I've always been an AMD guy, but is there really any reason I should be looking elsewhere, beyond the fact that supply is so limited right now and if you can get something you should take it? I've been eyeing up 5700s and 6700s (emphasis on eye, because lol good luck finding any), but aside from this thread I haven't paid much attention to the AMD vs NVIDIA vs INTEL conversation.
 
Honest question: I've always been an AMD guy, but is there really any reason I should be looking elsewhere, beyond the fact that supply is so limited right now and if you can get something you should take it? I've been eyeing up 5700s and 6700s (emphasis on eye, because lol good luck finding any), but aside from this thread I haven't paid much attention to the AMD vs NVIDIA vs INTEL conversation.
They all will cost a stupid amount of money. If you really must, I'd just grab whatever you can get your hands on. It's a sellers' market, so you don't really get to be picky unless you buy from a scalper.
 
  • Agree
Reactions: Allakazam223
It's funny how that works. In my experience, slightly more realistic lighting and reflections do very little for scenes in more realistic art styles, probably because they are not enough to compensate for all the other flaws and imperfections that make your brain cry: "This is not real!" But they have a huge effect in scenes with simple, more abstract art styles and can make them seem much more real and believable.
The current form of ray tracing is nothing but a feature add-on to justify price increases and to reel in consoomers.
 
Honest question: I've always been an AMD guy, but is there really any reason I should be looking elsewhere, beyond the fact that supply is so limited right now and if you can get something you should take it? I've been eyeing up 5700s and 6700s (emphasis on eye, because lol good luck finding any), but aside from this thread I haven't paid much attention to the AMD vs NVIDIA vs INTEL conversation.

In terms of gaming performance I don't think there's any reason to choose Nvidia over AMD, and in fact I'd say AMD wins at both power-performance and cost-performance ratios. Not that anyone spending £500+ on a GPU probably cares about power consumption! Where Nvidia has clear gains is the software ecosystem outside of gaming: nice software support for camera background removal, audio studio stuff, machine learning, and certain rendering technology. I do actually think ray tracing looks nice, but I don't think it's worth the current performance hit you get for it. So the answer, imho, is "no" if you're a gamer, "a bit" if you care about other things, and "yes" if you want to use certain rendering engines and similar. But if the last one were true for you, you'd likely already know it.
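To make the ecosystem point concrete: most ML and GPU-compute tooling assumes CUDA, so the first thing those workloads do is look for an NVIDIA device. A minimal probe using the CUDA runtime API (the calls are real CUDA runtime functions, but the program is just an illustration, not anything a specific application ships):

#include <cstdio>
#include <cuda_runtime.h>

// Minimal CUDA capability probe: counts devices and prints basic properties.
// Build with nvcc, e.g.:  nvcc -o gpu_probe gpu_probe.cu
int main() {
    int deviceCount = 0;
    cudaError_t err = cudaGetDeviceCount(&deviceCount);
    if (err != cudaSuccess || deviceCount == 0) {
        std::printf("No CUDA-capable GPU found: %s\n", cudaGetErrorString(err));
        return 1;   // ML frameworks typically fall back to the CPU here
    }
    for (int i = 0; i < deviceCount; ++i) {
        cudaDeviceProp prop{};
        cudaGetDeviceProperties(&prop, i);
        std::printf("GPU %d: %s, %.1f GiB VRAM, compute capability %d.%d\n",
                    i, prop.name,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
                    prop.major, prop.minor);
    }
    return 0;
}

AMD's ROCm stack covers some of the same ground, but library support has historically been narrower, which is most of what the "a bit" / "yes" answers above are pointing at.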

So anyway, how's stock on the 6700 looking? AMD were claiming you'd actually be able to buy this one.
 
  • Informative
Reactions: Brain Problems
Even if they were available, I'm priced out of them now. I mean, yes - I could afford one. I could afford a 3090 if I wanted to. But a grand+ on what is, at the end of the day, a toy? Just no. If I'm looking to spend that much on leisure time, I could buy 150 books I've never read. Buy second-hand? 500 books. That's thousands of hours' worth of leisure time, compared to what it costs me just to get to the position where I'm able to start spending money on PC games. And okay, the 3090 is a hyperbolic example, but with base costs starting at around £500 the point is still valid for me at that price. It's just a lot of money for some occasional PC gaming to relax.

At the moment I'm either sitting this generation out and seeing what RDNA3 is like or I'll get this generation but a year from now when I can get them cheap second-hand if that ever happens (it hasn't this time).
 
Most of all, most AAAAAA whatever games which actually take advantage of the features of these cards are terminally boring at best or pure broken sewage at worst, and dropping a grand on what amounts to playing one specific game is nuts; I'm too old for that crap. I noticed things were kinda fucked up when I stayed with the R9 390 I used to own for way too long and barely ever used the performance it had to offer. I now have one of the OEM Pro Ryzen APUs that were available for five minutes in November or October or so, and it runs most of the indie and older games worth playing just fine. I could see myself sticking to APUs in the future; if this thing had just a smidgen more graphics performance, it'd cover the bit of stuff that's not quite playable yet too. Wonder what this will do to the PC gaming market in the long term.
 
Wonder what this will do to the PC gaming market in the long term.
It completely blocks adoption of the new features they offer. They can build console games where RT is a core feature, but they can't require RT on PC for the port if people can't buy cards that support it. RT and other stuff will be ported down from consoles to PC, and I suspect the situation will be similar to the early 2000s, where PCs initially didn't support what consoles could do, so ports had a hacked-together renderer that ditched or scaled back some graphical features.

Bioshock (not related to anything I wrote above) had a cool moment where a shadow swept over a wall but you couldn't really tell what it was, so you carefully made your way around the corner... let's imagine that someone does a similar thing now with RT reflections on a spinning mirror hanging from the ceiling and you catch a glimpse of something; it's a nice little touch that adds to the game. That's not going to work on most PCs, so they have to either ditch it or hack together a rasterizer solution with, idk, POM cubemaps and then render the critter into that or something. It will look a bit jank. Now add 100 different features and little touches in the game that they have to re-work into something crappier on PC and it looks pretty dire.

In the early 2000s, feature parity in hardware was quickly achieved. It will probably take way longer this time.
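For what it's worth, the PC-side gate being described boils down to a capability check at startup plus two render paths. A minimal sketch using the D3D12 feature query (the D3D12 calls are real; the two reflection functions and the overall structure are hypothetical stand-ins for whatever a real engine does):

#include <cstdio>
#include <windows.h>
#include <d3d12.h>
// Link against d3d12.lib, e.g.:  cl /EHsc rt_check.cpp d3d12.lib

// Real D3D12 query: does this GPU/driver expose hardware ray tracing (DXR)?
bool SupportsHardwareRT(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Hypothetical engine hooks: one real-time RT path, one cubemap fallback.
// The fallback is exactly the "hack together a rasterizer solution" case:
// cheaper, but dynamic objects (the critter) need extra work to show up
// in the cubemap at all.
void RenderMirrorReflectionsRT()      { std::puts("RT reflections path"); }
void RenderMirrorReflectionsCubemap() { std::puts("cubemap fallback path"); }

void RenderReflections(ID3D12Device* device) {
    if (SupportsHardwareRT(device))
        RenderMirrorReflectionsRT();
    else
        RenderMirrorReflectionsCubemap();
}

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;                       // no D3D12 device at all
    RenderReflections(device);
    device->Release();
    return 0;
}

Multiply that one branch by the hundred little touches mentioned above and you get the porting overhead being described.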
 
  • Informative
Reactions: Brain Problems
It completely blocks adoption of the new features they offer. They can build console games where RT is a core feature, but they can't require RT on PC for the port if people can't buy cards that support it. RT and other stuff will be ported down from consoles to PC, and I suspect the situation will be similar to the early 2000s, where PCs initially didn't support what consoles could do, so ports had a hacked-together renderer that ditched or scaled back some graphical features.

Bioshock (not related to anything I wrote above) had a cool moment where a shadow swept over a wall but you couldn't really tell what it was, so you carefully made your way around the corner... let's imagine that someone does a similar thing now with RT reflections on a spinning mirror hanging from the ceiling and you catch a glimpse of something; it's a nice little touch that adds to the game. That's not going to work on most PCs, so they have to either ditch it or hack together a rasterizer solution with, idk, POM cubemaps and then render the critter into that or something. It will look a bit jank. Now add 100 different features and little touches in the game that they have to re-work into something crappier on PC and it looks pretty dire.

In the early 2000s, feature parity in hardware was quickly achieved. It will probably take way longer this time.

Except aren't the PS5 and Xbox Series X also being hit by scalpers and shortages?
 
And all console games (for PlayStation and Xbox) still have to be backwards compatible with the 8th-generation consoles, which is why I don't expect any technical innovations from console games for the foreseeable future. Never mind that all first-party Xbox games now also have to be compatible with Windows 10.
 
We'll look back at Ryzen like we do Sandy Bridge. Hopefully AMD continues to innovate chip design and doesn't pull an Intel. On a related note, I picked up a new M1 Mac Mini and I fucking love it. It's my main multimedia machine now and it's way overkill, but it's just so cool. If you're going to do an SoC, this is the perfect price point and target performance. It's powerful enough to be a borderline video/graphics machine, but it's cheap enough that if it breaks, eh, you're not out $2500 like with a new MacBook Pro. I'm curious to see what programmers can do with the M1 in the upcoming years. I've been an ARM (and Apple) skeptic, but this chip delivers. The memory management is especially impressive.
 
We'll look back at Ryzen like we do Sandy Bridge. Hopefully AMD continues to innovate chip design and doesn't pull an Intel. On a related note, I picked up a new M1 Mac Mini and I fucking love it. It's my main multimedia machine now and it's way overkill, but it's just so cool. If you're going to do an SoC, this is the perfect price point and target performance. It's powerful enough to be a borderline video/graphics machine, but it's cheap enough that if it breaks, eh, you're not out $2500 like with a new MacBook Pro. I'm curious to see what programmers can do with the M1 in the upcoming years. I've been an ARM (and Apple) skeptic, but this chip delivers. The memory management is especially impressive.
Eh, M1 seems great as long as you're willing to play inside of its walled garden. Or if you can tolerate Apple.
 
We'll look back at Ryzen like we do Sandy Bridge. Hopefully AMD continues to innovate chip design and doesn't pull an Intel. On a related note, I picked up a new M1 Mac Mini and I fucking love it. It's my main multimedia machine now and it's way overkill, but it's just so cool. If you're going to do an SoC, this is the perfect price point and target performance. It's powerful enough to be a borderline video/graphics machine, but it's cheap enough that if it breaks, eh, you're not out $2500 like with a new MacBook Pro. I'm curious to see what programmers can do with the M1 in the upcoming years. I've been an ARM (and Apple) skeptic, but this chip delivers. The memory management is especially impressive.
500 Apple Points™ have been added to your Apple Account™ goyim, use them wisely
 
We'll look back at Ryzen like we do Sandy Bridge. Hopefully AMD continues to innovate chip design and doesn't pull an Intel. On a related note, I picked up a new M1 Mac Mini and I fucking love it. It's my main multimedia machine now and it's way overkill, but it's just so cool. If you're going to do an SoC, this is the perfect price point and target performance. It's powerful enough to be a borderline video/graphics machine, but it's cheap enough that if it breaks, eh, you're not out $2500 like with a new MacBook Pro. I'm curious to see what programmers can do with the M1 in the upcoming years. I've been an ARM (and Apple) skeptic, but this chip delivers. The memory management is especially impressive.
Base model is 700 dollars.
 
  • Agree
Reactions: Dick Justice
Base model is 700 dollars.
I'm looking at broadly similar mini PCs like the Asus PN50, which, before you spend the extra money to add RAM and storage (which, you know, come with the Mac mini), comes in at $150 or so less at best.

The M1 is faster in single and multi-threaded CPU tasks. And it destroys AMD and Intel integrated GPU options.

And... and I know this is an alien notion to people who want to buy a fast Ryzen or any sort of modern GPU... it is available to buy. At RRP.

EDIT: I will be the first in line to throw Tim Cook off a tall building, but Apple has some good engineers, and they absolutely killed it on the M1. I am really interested to see how they handle allowing for off-package memory (I bet they can stretch to 32GB from 16GB in a bigger footprint, but who knows what that would do to yields; although the CPU is not actually in the same chip as the memory in the M1, it's just soldered onto the same package) and external GPUs in the successors. I think they probably need to do both to allow for certain AV-related tasks; the only question is what it will do to memory access speed, which is obviously the source of a lot of their really stellar metrics from the M1.
 