GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Oh, I mentioned that. It's available on console(s), which is a huge boon for AMD and anything on PC that touches console(s). AMD's RT isn't as strong at this point, but as expected they've made considerable gains now that games that are also on consoles are using their hardware implementation. Nvidia was the only game in town for a while.

Unreal Engine 5 reportedly performs way better with raytracing on AMD hardware because it defaults to a software/shader mode (ultimately rasterization), presumably so they don't exclude the vast majority of people who don't have RT cards, because that would be dumb. This is the nature of PC graphics cards: new hardware features need to see new hardware adoption, and as of now it's not the time to make RT mandatory.
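If anyone wants to poke at that split themselves, it's exposed through UE5 console variables. A hedged sketch (cvar names recalled from Epic's early-access docs, so verify against current documentation before relying on them):

```ini
; ConsoleVariables.ini -- illustrative only, double-check names against UE5 docs
r.DynamicGlobalIlluminationMethod=1   ; 1 = Lumen global illumination
r.ReflectionMethod=1                  ; 1 = Lumen reflections
r.Lumen.HardwareRayTracing=0          ; 0 = the software/shader tracing path
```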

Ray tracing is a meme anyhow.

Shadow of the Tomb Raider: "Oh look, the shadows look slightly less blocky."

Cyberpunk 2077: "Oh look, now I can see semi-reflected lights in windows and the pavement looks wet after rain."

The only game I have which really pops with RTX is the boomer shooter Amid Evil, where the low-poly design helps it stand out more. Incidentally, I recommend Amid Evil. It plays an awful lot like the original Quake (the difficulty and level select are their own stage and the highest difficulty is hidden) but with the esoteric setting style of Heretic, plus a "super mode" where your weapons do different things, like the Tome of Power in that game.
 
Nothing is a thing until it becomes common. That's the long and short of graphics card history, so don't dismiss raytracing yet. Getting a card only because it can do raytracing is stupid though; we're not at that point yet.
Don't dismiss raytracing until it's prominently featured on a Samsung AMD Exynos smartphone.
 
I know about the insane amount of money mobile games make, and the huge growth and importance of that market for vidya companies, and I'm still going to call raytracing on a phone retarded.

Name a pay to win mobile game that is actually good, as a game. I can't.

The closest I'll go is The Pinball Arcade, which is not pay to win but free to play: you pay if you want to go beyond a certain score limit, whereupon you get unlimited play on that machine forever. And that's also on PC with the same model.

Unfortunately all the WMS machines were taken off sale because the licence ended. I'm still annoyed about that because I didn't subscribe to Fathom in time, and that's alongside Xenon and Gorgar and Centaur as the greatest early solid states.

Speaking of which, pinball machines are the only valid use of RGB LEDs. Prove me wrong.
 
I know about the insane amount of money mobile games make, and the huge growth and importance of that market for vidya companies, and I'm still going to call raytracing on a phone retarded.
RDNA2 will come to low-power Rembrandt (12 CUs) and ultra-low-power Van Gogh or Dragon Crest (8 CUs?). These should have 1 ray accelerator per CU, the same as discrete cards. The RX 6900 XT has 80 of them.

For the Exynos, they will customize it, and maybe add in some RDNA3 features. I think Samsung's 5LPE is similar to TSMC's N7, so not much difference there. The raytracing capability could be limited to small parts of a scene, like puddles. Samsung will probably have to sponsor select titles since mobile devs otherwise wouldn't care. It could also support FSR to upscale for more performance.

Eventually, all new smartphones will support raytracing just as all PC games will move away from rasterization. If you can't see the benefits of raytracing on small screen sizes, you could dock the phone with a TV or monitor. And that's the end of my defense of this "feature". The real selling point would be AMD graphics beating Mali graphics.
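For context on what those per-CU ray accelerators actually do: the fixed-function part is mostly BVH traversal plus ray/box and ray/triangle intersection tests. A minimal software sketch of the triangle test (Möller–Trumbore, plain Python, illustrative only; real hardware batches these across a BVH):

```python
# Minimal Möller-Trumbore ray/triangle intersection -- the kernel a per-CU
# "ray accelerator" implements in fixed-function hardware. Illustrative only.

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-8):
    """Return hit distance t along the ray, or None on a miss."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]]
    def dot(a, b): return sum(a[i]*b[i] for i in range(3))

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    t_vec = sub(orig, v0)
    u = dot(t_vec, p) * inv         # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(t_vec, e1)
    v = dot(direction, q) * inv     # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv            # distance along the ray
    return t if t > eps else None

# A ray fired down +z from the origin hits a triangle sitting at z=5:
hit = ray_triangle([0, 0, 0], [0, 0, 1],
                   [-1, -1, 5], [1, -1, 5], [0, 1, 5])  # -> 5.0
```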
 
Name a pay to win mobile game that is actually good, as a game. I can't.
Fire Emblem Heroes was pretty good for the first year or two

Speaking of which, pinball machines are the only valid use of RGB LEDs. Prove me wrong.
It's cheaper and easier than ever to add lighting underneath cabinets now, and you can set RGB LEDs to just be white
 
Ray tracing is a meme anyhow.

Shadow of the Tomb Raider: "Oh look, the shadows look slightly less blocky."

Cyberpunk 2077: "Oh look, now I can see semi-reflected lights in windows and the pavement looks wet after rain."

The only game I have which really pops with RTX is the boomer shooter Amid Evil, where the low-poly design helps it stand out more. Incidentally, I recommend Amid Evil. It plays an awful lot like the original Quake (the difficulty and level select are their own stage and the highest difficulty is hidden) but with the esoteric setting style of Heretic, plus a "super mode" where your weapons do different things, like the Tome of Power in that game.

Minecraft looks fairly impressive with RTX too, again because it's being compared to a non-existent lighting system in the first place.

We've gotten so good at faking lighting that RTX is barely an improvement and certainly not worth the performance cost.

I think the biggest benefit might be saving developers time once it becomes standard.
 
Here's a pretty cool thing: a custom board combining a Voodoo3 3500 with a PowerVR PCX2. Why? idk, but it's neat.
voodoopowervr.jpg
Link to the unboxing video, because that's all there is right now. Don't click it, it's an unboxing video. There's a jumper to run the PCX2 at 80 MHz instead of 66, now you know.

Here is an absolutely atrocious Tom's Hardware article about it. I swear it must have been written by a bot.

What made the Voodoo 3 such a great card at the time was its excellent performance that was on par with every other top performing graphics card in 1999 and great compatibility with a ton of games on the market as well. Other graphics cards competitors did have some advantages like HD texture capabilities, but overall the Voodoo 3 was a great all-around card that did everything you wanted it to do.

The 2nd chip is an older PowerVR PCX2 chip that was originally made in 1997. This card was designed primarily as a 3D-only accelerator board and would have to be used in conjunction with a 2D video card in your system. Being 2 years older, the PCX2 is significantly slower than the Voodoo 3, but its age will allow the card to play games that were made in 1996 and 1997 like the original Tomb Raider.

This card is definitely not for the masses, but one for gamers who love to play titles from the mid-to-late 90s, like Quake, Wing Commander, and Tomb Raider. These games are so old that they will not work at all on any type of modern hardware, which is where cards like these come into play.
What the fuck. ...Yeah, the game is from '97, so you'd want to use PowerVR SGL because it might not support 3dfx Glide.
 

According to the rumor mill, there will be 128-core Zen 4 Epyc, as well as a 12nm backport of Zen 3 for budget Athlon chips. That would be on GlobalFoundries' improved 12LP+ process and have 4 cores with a tiny amount of RDNA2 graphics cores. This could be the last gasp for AMD at GlobalFoundries unless they come up with better nodes, and it could make for some great budget laptops. Maybe they can get the same IPC increases on this node but at lower clock speeds, which is fine if it's in $100 products.
 
I might do up a Threadripper based system for comparison but my feeling is you'd be adding on $500 and I don't feel you'd lose too much based on my reading the requirements from UE.
Even a low-end 3955WX Threadripper is about $700 CAD more than a 5800X, not including the mobo, ECC RAM, or a specialized Threadripper CPU cooler. A 5950X @ 16c/32t could be a better choice: same core and thread count, an AM4 mobo instead of a Threadripper board, and cheaper than the 3955WX.

I would put more than a Hyper 212 on anything used for production though; probably get a big case to fit an NH-D15. Better to run a larger heatsink and fan slower and cooler. You do not want thermal throttling while trying to render.

AFAIK the SN850 has been getting really bad PR lately due to data getting deleted. Check out the Samsung 980 Pro for around the same $/GB.
 
Even a low-end 3955WX Threadripper is about $700 CAD more than a 5800X, not including the mobo, ECC RAM, or a specialized Threadripper CPU cooler. A 5950X @ 16c/32t could be a better choice: same core and thread count, an AM4 mobo instead of a Threadripper board, and cheaper than the 3955WX.

I would put more than a Hyper 212 on anything used for production though; probably get a big case to fit an NH-D15. Better to run a larger heatsink and fan slower and cooler. You do not want thermal throttling while trying to render.

AFAIK the SN850 has been getting really bad PR lately due to data getting deleted. Check out the Samsung 980 Pro for around the same $/GB.

We haven't heard back from @SeniorFuckFace on this for a while. My guess is he's put it on hold until GPU prices get a bit saner, which I don't think anyone would blame him for. They're insane, and if you hadn't been following GPU news you wouldn't know what you were getting yourself into. I think we specced up something decent all things considered, but GPUs cost a lot right now.
 
We haven't heard back from @SeniorFuckFace on this for a while. My guess is he's put it on hold until GPU prices get a bit saner, which I don't think anyone would blame him for. They're insane, and if you hadn't been following GPU news you wouldn't know what you were getting yourself into. I think we specced up something decent all things considered, but GPUs cost a lot right now.
I called in a favor and bought me a system instead... :)

The deal was a miracle and thank you for all your help.

20210704_121602.jpg
 
I called in a favor and bought me a system instead... :)

The deal was a miracle and thank you for all your help.


Some favour! Got to say I like the chan-style username proof.

I hope it didn't break the bank. Looks like a respectably beefy system. I expect, based on the description of its intended user, another HDD will get thrown in there someday.

Glad we could help and glad you got something that can do the job in the end.
 
Some favour! Got to say I like the chan-style username proof.

I hope it didn't break the bank. Looks like a respectably beefy system. I expect, based on the description of its intended user, another HDD will get thrown in there someday.

Glad we could help and glad you got something that can do the job in the end.
Yes, ports galore, storage is good for now, and $3k total for the system.

I was told the GPU costs more than that... crazy.
 

According to the rumor mill, there will be 128-core Zen 4 Epyc, as well as a 12nm backport of Zen 3 for budget Athlon chips. That would be on GlobalFoundries' improved 12LP+ process and have 4 cores with a tiny amount of RDNA2 graphics cores. This could be the last gasp for AMD at GlobalFoundries unless they come up with better nodes, and it could make for some great budget laptops. Maybe they can get the same IPC increases on this node but at lower clock speeds, which is fine if it's in $100 products.
They could really use the extra business. GlobalFoundries is in hot water with IBM now: IBM paid GF to take over and upgrade its fabs in Fishkill and elsewhere, IIRC, and very recently filed a lawsuit over GF's failed roadmap promises AND its selling off of plants it was given instead of maintaining them.

It was a noble effort of IBM to try to make GF a TSMC alternative.
 

Fake 16-core beating real 16-core in multi-threaded? Big if true. But it will run hot, and the performance will depend on whether it's paired with DDR5 or DDR4. It supports both, but the cheaper systems are going to be on DDR4.

AMD will probably respond within a couple months by using 3D V-Cache.
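The DDR4-vs-DDR5 dependence is easy to ballpark from peak theoretical bandwidth (back-of-envelope only; real-world gains are smaller and latency matters too):

```python
# Peak bandwidth ~= transfers/s x bytes per transfer (64-bit bus = 8 B) x channels.
def peak_bw_gb_s(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

ddr4 = peak_bw_gb_s(3200)  # dual-channel DDR4-3200 -> 51.2 GB/s
ddr5 = peak_bw_gb_s(4800)  # dual-channel DDR5-4800 -> 76.8 GB/s
```

So launch-spec DDR5 has roughly 50% more peak bandwidth than common DDR4, which is why the multi-threaded numbers will swing on which platform reviewers test.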
 
There are shades of old FurMark in Amazon's MMO allegedly frying 3090s. "Players are theorizing that there's not an FPS cap on the menu screens, causing GPUs to render 9000+ FPS."
amazonmmo.JPG

amazonmmo1.JPG
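For anyone wondering what the missing fix looks like, a frame limiter is a few lines: render, measure, sleep out the rest of the frame budget. A minimal sketch (the `render` callback is hypothetical, plain Python standing in for a game loop):

```python
import time

def frame_limited_loop(render, fps_cap=60, frames=10):
    """Call `render` at most `fps_cap` times per second by sleeping out
    the remainder of each frame's time budget. An uncapped menu loop
    skips the sleep and just spins, pinning the GPU at full load."""
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render()                      # hypothetical draw call
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)

# 20 trivial frames capped at 100 FPS should take roughly 0.2 s of wall time,
# instead of finishing in microseconds like an uncapped spin would:
t0 = time.perf_counter()
frame_limited_loop(lambda: None, fps_cap=100, frames=20)
wall = time.perf_counter() - t0
```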

 