GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The Caveman just shows the cards and tells the prices in the first five seconds of the video while the AMD presentation keeps moping around about it lmfao

I'm gonna try not being negative about AMD cards this time. TH puts the 7900 GRE raw performance about even with the 4070 Super, which currently sells for about $1000. If the 9070 is 20% better than that, that would put it close to the 7900 XT, which currently sells for upwards of $1200. It looks like FSR 4 will finally be a usable solution, so at $599, these things are going to sell out immediately. Actually finding one at $599 will be like finding a B580 at $249 (though again, the B580's real issue seems to be Ryzen 5000's PCIe chipset).

"Neural rendering ready" suggests there are already games in development, and they got the memo from the publishers that they'd better have a solution soon or get left completely in the dust.
 
I'm gonna try not being negative about AMD cards this time. TH puts the 7900 GRE raw performance about even with the 4070 Super, which currently sells for about $1000. If the 9070 is 20% better than that, that would put it close to the 7900 XT, which currently sells for upwards of $1200. It looks like FSR 4 will finally be a usable solution, so at $599, these things are going to sell out immediately. Actually finding one at $599 will be like finding a B580 at $249 (though again, the B580's real issue seems to be Ryzen 5000's PCIe chipset).

"Neural rendering ready" suggests there are already games in development, and they got the memo from the publishers that they'd better have a solution soon or get left completely in the dust.
It seems that AMD finally fixed the failure mode where they tried using generic shader units for upscaling and RT, and instead focused hard on the AI and RT features by putting in actual dedicated hardware.
 
Did Intel stop producing the B580 already? It's out of stock everywhere.
Every unit sold loses them money. Source: I made it the fuck up.

AMD Strikes Back at NVIDIA’s RTX 5070 Ti, Says Radeon RX 9070 XT Offers Same Native & RT Performance For $150 Less

Many of the percentages AMD has been using have combined raster/RT to make it look better. Here they are actually comparing to the 5070 Ti, claiming -2% average 4K performance across "30+ games", or +2% for overclocked models (which should mostly use a 340W TDP, 12% higher power than the 304W reference number).

[Attached image: AMD Radeon RX 9070 XT vs NVIDIA GeForce RTX 5070 Ti comparison slide]
 
It seems that AMD finally fixed the failure mode where they tried using generic shader units for upscaling and RT, and instead focused hard on the AI and RT features by putting in actual dedicated hardware.
Intel allows you to use the generic vector units on AMD & older NVIDIA cards for upscaling with XeSS, and it looks much better than FSR. The problem with FSR is that it's not inference based; it's heuristic based. I could write up a detailed post about this, and may later. The main thing is that inference-based upscaling incorporates information computed offline and embedded in a model. Heuristic-based upscaling uses only the immediately available information and uses what amount to "rules of thumb" to guess what the missing information is. The latter is intrinsically prone to high-frequency aliasing, causing sparkle and shimmer, and is completely unable to reconstruct missing detail, which is why text looks terrible.
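The aliasing point is easy to demonstrate with a toy 1D sketch (this is not FSR/XeSS/DLSS code, and the signal and interpolation are made up purely for illustration): once a high-frequency pattern is sampled below its frequency, no rule-of-thumb interpolation can recover it, which is why heuristic upscalers shimmer on fine detail like text.

```python
# Toy demonstration of why heuristic (rule-of-thumb) upscaling can't
# reconstruct missing high-frequency detail. Not any vendor's algorithm;
# just linear interpolation on a synthetic 1D signal.

# High-frequency source: alternating 0/1 pattern (think thin text strokes).
full = [i % 2 for i in range(16)]          # [0, 1, 0, 1, ...]

# Render at half resolution: keep every other sample (what the game actually draws).
low = full[::2]                            # every kept sample lands on a 0

# Heuristic upscale: linear interpolation between the samples we have.
up = []
for i in range(len(low) - 1):
    up.append(low[i])
    up.append((low[i] + low[i + 1]) / 2)   # "rule of thumb": average the neighbors
up.append(low[-1])

print(low)  # [0, 0, 0, 0, 0, 0, 0, 0] -- the alternating detail aliased away entirely
print(up)   # all zeros: no heuristic can bring the 0/1 pattern back from this
```

An inference-based upscaler can do better here only because its model carries information computed offline (learned priors about what edges and glyphs look like), which is exactly the distinction above.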

Did Intel stop producing the B580 already? It's out of stock everywhere.

The moment one becomes available, it's bought by Chinese scalpers who turn around and sell it for $375-$400 on Newegg. Sparkle's about to launch a new variant, though.


Maybe you can grab one for the ten minutes it will be in stock.
 
Did Intel stop producing the B580 already? It's out of stock everywhere.
The rumor is that they only made it because they'd prepurchased the capacity from TSMC and had to get something out the door to help their image. If you look at the size of the dies, the process they're on, and what AMD has been charging for similarly-sized dies with the same process node on their cards, then it seems very likely that Intel is either outright losing money on the card or maybe breaking even on production costs.

We don't know if we're gonna get more Battlemage stock but Intel itself has already moved on to Celestial in hopes of 18A giving them a node advantage over TSMC and therefore letting them actually compete with AMD and Nvidia without needing to set money on fire.
 
Don't know if these are just rumors, but apparently AMD has been building stock for the last two months, so it's not a "paper launch" like NVIDIA's. That did sound like bullshit, but Micro Center literally texting me out of the blue to say it releases next week makes me feel like it's not. Micro Center hasn't done this once in the decade they've had my number on file either, so this is super insane.
 
Don't know if these are just rumors, but apparently AMD has been building stock for the last two months, so it's not a "paper launch" like NVIDIA's. That did sound like bullshit, but Micro Center literally texting me out of the blue to say it releases next week makes me feel like it's not. Micro Center hasn't done this once in the decade they've had my number on file either, so this is super insane.
I'm ready to upgrade my GPU with whatever the next sub-200W thing is that has AI upscaling and generally performs at least as well as this one on raw raster.
 
Prices are good. People will still bitch and moan that AMD isn't giving them a 5090 for $400 though so what else is new.
I think reception to the 9070 XT price has been pretty good aside from a few that want to take a trip back to 2015. The 9070 non-XT price is AMD's typical 1D checkers move. Street pricing is what actually matters but the boxes have been collecting dust in Best Buys, Micro Centers and warehouses.
I'm considering going from a 3060 Ti to RX 9000. Would it be worth it? Do I take the red pill? Or maybe a Sapphire?
Over double the performance, double the VRAM, and hope that FSR4 takes off.
 
I'm considering going from a 3060 Ti to RX 9000. Would it be worth it? Do I take the red pill? Or maybe a Sapphire?
Yeah, it'd probably be a considerable improvement, aside from using ~30% more power. I wouldn't buy the non-XT though if you can afford the XT; it just looks like a bad deal.
 
and hope that FSR4 takes off.
I don't use AI slop much or frame gen. I want and expect a stable, true 1080p 60fps experience *without temporal antialiasing solutions if possible*. I just wish developers optimized better. We have some shit like MGSV that's smooth as fucking butter in 2015 even on shit brick computers, and then we have Unreal Engine devs who don't know shit and choke even with top-of-the-line hardware. Reliance on frame gen and image upscaling is a blight on the PC master race. If 1080p looks like a blurry mess just to maintain a stable 60fps, what's the point?
Aside from using ~30% more power.
Thanks for telling me. I'm not getting it then. If I want a heater I'll just buy a 50 dollar one. If I want a power-draining, bottlenecking mess I'd just play modern unoptimized GPU/CPU-reliant vidya that'll lobotomize my computer.
My 3060 Ti already makes a room toasty.
 
Should Steve "Notorious Caveman of North Carolina" Burke get a haircut?
The fact that he's kept it so long as a married man means his wife is probably into it so I'm gonna say no.

I don't use AI slop much or frame gen. I want and expect a stable, true 1080p 60fps experience *without temporal antialiasing solutions if possible*. I just wish developers optimized better. We have some shit like MGSV that's smooth as fucking butter in 2015 even on shit brick computers, and then we have Unreal Engine devs who don't know shit and choke even with top-of-the-line hardware. Reliance on frame gen and image upscaling is a blight on the PC master race. If 1080p looks like a blurry mess just to maintain a stable 60fps, what's the point?
The 9070/XT (either really) are going to be overkill for 1080p for the most part. You're going to be getting 120+ fps easily unless you have a ludicrously weak CPU or you're playing some incredibly poorly optimized game like Alan Wake or Monster Hunter Wilds.

You should still probably be interested in FSR4 though, because if it has an equivalent to DLAA then that'll be way better than TAA at 1080p.
 
The fact that he's kept it so long as a married man means his wife is probably into it so I'm gonna say no.


The 9070/XT (either really) are going to be overkill for 1080p for the most part. You're going to be getting 120+ fps easily unless you have a ludicrously weak CPU or you're playing some incredibly poorly optimized game like Alan Wake or Monster Hunter Wilds.

You should still probably be interested in FSR4 though, because if it has an equivalent to DLAA then that'll be way better than TAA at 1080p.
These are 1440p cards.
On a side note, can we go one day without "muh optimization" and autistically hating on new tech?
 