GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Does this mean I have a chance?
VideoCardz: AMD leaves the door open to FSR 4-like support on RDNA 3 (archive)
Adam Patrick Murray (PCWorld) was not satisfied with that response and pressed AMD on why it couldn't let users test FSR4 on other GPU series. Zdravkovic said AMD is not against modders who hack pieces of Redstone onto older cards, but that a formal beta track for RDNA 3 is not planned today. However, he reacted positively to the idea and asked for suggestions on how to frame it as an opt-in experiment with clear expectations.

That is where an “FSR4-lite” idea can make sense. AMD could ship a separate, clearly labeled experimental path for RDNA 3 that focuses on what works acceptably, game by game, and leaves the rest off. The point would be user choice, plus official packaging that avoids sketchy downloads and reduces confusion.
 
My ASUS ROG Astral GeForce RTX™ 5090 32GB GDDR7 OC Edition is coming in tomorrow. Should I keep or return my ASUS Prime Radeon™ RX 9070 XT OC?
Underclock the overclocked 5090. I have no advice regarding the other card, seems like something you should know. Maybe make a gaming PC for your nephew (underclock that too).
 
My ASUS ROG Astral GeForce RTX™ 5090 32GB GDDR7 OC Edition is coming in tomorrow. Should I keep or return my ASUS Prime Radeon™ RX 9070 XT OC?
If you want to satisfy your inner bossman jack gamba gamba brain... you could hold on to it and resell it when prices spike, which I think is pretty likely in the next few months to be honest.
 
If you want to satisfy your inner bossman jack gamba gamba brain... you could hold on to it and resell it when prices spike, which I think is pretty likely in the next few months to be honest.
This was what I was thinking. Maybe it will be rare; the lack of new products announced at CES by AMD, Nvidia, and Intel is telling.


ratJAM.gif
 
jeettube recommended me this niktek video after 3 days, but also recommended me vex kneeling to the clankers.

While people can argue about fake framing, I have to say that once you start to notice the blurriness and pixelation that happen when the algorithm upscales a texture, your eyes get accustomed to spotting it. Sure, it's related to a poor upscaling implementation, but you'll quickly learn to see the fucking textures spazzing out as they snap into "high definition", and it puts you off. UE, both 4 and 5, also has this entire dotting effect it uses to overlay stuff, and you begin to notice that too. While it kind of beats things popping in and out, it does look off-putting. I guess that's the price of games getting rid of fog.
 
Finally got my power cables for my 5070 Ti and had time this weekend to install it and get it running. My SDXL gen speeds went from 50 seconds down to 10, which is amazing, except that my PC shut down twice when I booted up reForge. Guess my 600-watt PSU isn't enough, even with my old-ass CPU. And now buying a new PSU has led to buying a new mobo and CPU, because these prices have me spooked. *sigh* Wondering if I should bother picking anything else up at Microcenter.
 
UE, both 4 and 5, also has this entire dotting effect it uses to overlay stuff, and you begin to notice that too. While it kind of beats things popping in and out, it does look off-putting. I guess that's the price of games getting rid of fog.
dithering?

Transparency in a deferred rendering pipeline is ridiculously expensive. Parts of the frame are rendered concurrently and separately from each other, so you have a depth map but no direct information about what color or texture sits above or below the polygons being rendered. If you want transparency, you have to do a second pass after lighting and textures have been calculated, applying a shader to the object that gathers the pixel information from the already-rendered scene. Compare that to the DirectX 9 style of doing things, where the scene is rendered back to front, so you can just use an alpha channel that reads the value of the pixel "below" the object and get true transparency without even using shaders. Or you use the tried-and-true "one-bit dithering" as pioneered on the Commodore Amiga in the 1980s.
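For anyone curious, the "one-bit dithering" trick (screen-door transparency) is simple to sketch: compare the surface's alpha against a per-pixel threshold from a Bayer matrix and either draw the fragment or skip it, so the average coverage over a tile matches the alpha. A minimal Python sketch of the idea; names are illustrative, not from any engine, and a real GPU would do this compare per fragment in a shader:

```python
# Screen-door ("one-bit") transparency via ordered dithering with a
# classic 4x4 Bayer matrix. No blending needed: each fragment is either
# fully drawn or fully skipped, and the on/off pattern fakes the alpha.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def covered(x: int, y: int, alpha: float) -> bool:
    """Return True if the fragment at (x, y) should be drawn.

    alpha is the surface opacity in [0, 1]; the Bayer threshold turns it
    into a spatial pattern whose average coverage equals alpha.
    """
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return alpha > threshold

# At alpha = 0.5, exactly half of an 8x8 tile of fragments survive (32 of 64).
drawn = sum(covered(x, y, 0.5) for y in range(8) for x in range(8))
```

The dot pattern people notice in UE titles comes from exactly this kind of thresholding, usually jittered per frame so temporal AA can smear the dots back into something resembling smooth transparency.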


fuck deferred rendering all my homies hate deferred rendering
 
Bro you can’t just say you upgraded your rig without dropping some specs! We won’t laugh at you, we promise!
GPU is an RX 9070, CPU is a 9800X3D, RAM is 32 GB of DDR5 Crucial at solid specs. Figured I'd get the CPU on a recommendation from a friend, and it's good for some locally run stuff for work (when we need to do basic testing before putting it up on our high-performance computing network). Plus I just wanted something that'll last me 10 years, just like my current rig. Now I'm just waiting on the RAM and the new cooler mount to be delivered, and then I can get it all put together.
 
They act like it hasn't been done already with XeSS. Yes, it ran a tad slower than FSR because it was doing inferencing instead of heuristics, but it looked way better and nobody was shitting their pants over it.
Wait, is XeSS... decent now? I don't think I've considered it since it came out and it didn't impress me, but if you guys say it's worth investigating, I won't just throw it away.
 