GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

As opposed to the DirectX 9 style of doing things, where the scene is rendered back to front so you can just use an alpha channel that reads the pixel value of the pixel "below" the object to get true transparency without even using shaders.
Doom also renders back to front. This way, they could bring transparency to even the earliest source ports without any issues.
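The back-to-front approach both posts describe comes down to the "over" compositing operator: because farther surfaces are already in the framebuffer, each translucent pixel can just blend against whatever is below it. A minimal sketch, with made-up layer colors and alphas:

```python
# Sketch of back-to-front alpha compositing (the "over" operator) that
# painter's-algorithm renderers rely on. Layer colors/alphas are example values.

def over(src_rgb, src_a, dst_rgb):
    """Blend a translucent source pixel over an already-drawn destination pixel."""
    return tuple(src_a * s + (1.0 - src_a) * d for s, d in zip(src_rgb, dst_rgb))

# Draw order matters: composite the farthest surface first, nearest last.
background = (0.0, 0.0, 0.0)          # opaque black
glass_far  = ((1.0, 0.0, 0.0), 0.5)   # 50% red pane
glass_near = ((0.0, 0.0, 1.0), 0.5)   # 50% blue pane

pixel = background
for rgb, a in (glass_far, glass_near):  # back to front
    pixel = over(rgb, a, pixel)

print(pixel)  # (0.25, 0.0, 0.5)
```

Swap the draw order and you get a different (wrong) result, which is exactly why unsorted translucency is such a pain for engines that don't render back to front.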

Wait, is XeSS... Decent now? I don't think I've considered it since it came out and it didn't impress me but if you guys say it's worth investigating I won't just throw it away.
It most definitely is. I also remember not liking it in its early stages, but it has become a lot better.
 
It's all trade offs at the end of the day. You can have 100 million lights in a scene, or you can have accurate cheap transparency, you cannot have both.
You really can just use both at the same time; that's what Crysis did. It's just a solution that requires more work than using what's already available. 1-bit dithering is the cheapest way to do transparency in Unreal Engine; it's indian-tier sloppy craftsmanship. You know they aren't desperately weighing the cost of each of these effects to get the best outcome, they're just assuming people are going to use DLSS and won't be able to see it.
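The 1-bit dithering being complained about is "screen-door" transparency: each pixel is either fully drawn or fully discarded, with the decision made by comparing alpha against a threshold pattern, so no sorting or blending is needed. A minimal sketch, assuming the standard 4x4 Bayer ordered-dither matrix (the 8x8 grid and 0.5 alpha are just demo values):

```python
# 1-bit "screen-door" transparency: keep or discard each pixel by comparing
# alpha to an ordered (Bayer) threshold matrix instead of blending.
# The 4x4 Bayer matrix is the standard one; grid size is a demo value.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_mask(alpha, width, height):
    """Return a boolean keep/discard mask for a uniform-alpha surface."""
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
            row.append(alpha > threshold)
        rows.append(row)
    return rows

mask = dither_mask(0.5, 8, 8)
covered = sum(cell for row in mask for cell in row)
print(covered / 64)  # 0.5: exactly half the pixels survive, approximating 50% alpha
```

The surviving pixels form a fixed repeating pattern, which is why it's cheap (no depth sorting, works with deferred lighting) and also why it's so easy to spot unless TAA or an upscaler smears it away.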
 
It's all trade offs at the end of the day. You can have 100 million lights in a scene, or you can have accurate cheap transparency, you cannot have both.
then you aren't hiring enough autists... probably because they are expensive.
dithering?
sometimes textures dither, sometimes they overlay and swap the models, but the dither is the standard for UE5 and it's pretty easy to notice; these jarring things are something your brain hardly forgets once you see them happening.

one thing I see falling out of use is LOD models; most modern games that employ them usually stick with levels 8, 6 and 4, and don't even bother with the other levels.
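LOD selection itself is just a distance-to-level lookup. A toy sketch of the situation the post describes, where only a coarse subset of authored levels actually ships; the distance cutoffs are arbitrary example values, and the level numbers are treated as opaque labels since engines disagree on whether higher numbers mean more or fewer triangles:

```python
# Illustrative distance-based LOD pick: an asset authored with many levels
# but only levels 8, 6 and 4 shipped. Cutoff distances are made-up values.

def pick_lod(distance, available=(8, 6, 4), cutoffs=(10.0, 40.0)):
    """Nearest objects get the first (densest) level, farthest the last."""
    if distance < cutoffs[0]:
        return available[0]   # close-up detail
    if distance < cutoffs[1]:
        return available[1]   # mid-range
    return available[2]       # distant

print([pick_lod(d) for d in (5.0, 25.0, 100.0)])  # [8, 6, 4]
```

With only three levels, the jump between adjacent meshes is large, which is exactly what makes the swap (or the dithered cross-fade covering it) easy to notice.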
 
More advanced noise models for dithering are being slept on, IMO. People only notice dithering because the current patterns are so crude. If the dithering techniques were better they would still be bitched about, but at least people wouldn't actually notice them during play.

Case in point: early LCDs had shit dithering, and people complained. Many current LCDs still use dithering, but it's better implemented and almost nobody ever talks about it.
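The noise pattern really is the whole game. A toy comparison under assumed conditions (50% coverage, one row of 64 pixels, longest run of identical on/off decisions as a crude stand-in for visible clumping): an ordered Bayer threshold alternates pixels perfectly, while plain white noise clumps into runs the eye reads as grain.

```python
import random

# Crude comparison of dither noise patterns at 50% coverage along one row.
# Longest run of identical keep/discard decisions approximates "clumping".

BAYER_ROW = [v + 0.5 for v in (0, 8, 2, 10)]  # first row of a 4x4 Bayer matrix

def longest_run(bits):
    best = cur = 1
    for a, b in zip(bits, bits[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

alpha = 0.5
ordered = [alpha > BAYER_ROW[x % 4] / 16.0 for x in range(64)]
random.seed(0)
noisy = [alpha > random.random() for _ in range(64)]

print(longest_run(ordered))  # 1: strict on/off alternation
print(longest_run(noisy))    # usually > 1: visible clumps
```

Blue-noise dithering sits between the two: random enough to avoid the fixed grid pattern, but with the low-frequency clumps suppressed, which is roughly what "better implemented" amounts to in modern panels.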
 
VideoCardz: GUNNIR launches single-slot Arc Pro B60 BS with 24GB memory
Gunnir has listed its first single-slot Arc Pro B60 partner card, set at a 120W total board power. Intel's own specs list the Arc Pro B60 at a 200W TBP, and also include a lower-power mode with 2000MHz clocks and a 120W to 200W TBP range.

The listed specs match the usual Arc Pro B60 configuration: 20 Xe cores, 24GB of GDDR6 on a 192-bit bus, 19Gbps memory speed, and 456GB/s bandwidth, plus PCIe 5.0. The listing also claims 164 TOPS (INT8). Intel rates the GPU at up to 197 TOPS (INT8).
The card is listed in China at 5,199 RMB, which is about $745 at current mid-market rates. It uses a single 8-pin power connector.
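The listed numbers check out arithmetically: bandwidth follows directly from bus width and per-pin speed, and the price converts as stated. The 6.98 RMB/USD rate below is an assumed ballpark, not an official figure:

```python
# Sanity-checking the listed Arc Pro B60 numbers.

bus_bits = 192
gbps_per_pin = 19
bandwidth_gb_s = bus_bits / 8 * gbps_per_pin  # bytes/s across the whole bus
print(bandwidth_gb_s)  # 456.0, matching the listing

price_rmb = 5199
rmb_per_usd = 6.98  # assumed exchange rate
print(round(price_rmb / rmb_per_usd))  # 745
```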
 
Got my 5090 last night and installed it. It doesn't really fit, but it works...
 
Brother, I implore you, get an ATX 3.1 PSU with a native 12VHPWR cable. These GPUs are already risky enough to self-immolate; don't make it even worse by using the abomination adapter :story:
Should I do that for the 5070 or use its adapter? It's only a 250W TDP GPU.
 
Should I do that for the 5070 or use its adapter? It's only a 250W TDP GPU.
I don't think it's really an issue when the power is that much lower than the rated power spec for the 12VHPWR standard; at least I haven't heard of users of those cards having the melting connectors like the 4090s and 5090s. Those cards can have transient spikes above the rated spec of the cable if I'm not mistaken, so they're much more prone to seppuku if the connector isn't making ideal contact.
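The rough math behind that reply: the 12VHPWR/12V-2x6 connector is rated for 600W across six 12V supply pins, so per-pin current climbs fast if poor contact pushes current onto fewer pins. The 600W rating and pin count are the published connector spec; the comparison draw is the 250W figure from the post above:

```python
# Per-pin current on a 12VHPWR / 12V-2x6 connector (600 W rating, 6 supply pins).

rated_watts = 600
volts = 12
pins = 6

total_amps = rated_watts / volts
print(total_amps)          # 50.0 A total at the full rating
print(total_amps / pins)   # ~8.3 A per pin when current is shared evenly

# A 250 W card draws far less, which is why the adapter is less of a
# fire risk there:
print(250 / volts / pins)  # ~3.5 A per pin
```

The danger case is the uneven split: if one or two pins carry most of 50A because the others aren't seated, the safety margin disappears, which a 250W card simply never gets close to.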
 
GPU is an RX 9070, CPU is a 9800X3D, RAM is 32 GB of DDR5 Crucial at solid specs. Figured I'd get the CPU on a recommendation from a friend, and it's good for some locally run stuff for work (when we need to do basic testing before putting it up on our high-performance computing network). Plus I just wanted to get something that'll last me 10 years, just like my current rig. Now just waiting on the RAM and the new cooler mount to be delivered, and then I can get it all put together

Brother, I implore you, get an ATX 3.1 PSU with a native 12VHPWR cable. These GPUs are already risky enough to self-immolate; don't make it even worse by using the abomination adapter :story:
That adapter came with the card, so you're saying I can just plug in one 12VHPWR connector?

I have this power supply :

RM1000x SHIFT Fully Modular ATX Power Supply


It does have a dedicated 12VHPWR; I thought I had to use the adapter and plug all the power in...

Are you saying all I have to do is just plug this power cable into my 5090 and that's it?


Edit:
I'm so mad right now... I'm going to take the adapter out and just plug in the 12VHPWR for the 5090 when I get home.

The RX 9070 XT I had, I plugged in all three:


Could I have just plugged in the one 12VHPWR cable to it and been all right with the RX 9070 XT?

Edit: Watching this right now ...

 
Got my 5090 last night and I installed it, it doesn't really fit though but it works...
Negro, use the dedicated 12V-2x6 PCIe5 plug on your power supply. Too much twisting where the wires join at the 2x6 connectors can fuck things up if the cable itself isn't up to snuff. Even the 12V-2x6 cable I got with my Seasonic 1000W PSU fucked itself with the reasonable amount of twisting I did around the back panel.

The GPU issues I described in an earlier post were probably from twisted sense pins at the GPU connector. I got a Cablemod 12V-2x6 cable and took out some separator panels between the GPU and PSU to make its path between the connectors more direct, and I haven't had the same system crashes I had before.
 
Not removing the front glass panel plastic is a gigachad move.
it comes with a plastic?
Wccftech: Micron Exclusive: Why Consumers Have Gotten the Memory Shortage Narrative All Wrong
what a pussy bitch ass nigger, customers only see prices going up and that's it, go suck clean a enterprise clanker's pipe, faggot.
also on the news, commenters are grilling wccftech and der micron faggostein, rightly so.
 