GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

NVIDIA Is Reportedly Pushing Retail Launch Of RTX 5060 Ti To The Same Day As Review Embargo Lift; RTX 5060 To Launch In May

GeForce GTX 970 gets new life: Brazilian modders upgrade memory to 8GB

Fixing Nvidia's mistakes 11 years on.

Medusa Point Will Reportedly Transition To A Newer And Slightly Bigger FP10 Socket Compared To Strix Point As Revealed From Shipping Manifest

FP8 used by Phoenix, Hawk Point, and Strix Point is 25mm x 40mm
FP10 is supposedly 25mm x 42.5mm, 6.25% larger than FP8
FP11 used by Strix Halo is 37.5mm x 45mm

MLID and Olrak29_ already leaked that Medusa Point would use FP10. It should be using a desktop CCD and I/O die totaling ~275mm^2, an ~18% larger die area than Strix Point, and a 325mm^2 interposer under that. Making the package larger could also lead to improved PCIe support or something.
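Quick napkin math on those package sizes, treating them as plain rectangles (a throwaway Python sketch; the dimensions are just the leaked figures above):

```python
# Leaked AMD mobile socket dimensions in mm (width, height).
sockets = {
    "FP8 (Phoenix/Hawk Point/Strix Point)": (25.0, 40.0),
    "FP10 (Medusa Point, rumored)": (25.0, 42.5),
    "FP11 (Strix Halo)": (37.5, 45.0),
}

fp8_area = 25.0 * 40.0  # baseline: 1000 mm^2
for name, (w, h) in sockets.items():
    area = w * h
    print(f"{name}: {area:.1f} mm^2 ({(area / fp8_area - 1) * 100:+.2f}% vs FP8)")
# FP8  -> 1000.0 mm^2 (+0.00% vs FP8)
# FP10 -> 1062.5 mm^2 (+6.25% vs FP8)
# FP11 -> 1687.5 mm^2 (+68.75% vs FP8)
```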

Then we get better hardware and you can run those effects like they're nothing, but by that point they've come up with something new that maybe looks 50% better for 200% the cost. Rinse and repeat, and we end up with hardware that has literally 100x the performance and little to show for it.
I am sensing a conspiracy to sell more overpriced graphics cards. I think a crocodile-jacketed supervillain may be involved.
 
GeForce GTX 970 gets new life: Brazilian modders upgrade memory to 8GB
How is it missing ROPs and cache? Thought only Nvidia could cut that down
 
How is it missing ROPs and cache? Thought only Nvidia could cut that down
The article didn't have an explanation, but the 1/8 of missing ROPs corresponds to the 1/8 of the memory that's slower, so maybe they nuked its usability somehow and it's really only a 7 GB card. Or maybe all the memory can be used and those ROPs are just dropped.

Edit: No, they just quoted the specs wrong.
 
Yeah and nowadays DX9 looks like this and will run at locked 60fps on a 1060 so what's your point?

The point is that the ripples look wrong and dissolve into a total mess of weird artifacts; that style of reflection is locked to a single plane, so it can't handle waves or wakes properly, and it can't handle polished surfaces reflecting back onto the water (like a boat that isn't filthy or scratched up). Since this is all stuff that can be done at high frame rates on modern hardware, why should it not be done?
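To spell out the "locked to a single plane" part: the classic planar-reflection trick re-renders the scene mirrored about one fixed flat plane, so rippling water or multiple water heights don't fit the math. A minimal sketch of that transform (numpy, plane through the origin; not from any particular engine):

```python
import numpy as np

# Householder reflection about a plane through the origin with unit
# normal n: R = I - 2 * n * n^T. Planar reflections mirror the whole
# scene with exactly one such matrix per pass -- one flat plane only.
def reflection_matrix(n):
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return np.eye(3) - 2.0 * np.outer(n, n)

R = reflection_matrix([0.0, 1.0, 0.0])  # flat water surface at y = 0
print(R @ np.array([1.0, 2.0, 3.0]))    # [ 1. -2.  3.]
```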

Why even bring up DX7 in the context of this discussion?

I dunno, I'm not the one who lied about what minimum settings on HL2 actually are. If not realizing a video of Water Hazard was running DX8 is lying, then misidentifying DX8 as minimum settings is lying. Sauce for the goose, sauce for the gander. Or would you like to agree that's not lying now?

as if you couldn't achieve convincingly looking ripples and reflections with rasterization

Faked ripple reflections are always immediately obvious. The irony here is I'm the guy with the shitty Radeon, not you, and I don't feel a sense of urgency to upgrade to a GPU that can handle raytracing. I play games with garbage screen-space shadows or wonky hybrids of planar reflections and SSR, and I have a good time. But unlike you, I don't feel moral outrage about new technologies, I'm not in total denial about what they can do, and I don't have any clear emotional attachment to the technological limitations of 2005 or 2010 or whenever you think the cutoff point for new rendering technology should be, whatever game it is you think should be where time stops (Witcher 3 or Half-Life 2, not sure). I can rationally evaluate what works well now, what doesn't, and where things seem to be going without getting GRRRRAAGHGHHH ANGEEERRRYYY that some day, perhaps soon, my 2021 GPU will be obsolete. Personally, I think it's neat that games can finally reflect water off a car door and not look like shit, and I don't see a reason to be pissed off about that.

Everything you say is the exact same shit I have heard about new technology since 24-bit color displaced 8-bit palettized color, probably before, which is why it's like "okay, 30 years on, people are still gonna shit their pants and breathe fire because technology didn't stop when they bought their last GPU, whatever."

You bounce from accusing people of bad faith to moral outrage to special pleading and back again, and it's all so tiresome because you are so, so, so very angry about it all.
 
Faked ripple reflections are always immediately obvious. The irony here is I'm the guy with the shitty Radeon, not you, and I don't feel a sense of urgency to upgrade to a GPU that can handle raytracing.
You forgot to call everything slop.
 
I dunno, I'm not the one who lied about what minimum settings on HL2 actually are. If not realizing a video of Water Hazard was running DX8 is lying, then misidentifying DX8 as minimum settings is lying. Sauce for the goose, sauce for the gander. Or would you like to agree that's not lying now?
Okay so you were once again arguing with the boogeyman in your mind and not what's on your screen.
You bounce from accusing people of bad faith
I wasn't initially accusing you of bad faith, I was trying to understand why it is that you're incapable of holding an argument and instead keep answering questions that weren't asked. But if you really want it to be the case, then I'll do you a favor.

I am accusing you of arguing in bad faith. You are arguing in bad faith. Like a slimy politician who can only convince people of his bullshit by confusing them long enough that they have no idea what he said. I understand if arguing for rasterization is a conflict of interest when you're an Nvidia employee, and I assume it's also against company policy to admit to being an Nvidia employee while doing your mandatory ray tracing shilling, so I have no choice but to assume that's why you're doing it.

And @Mister Fister, how about you make out with The Ugly One already? Funny how you always show up just to yes-man him like a loyal lap dog minutes after he posts, not even bothering to bring any opinions of your own.
>yeah you show those luddites how it's done
It's fucking pathetic. One user has to constantly argue about something else to look like he's in the right and the other has to wash his nuts because he can't even bring up any technical discussion.

This retardation has gone on for enough pages. If you two ladies want to keep at it, be my fucking guest. I'm tapping out of this autism carousel.
 
While I'm waiting for my GPU, I was checking out the AMD subreddits and saw someone run into a melting PSU port with a 9070 XT.


Anyway, that post I made earlier, about whether using two ports on the PSU with two 6+2 cables is enough for the 3-port ASUS Prime 9070 XT OC, now has me worried, so I'm looking into ordering additional cables. Any recommendations?

Looks like the Redditor has an ooooold PSU but still. Tired of hearing about these expensive pieces of equipment becoming problems.
 
Anyway, that post I made earlier, about whether using two ports on the PSU with two 6+2 cables is enough for the 3-port ASUS Prime 9070 XT OC, now has me worried, so I'm looking into ordering additional cables. Any recommendations?
You sure your PSU didn't come with three PCIe power cables? If it has three slots for them, it should've come with a full set. Mine did, though my 3090 is only 2x 8-pin, so I only needed two cables. Ordering cables can also be an issue, since the connectors on the PSU side aren't standardized; it would be wise to double-check them with a multimeter just to be sure, and if you can, contact the manufacturer's tech support directly instead of hunting for cables online.

And do play around with undervolting. These GPUs are designed to be power hungry when they don't have to be. You can cut the power draw and the heat output with zero performance loss, and if you're lucky, even a performance gain. The only thing you risk is your games or system crashing, and it only takes one software switch to bring it back to normal.
 
It came with two 6+2 PCIe cables and two 8-pin CPU cables. I suppose the unused CPU cable and port would suffice? There are only 2 VGA ports on the PSU (a Super Flower Leadex III 750W).

Starting to think I should simply order a new PSU as well.
 
How do you pass through the control signalling from the motherboard?
Either splice the Power On wires from the primary to the secondary, or use a custom circuit board; those do exist for use cases like this. It wasn't a serious suggestion, though; I think the cabling requirements of the 9070 XT are easily accommodated.
 
Yeah, the power consumption of the card is 304 W, and I've watched someone leave the third port unused and still game just fine on the ASUS Prime. I've learned quite a bit since I asked my simpleton questions: CPU cables used in place of VGA cables will result in severe issues, and using two 6+2 cables with pigtails will be fine. I don't need a new PSU or additional cables. Looking forward to upgrading the CPU/GPU. Thank you for your consideration, technauts.
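The napkin math works out too, assuming the standard ratings of 150 W per 8-pin PCIe cable and 75 W through the slot:

```python
# Power budget for a 304 W card on two 8-pin cables plus the slot.
CABLE_8PIN_W = 150  # PCIe spec rating per 8-pin cable
SLOT_W = 75         # PCIe x16 slot delivery
CARD_W = 304        # 9070 XT board power from the post above

budget = 2 * CABLE_8PIN_W + SLOT_W
print(f"available {budget} W vs card {CARD_W} W -> {budget - CARD_W} W headroom")
# available 375 W vs card 304 W -> 71 W headroom
```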
 
CPU cables used in place of VGA cables will result in severe issues
You can't plug CPU power into a GPU. You can, however, do the inverse, and in theory that would send 12V down the ground lines and fry your motherboard, if the PSU/motherboard didn't have safety checks that refuse to power the system on when that happens.
Pin 6 on the EPS cable is square, but on the PCIe cable it's beveled, so the male PCIe connector will go into the female EPS connector but not the other way around. I made that mistake once, but thankfully some protection kicked in and the system simply didn't power on. Either the motherboard detected it and didn't start the PSU, or the PSU detected it and refused to power on when the motherboard tried to start it. Whatever it was, it saved me from experiencing magic smoke.
 
I'm really fucking tired of converting between binary and metric prefixes. As far as I'm concerned, binary prefixes are obsolete. This isn't the fucking 80s anymore where I care about how much RAM I have. Just do everything in metric. There's no utility in reporting file sizes and other numbers with a binary prefix.
 
I'm really fucking tired of converting between binary and metric prefixes. As far as I'm concerned, binary prefixes are obsolete. This isn't the fucking 80s anymore where I care about how much RAM I have. Just do everything in metric. There's no utility in reporting file sizes and other numbers with a binary prefix.
I just want accurate numbers (in the operating system at least). Don't tell me it's 16 gigabytes (16 billion bytes exactly) if it's actually ~17.2 gigabytes. Having both the decimal and binary prefixes is fine, or an option/toggle.
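For reference, the exact conversion being complained about:

```python
# 16 GiB (binary prefix) expressed in decimal gigabytes.
bytes_total = 16 * 2**30      # 17,179,869,184 bytes
gb = bytes_total / 10**9      # decimal gigabytes

print(f"16 GiB = {bytes_total:,} bytes = {gb:.2f} GB")
# 16 GiB = 17,179,869,184 bytes = 17.18 GB
```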
 