GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The 4080 16 GB and 4080 12 GB were 2 different GPUs based on different dies, with the latter using the die number that would usually correspond to a 70-class GPU. The real 4080 has 26.7% more CUDA cores and 46% higher memory bandwidth than the canceled card. Source.
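You can sanity-check those ratios yourself. A quick sketch below uses the specs from Nvidia's launch announcement (9728 vs 7680 CUDA cores; 23 Gbps GDDR6X on a 256-bit bus vs 21 Gbps on a 192-bit bus) — treat those numbers as assumptions, since they come from the announcement rather than this post:

```python
# Announced launch specs (assumed): note the shipping 4080 16 GB
# ultimately listed 22.4 Gbps memory, which changes the gap slightly.
cards = {
    "4080 16GB": {"cuda_cores": 9728, "bus_bits": 256, "gbps": 23.0},
    "4080 12GB": {"cuda_cores": 7680, "bus_bits": 192, "gbps": 21.0},
}

def bandwidth_gbs(bus_bits, gbps):
    # Per-pin data rate (Gbps) times bus width in bytes gives GB/s
    return gbps * bus_bits / 8

big, small = cards["4080 16GB"], cards["4080 12GB"]
core_gap = big["cuda_cores"] / small["cuda_cores"] - 1
bw_gap = (bandwidth_gbs(big["bus_bits"], big["gbps"])
          / bandwidth_gbs(small["bus_bits"], small["gbps"]) - 1)

print(f"CUDA core advantage: {core_gap:.1%}")   # ~26.7%
print(f"Bandwidth advantage: {bw_gap:.1%}")     # ~46.0%
```

With these announced figures the bandwidths come out to 736 GB/s vs 504 GB/s, which is where the 46% figure comes from.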

I have seen people arguing that The Card Formerly Known As 4080 12 GB was actually a 4060 Ti or 4060, but I think that's mostly for the lulz.

At least Nvidia now has a chance to repair its lineup before it commits to mistakes, like the knee-jerk inclusion of 12 GB in the RTX 3060.

BTW, the 4050 is confirmed.
This has to be mentioned here: Nvidia never made a Founders Edition of the 4080 12GB; only the AIBs did.

In their press release cancelling the 12GB model, they showed pictures of people lining up for the 4090, as if to show that Nvidia is really popular. In the meantime, only the AIBs -- not Nvidia directly -- suffer from Nvidia's cancellation, because they have to rebrand/rebox those 12GB models. Nvidia gets all the adoration, while everyone else partnering with Nvidia is left with headaches.

People are saying that every time Nvidia makes a move, EVGA looks better and better for abandoning Nvidia. At this rate, Nvidia will have no AIBs left. Are they truly ready to go at the GPU market alone?

Nvidia only made Founders Edition cards for their high-end lineup; are they really going to make their own cards for the lower-end lineup as well?
 
At this rate, Nvidia will have no AIBs left. Are they truly ready to go at the GPU market alone?

Nvidia only made Founders Edition cards for their high-end lineup; are they really going to make their own cards for the lower-end lineup as well?
I heard they aren't ready to do it yet, but maybe they will go for it within a few years.

If Nvidia can make its own cards smaller, cheaper, with better cooling, because they know everything about the cards in advance, then they are clearly superior. They could try buying one of the AIBs and using it to produce in-house custom models to give an illusion of choice, while pocketing all of the profits.
 
I have seen people arguing that The Card Formerly Known As 4080 12 GB was actually a 4060 Ti or 4060, but I think that's mostly for the lulz.
They kinda have a point; for example, the memory bus of the 12GB version is/was 192-bit, and that's the same as the 3060 / 2060 (non-Super) / 1060.
The 3070/3060 Ti had a 256-bit bus. The 2070/2060 Super also had a 256-bit one. The 1070 too. Even the 900 series followed this logic.

So it's around a 60 Ti-class card: probably a bit faster than that performance level, but with a memory setup as gimped as a base 60-class card's.
I guess you could call it the RTX 4076 :tomgirl:.
 
I think it's something more fundamental than that though.

You CAN set up a gaming rig with a 3050 Ti and an Intel i3 processor, play many games and browse the Internet without much issue, but why are you cheaping out on your computer components? Surely something that you use every day (you think) deserves higher-priced components...
My strat for building any sort of PC? Buy the best you can comfortably afford, and buy with upgradability in mind. That way, you won't have buyer's remorse over cheaping out on parts, or spend money on something you can't buff up with a better CPU/GPU/HDD later on. It'll keep you from cost-cutting yourself into a corner.
 
At this rate, Nvidia will have no AIBs left. Are they truly ready to go at the GPU market alone?
Nope.
Nvidia only made Founders Edition cards for their high-end lineup; are they really going to make their own cards for the lower-end lineup as well?
Yep.

Can't look it up at this moment, but there is a clip of an investor video where Jensen says their margins have been cut from the upper 60s to the lower 60s (percent). Vertical integration is the answer they have chosen to bring those margins back up. That means no more AIB cards, except maybe very low-margin products like the 1030, or possibly Gx108 chips. The top-end cards will continue to get the 4090 treatment to get the views and clicks, but it makes no economic sense to give the 3060 the same attention to detail.

I heard they aren't ready to do it yet, but maybe they will go for it within a few years.

If Nvidia can make its own cards smaller, cheaper, with better cooling, because they know everything about the cards in advance, then they are clearly superior. They could try buying one of the AIBs and using it to produce in-house custom models to give an illusion of choice, while pocketing all of the profits.
Superior products for Nvidia, but the clear loser here will be distributors and retail. They will be treated the way AIBs are treated now, likely meaning that the only way to grab an Nvidia card in the far future will be direct from Nvidia itself.

This is monopolization of a market niche; and unless Intel will burn money for a few more years and bring competition with billions of lobby money, AMD will follow Nvidia's lead as they don't have either's deep pockets.
 
A while back I got a good deal on a 5600X and motherboard, so I upgraded. I'm happy with the system, but I need a new GPU; I currently have a 1050 Ti but don't know what to get. Another problem is my PSU: I have an old Seasonic 520W that I don't think will be appropriate. There's a Corsair RM650 650W on Amazon for £76 that looks good, but I don't know what GPU would be good for the system. Anyone got any recommendations for a GPU or PSU?
 
A while back I got a good deal on a 5600X and motherboard, so I upgraded. I'm happy with the system, but I need a new GPU; I currently have a 1050 Ti but don't know what to get. Another problem is my PSU: I have an old Seasonic 520W that I don't think will be appropriate. There's a Corsair RM650 650W on Amazon for £76 that looks good, but I don't know what GPU would be good for the system. Anyone got any recommendations for a GPU or PSU?
1080p monitor I assume?
You'll be fine with a 6700 XT or a 3060/3060 Ti/3070. The RM650 is great for these. I suspect these good 650W Gold PSUs can carry even a 4080 with a 105W CPU, but long-term it might be smarter to get a higher-wattage PSU if you want to keep it for future upgrades; Nvidia has gone insane with the TDPs.
Regardless, the upgrade from the 1050 Ti will be very significant, and a bit overkill for 1080p. It might be worth waiting for AMD to launch its cards and for Nvidia to start doing mid-range 4000-series.
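If you want to sanity-check that 650W claim, here's a rough sketch. The TDP figures and the 30% cushion are assumed ballpark values for illustration (e.g. ~65W for a 5600X, ~220W for a 3070, ~320W for a 4080), not measurements:

```python
# Hypothetical PSU headroom check; wattages are assumed ballpark TDPs.
def psu_headroom(psu_watts, cpu_w, gpu_w, rest_w=75, margin=1.3):
    """Return watts left over after the estimated load, padded by a
    ~30% cushion for transient spikes and PSU aging.
    Negative (or near zero) means the PSU is undersized."""
    budget = (cpu_w + gpu_w + rest_w) * margin
    return psu_watts - budget

print(psu_headroom(650, cpu_w=65, gpu_w=220))   # 5600X + 3070-ish: ~180W to spare
print(psu_headroom(650, cpu_w=105, gpu_w=320))  # 105W CPU + 4080-class: ~0, no headroom
```

Which roughly matches the advice above: a good 650W unit is comfortable for a 3070-class build, and right at the edge for a 4080-class one.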
 
1080p monitor I assume?
You'll be fine with a 6700 XT or a 3060/3060 Ti/3070. The RM650 is great for these. I suspect these good 650W Gold PSUs can carry even a 4080 with a 105W CPU, but long-term it might be smarter to get a higher-wattage PSU if you want to keep it for future upgrades; Nvidia has gone insane with the TDPs.
Regardless, the upgrade from the 1050 Ti will be very significant, and a bit overkill for 1080p. It might be worth waiting for AMD to launch its cards and for Nvidia to start doing mid-range 4000-series.
Yeah, 1080p 120Hz. I do tend to keep PSUs for a long time; I've got my money's worth out of that old Seasonic, after all. Do you think it's worth going used for something like a 3070?
 
Yeah, 1080p 120Hz. I do tend to keep PSUs for a long time; I've got my money's worth out of that old Seasonic, after all. Do you think it's worth going used for something like a 3070?
It might be, but it's a risk. Some of these cards were forged in the hells of mining and likely ran at near full load for a year or more. It's basically a percentage of additional risk, which might not even be covered by warranty. I've had cards die on me even new-ish; without warranty it would have been a significant financial hit. There are RM PSUs at 750W too, I think. There are other good ones at decent prices as well, but it all depends on where you live, shipping, offers, etc. The Super Flower Leadex series is also worth considering, among many other options.
 
It might be, but it's a risk. Some of these cards were forged in the hells of mining and likely ran at near full load for a year or more. It's basically a percentage of additional risk, which might not even be covered by warranty. I've had cards die on me even new-ish; without warranty it would have been a significant financial hit. There are RM PSUs at 750W too, I think. There are other good ones at decent prices as well, but it all depends on where you live, shipping, offers, etc. The Super Flower Leadex series is also worth considering, among many other options.
I think I'm going to stay away from the used market; the prices just aren't low enough for me to care, and I'm not so poor that every penny counts. GPU prices are better than they were, but VAT is fucking me in the ass. I suppose it doesn't really matter; I upgrade like once every 5 or 6 years.

I'm not the broke faggot I once was but I still can't stop being a stingy jew.
 
The CPU draws 300W, the GPU draws 450W, so your best bet is to get a 1500W+ PSU, and even then, after one or two generations the power draw will be over 2000W, because Intel, AMD and Nvidia have hit a hard wall where they cannot get more performance out of the silicon without increasing the power draw.
 
The CPU draws 300W, the GPU draws 450W, so your best bet is to get a 1500W+ PSU, and even then, after one or two generations the power draw will be over 2000W, because Intel, AMD and Nvidia have hit a hard wall where they cannot get more performance out of the silicon without increasing the power draw.
Luckily, energy prices are so reasonable and inexpensive these days that this is a non-issue for most of us. Now if you'll excuse me, I have to reset the fuse box because I booted up my 1337 gamer rig and it tripped the system. I'll have to remember to unplug the fridge next time; my fault.
 
Now if you'll excuse me, I have to reset the fuse box because I booted up my 1337 gamer rig and it tripped the system. I'll have to remember to unplug the fridge next time; my fault.
I've actually dropped both GPU and CPU clocks and saved them to different profiles.

So I met a guy who seemed very knowledgeable about the minutiae of GPUs, and he said that if more than one display is plugged in, the card won't downclock to save power. Anyone know anything about that?
 