GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Why would you buy the 3080ti next year when the 4000 series is only 18 months away? You can wait forever, but you gotta buy something sooner or later.

I mean, imagine if he HAD waited. He would have gone 12-18 months without the 1660 Super only to watch the 3000 series remain totally out of stock for 4+ months. And they haven't released a 3000 series card in that price range and likely won't for another 4-6 months, assuming stock ever comes in.

Even buying a 2080ti in July, when we knew the 3000 series was inbound for October, would normally be a waste, but given the stock issues, that guy with the 2080ti can play games, unlike those who sold their cards thinking they'd pick up a 3080.
If you can play games in July you can play games in October. My point is that there has not been enough time for developers to challenge the new cards, except for the shithole mess that is CP77. So if you can run games on your old card in July, there is no reason to upgrade; you're just wasting your money. And when you do upgrade, the 3000s are simply too massive of an improvement to pass up: instead of buying a used 2080ti, just buy a 3070.

If your GPU is something like a 1080 or up, you can run everything on the market, including CP77, at more than acceptable levels, mostly High/Ultra, so why buy the 3080 when we know for a fact that the 3080ti will be a massive improvement in performance for a disproportionately low markup?
 
  • Agree
Reactions: Allakazam223
If you can play games in July you can play games in October. My point is that there has not been enough time for developers to challenge the new cards, except for the shithole mess that is CP77. So if you can run games on your old card in July, there is no reason to upgrade; you're just wasting your money. And when you do upgrade, the 3000s are simply too massive of an improvement to pass up: instead of buying a used 2080ti, just buy a 3070.
WTF are you smoking? The 3080 is already a mostly complete GA102 die. At best the 3080ti would have 5% more cores, because Nvidia will still segment the 3090 from the 3080ti. "massive improvement" my ass.
If your GPU is something like a 1080 or up, you can run everything on the market, including CP77, at more than acceptable levels, mostly High/Ultra, so why buy the 3080 when we know for a fact that the 3080ti will be a massive improvement in performance for a disproportionately low markup?
And you already answered your own question. If you have something 1080 or up, you can already max out everything on the market. So why wait another 6+ months to see if the mythical Ampere will be in stock, when a card based on Turing already meets your needs and is available for purchase today? Unless you are spending $700+ on a GPU, the performance you would get out of the lower-end 3000 series is already on shelves. Sure, you could wait another year and get it for $100 less, but then you could wait two years and get it for $200 less, or three years and $300 less. Meanwhile you are not playing ANYTHING because you won't buy a card.

There will always be something new on the horizon. A new generation, a price drop, something. Look at all the fools who sold their 2080tis only to go 3+ months with no gaming GPU at all because there is no stock. Wow, great deal, all worth it when you had a perfectly good GPU already.
 
  • Like
Reactions: Smaug's Smokey Hole
I got mine for $220 last December, and a used 1070 was never under $200, more like $250 for ones with a decent cooler.
The fuck you can't; if you swindle right you can pick up a 1070 locally for $160 to $180. You have to play the waiting game with some sellers, too. Lowball, then go up; most sellers will meet you halfway.

Same goes for people buying 3000s: why would you buy a 3080 when the 3080ti is going to drop in Q1?
Because FOMO, and because the 3080ti is going to be at least 100 bucks more expensive than the 3080.
 
  • Like
Reactions: Allakazam223
The worst is flagship phones and their GPUs. Teens use phones to play Fortnite, so 3D isn't wasted on them, but I've met people who get hyped over phone performance and, when asked why, they don't know; they just got swept up in the hype. PC GPU consooomerism is almost sane and rational compared to that.
Mobile gaming to me means Game Boy and PSP, but I suppose I'm just a crusty old boomer as far as the Fortnite crowd is concerned. The only reason I bought a new flagship phone (LG V60) was that it had the features I needed and wanted; in-screen biometrics still feels like magic to me, and at a fraction of the other brands' prices. I also had an old S7 Edge that was finally starting to go to the big e-waste pile in the sky. I fully expect the current device to last me into 2025.

WTF are you smoking? The 3080 is already a mostly complete GA102 die. At best the 3080ti would have 5% more cores, because Nvidia will still segment the 3090 from the 3080ti. "massive improvement" my ass.

And you already answered your own question. If you have something 1080 or up, you can already max out everything on the market. So why wait another 6+ months to see if the mythical Ampere will be in stock, when a card based on Turing already meets your needs and is available for purchase today? Unless you are spending $700+ on a GPU, the performance you would get out of the lower-end 3000 series is already on shelves. Sure, you could wait another year and get it for $100 less, but then you could wait two years and get it for $200 less, or three years and $300 less. Meanwhile you are not playing ANYTHING because you won't buy a card.

There will always be something new on the horizon. A new generation, a price drop, something. Look at all the fools who sold their 2080tis only to go 3+ months with no gaming GPU at all because there is no stock. Wow, great deal, all worth it when you had a perfectly good GPU already.
Tech nerds at the bleeding edge of consumption get cut by their own impulsiveness - more news at 11.

Until games become genuinely more demanding and higher-resolution displays become dirt cheap, I don't see a need for anything more than a mid-tier graphics card. Resolution has hit the point of diminishing returns, so most developers don't bother accommodating anything above 4K.
Framerate also has a limit (most people can't discern frame rates much above 100Hz), but the big bottleneck is display tech, not GPUs. In short, you can run almost every recent title at 1080p/60Hz on pretty cheap hardware; there's no reason to buy a 3000 series card if that's what you're after.
 
WTF are you smoking? The 3080 is already a mostly complete GA102 die. At best the 3080ti would have 5% more cores, because Nvidia will still segment the 3090 from the 3080ti. "massive improvement" my ass.

And you already answered your own question. If you have something 1080 or up, you can already max out everything on the market. So why wait another 6+ months to see if the mythical Ampere will be in stock, when a card based on Turing already meets your needs and is available for purchase today? Unless you are spending $700+ on a GPU, the performance you would get out of the lower-end 3000 series is already on shelves. Sure, you could wait another year and get it for $100 less, but then you could wait two years and get it for $200 less, or three years and $300 less. Meanwhile you are not playing ANYTHING because you won't buy a card.

There will always be something new on the horizon. A new generation, a price drop, something. Look at all the fools who sold their 2080tis only to go 3+ months with no gaming GPU at all because there is no stock. Wow, great deal, all worth it when you had a perfectly good GPU already.
The 3080 Ti could be "Samsung un-fucked their process, we can run a higher clock speed now" combined with sticking another VRAM chip on there for a total of 11GB and 10% higher memory bandwidth.
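For what it's worth, the napkin math on that checks out. The stock 3080 runs 10 GDDR6X chips of 1GB each on a 320-bit bus at 19 Gbps effective, which works out to 760 GB/s; solder on an eleventh chip and you get exactly 11GB and 10% more bandwidth. A minimal sketch, where the 11-chip configuration is the hypothetical card described above:

```python
# Peak memory bandwidth for the 3080 and a hypothetical 11-chip 3080 Ti.
# Known 3080 figures: 10 x 32-bit GDDR6X chips at 19 Gbps effective.
def bandwidth_gb_s(chips, bits_per_chip=32, data_rate_gbps=19.0):
    """Peak bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return chips * bits_per_chip / 8 * data_rate_gbps

base = bandwidth_gb_s(10)    # RTX 3080: 760 GB/s
bumped = bandwidth_gb_s(11)  # one extra chip: 836 GB/s
print(f"{bumped / base - 1:.0%} more bandwidth")  # -> 10%
```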

When it comes to upgrading I agree, with the addition that you should upgrade when you need it, not for the sake of it; that way you won't get stuck and tortured in consoomer hell like the people who sold their 2080s, couldn't get any 3080s, and then the 6800 came out and... If you stick to the reasonable price/performance level that you need, and assess what that is worth to you, then you will never feel gypped. Unlike me when I bought the TNT2 Ultra for no reason other than it being released.
 
  • Like
Reactions: Allakazam223
The 3080 Ti could be "Samsung un-fucked their process, we can run a higher clock speed now" combined with sticking another VRAM chip on there for a total of 11GB and 10% higher memory bandwidth.

When it comes to upgrading I agree, with the addition that you should upgrade when you need it, not for the sake of it; that way you won't get stuck and tortured in consoomer hell like the people who sold their 2080s, couldn't get any 3080s, and then the 6800 came out and... If you stick to the reasonable price/performance level that you need, and assess what that is worth to you, then you will never feel gypped. Unlike me when I bought the TNT2 Ultra for no reason other than it being released.
I highly doubt it. The Samsung node isn't particularly fucked; it was just never meant for large dies or high clock speeds. You can see that in the power usage at 1900MHz vs 2000MHz on a 3080: it goes through the roof for relatively little gain. It worked perfectly for the 1050 and 1050ti, but Nvidia decided to put their big GA102 on it to save a few pennies. For GA106 and GA107 it would work perfectly fine, but GA104 is pushing it and GA102 is just too big to get good yields. Even with good yields we'd just see more chips; Samsung 8nm was never meant to be a high-clock node like TSMC's 7nmP.
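To put rough numbers on that clock/power wall: dynamic power scales roughly as C·V²·f, and the last few hundred MHz only come with a voltage bump, so power grows much faster than clock. A toy sketch; the voltage figures are illustrative guesses, not measured 3080 values:

```python
# Dynamic power scaling under P ~ C * V^2 * f (capacitance C cancels in the ratio).
# The +8% voltage assumed for the 2000 MHz point is for illustration only.
def power_ratio(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

ratio = power_ratio(2000, 1900, 1.08, 1.00)  # ~5% more clock, 8% more voltage
print(f"~{ratio - 1:.0%} more power for ~5% more clock")  # -> ~23%
```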

There have also been enough leaks showing 12GB 3060s and 20GB 3080tis that I can't believe Nvidia would be tone-deaf enough to launch an 11GB 3080ti four years after the 11GB 1080ti, especially while AMD is rocking 16GB. Then again, if the 2GB modules from Micron are not ready yet, Nvidia won't have a choice.
 
I'm genuinely curious whether dedicated cards will even be on the market by the end of this decade.
We're reaching a point where resolution can only get so refined, refresh rates can only matter so much, and shit like ray tracing is almost purely gimmicky puddle reflections and more detailed shadowing.

Yes, technology explodes (Moore's Law and all that), but graphically, to the extent that it matters, there is a finite point that GPU makers know is out there.
3D wasn't the next leap. VR, maybe.

It just makes me think that Nvidia and AMD had better be enjoying this time when their products are so hot that you can't even find them.
Because the point will come, probably around 8K, when the horsepower to run 144Hz will simply be there, and no amount of rebranding can stop that.
 
I'm genuinely curious whether dedicated cards will even be on the market by the end of this decade.
We're reaching a point where resolution can only get so refined, refresh rates can only matter so much, and shit like ray tracing is almost purely gimmicky puddle reflections and more detailed shadowing.

Yes, technology explodes (Moore's Law and all that), but graphically, to the extent that it matters, there is a finite point that GPU makers know is out there.
3D wasn't the next leap. VR, maybe.

It just makes me think that Nvidia and AMD had better be enjoying this time when their products are so hot that you can't even find them.
Because the point will come, probably around 8K, when the horsepower to run 144Hz will simply be there, and no amount of rebranding can stop that.
Feels.

But I said the same sort of things fifteen years ago when I was younger and dumber.

Marketers will find a way to sell anything to younger and dumber people. Right now they call it ray tracing. Before that it was 4K. VR. 60fps. 1080p. Data streaming. You remember bump mapping? Kasumi's nipples never looked so good. RAM disks, guys! Who wants onboard video!? Pentium Four ohmahgaaaaaaaaaad.

I had an old computer in the early 90s that had a switch on it that would increase the voltage to the CPU. It let me play Raptor really fast. Marketers will find a way.

I am more worried about crypto continuing to ruin everything it touches.
 
So I found one reason why they were so scarce...they just found 500,000 3080s in a shipping container because the paperwork was fucked up.


Jesus Christ. Well, that should solve some supply issues.

I don't really understand the business model behind mining with GPUs. I took a look around the mining subs on Reddit, and it seems like people will buy up like 50 GPUs to make a rig that will spit out $1,000/month of crypto profit, more or less. But you have to pay like $20,000+ just to get set up with all that hardware, so you're not going to turn an overall profit for a couple of years. And that's ignoring the constantly increasing hashrate requirements. Do people actually make an appreciable amount of money doing this? What am I missing?
Because these retards watch "I MADE $495838295 OFF CRYPTO WITH ONE GPU" YouTube videos and then they get fucking retarded. They believe everything they see about crypto. The good thing is eventually it will become too unprofitable to farm with GPUs.
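And the napkin math bears that out. Taking the numbers from the post above (~$20,000 of hardware, ~$1,000/month gross) and assuming, purely for illustration, $200/month in electricity and payouts decaying 5% per month as network hashrate climbs, the rig never pays itself off:

```python
# Break-even estimate for a GPU mining rig. The electricity cost and the
# 5%/month payout decay are assumed stand-ins, not measured figures.
def months_to_break_even(hardware_cost, monthly_revenue, power_cost, decay=0.05):
    revenue, balance, months = monthly_revenue, -hardware_cost, 0
    while balance < 0:
        net = revenue - power_cost
        if net <= 0:           # rig now loses money every month: it never pays off
            return None
        balance += net
        revenue *= 1 - decay   # difficulty creeps up, payouts shrink
        months += 1
    return months

print(months_to_break_even(20_000, 1_000, 200))            # -> None (never breaks even)
print(months_to_break_even(20_000, 1_000, 0, decay=0.0))   # free power, flat difficulty: 20
```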
If you can play games in July you can play games in October. My point is that there has not been enough time for developers to challenge the new cards, except for the shithole mess that is CP77. So if you can run games on your old card in July, there is no reason to upgrade; you're just wasting your money. And when you do upgrade, the 3000s are simply too massive of an improvement to pass up: instead of buying a used 2080ti, just buy a 3070.

If your GPU is something like a 1080 or up, you can run everything on the market, including CP77, at more than acceptable levels, mostly High/Ultra, so why buy the 3080 when we know for a fact that the 3080ti will be a massive improvement in performance for a disproportionately low markup?
WTF are you smoking? The 3080 is already a mostly complete GA102 die. At best the 3080ti would have 5% more cores, because Nvidia will still segment the 3090 from the 3080ti. "massive improvement" my ass.

And you already answered your own question. If you have something 1080 or up, you can already max out everything on the market. So why wait another 6+ months to see if the mythical Ampere will be in stock, when a card based on Turing already meets your needs and is available for purchase today? Unless you are spending $700+ on a GPU, the performance you would get out of the lower-end 3000 series is already on shelves. Sure, you could wait another year and get it for $100 less, but then you could wait two years and get it for $200 less, or three years and $300 less. Meanwhile you are not playing ANYTHING because you won't buy a card.

There will always be something new on the horizon. A new generation, a price drop, something. Look at all the fools who sold their 2080tis only to go 3+ months with no gaming GPU at all because there is no stock. Wow, great deal, all worth it when you had a perfectly good GPU already.
The specs for the 3080ti have already leaked:


It's going to have 20 gigs of RAM instead of 10.

The thing is, though, if you go wait for this... wait for this... wait for this... you'll be waiting eternally. Not to mention that GPUs seem to be the one product in existence that is eternally unavailable and never sells for list price. Sure, the 3080 Ti is gonna be $999 or $1100, but where are you going to find it at that price? How are you going to beat the bots and miners? How much longer are you going to wait for supply to finally catch up enough to see MSRP?

I mean, I might, MIGHT, wait for the 3080 Ti if it's released in February. It depends. I do NOT want to wait 30 fucking years to buy a piece of hardware that is never sold at its list price.

Sometimes you just want to get it over with instead of playing Groundhog Day for a year.
 
So I found one reason why they were so scarce...they just found 500,000 3080s in a shipping container because the paperwork was fucked up.


Jesus Christ. Well, that should solve some supply issues.
nowaywegotya.jpg nowaywegotyaitneverhappened.jpg
There is a news story gaining a bit of traction claiming that 500,000 GeForce RTX 30 series graphics cards had gone missing during transit, and have now been found in an unlabeled shipping container in South Korea. There is even a supposed quote from Nvidia on the matter, explaining the situation to shareholders. The only problem is, none of it actually happened.

That is a bit of a buzzkill, because suddenly finding half a million missing Ampere GPUs could help alleviate the frustration related to the lack of supply (compared to demand). It's now been more than three months since the first consumer Ampere part released to retail, the GeForce RTX 3080, and it remains hard to find in stock by a first-party seller. Same goes for the cards that have come after it (GeForce RTX 3090, 3070, and 3060 Ti, in that order).

Original reporting on the supposedly missing GPUs traces back to Geeknetic.es, which according to a Google translation of the text, wrote that an employee at a Samsung subsidiary port warehouse in South Korea noticed that stacks of boxes with Nvidia's logo stamped on them were not listed on any of the port's records.

It goes on to state that the employee had himself been trying to buy a GeForce RTX 30 series graphics card to no avail, and seeing the boxes of cards "only added to my frustration of not getting one."

Pictures of the boxes in a shipping container made the article seem convincing, save for the fact that half a million GPUs is a rather large amount of inventory to go missing—I imagine if something like that ever happened, Nvidia would be ultra-diligent in tracking down the lost cards.

After the news started to spread, however, the article's author added an update saying, "This is a fake news story created and published on December 28 in celebration of the Holy Innocents Day in Spain. The content of it is false and has been created with a satirical humorous purpose. We hope you had fun reading it."

Apparently December 28 in Spain is somewhat akin to April Fools' Day, in which it is common for false news stories to be posted, usually with a humorous slant.

That said, if you happen to work at a port warehouse, do us DIY system builders a solid and see if any unmarked shipping containers are housing boxes of Ryzen 5000 series CPUs.
 
I'm genuinely curious whether dedicated cards will even be on the market by the end of this decade.
We're reaching a point where resolution can only get so refined, refresh rates can only matter so much, and shit like ray tracing is almost purely gimmicky puddle reflections and more detailed shadowing.

Yes, technology explodes (Moore's Law and all that), but graphically, to the extent that it matters, there is a finite point that GPU makers know is out there.
3D wasn't the next leap. VR, maybe.

It just makes me think that Nvidia and AMD had better be enjoying this time when their products are so hot that you can't even find them.
Because the point will come, probably around 8K, when the horsepower to run 144Hz will simply be there, and no amount of rebranding can stop that.
You're starting to sound like Tim Sweeney in 1999. I like it. To be fair to him, he sort of had a caveman notion of what we now know as SoCs or APUs, where a discrete graphics accelerator wouldn't be needed, but he phrased it as the CPU becoming powerful enough to do hardware-accelerated 3D. Many years later (2-3 years, time was different in the past) Ken Kutaragi had similar ideas with CELL...
Feels.

But I said the same sort of things fifteen years ago when I was younger and dumber.

Marketers will find a way to sell anything to younger and dumber people. Right now they call it ray tracing. Before that it was 4K. VR. 60fps. 1080p. Data streaming. You remember bump mapping? Kasumi's nipples never looked so good. RAM disks, guys! Who wants onboard video!? Pentium Four ohmahgaaaaaaaaaad.

I had an old computer in the early 90s that had a switch on it that would increase the voltage to the CPU. It let me play Raptor really fast. Marketers will find a way.

I am more worried about crypto continuing to ruin everything it touches.
It's all about adoption of technology; only then can it be used for something. You mention bump mapping, which finally got popular with the GeForce3, but Nvidia's Dot3 bump mapping had been part of the fixed-function pipeline since the GeForce 256. No one used it even though it was pretty good, just like the really nice EMBM on the G400.
In 1999, which one would make you go 'ooh'?
g400nobump.jpg g400embp.jpg
Now it looks like absolute shit, which is the eventual fate of any graphical trickery.

It's better now. First-gen T&L was useless but led to programmable vertex shaders, and fixed-function trickery opened the door for programmable pixel shaders, even if the shiny water in Morrowind was the most you got out of them before the GF3 was antiquated.

The first gen of anything new in GPUs will always be useless, and if it's vendor-specific it will be at least five times as useless (what was that ATi feature that smoothed models based on normals? Valve put it in Half-Life and CS, I think). That's where market adoption comes in, and it's always useful to look at the entry-level feature set and parity between GPU manufacturers. Ray tracing didn't go anywhere while only Nvidia supported it; now AMD supports it as well, as do Sony and Microsoft.
 
So is upgrading from a GTX 970 to an RTX 3070 and from an i5-4460 to a Ryzen 5600X worth it? Not talking about scalped prices, just regular launch prices.

I got a relatively new 32in 1440p/144Hz monitor and would like to utilize it properly.
 
  • Feels
Reactions: Allakazam223
So is upgrading from a GTX 970 to an RTX 3070 and from an i5-4460 to a Ryzen 5600X worth it? Not talking about scalped prices, just regular launch prices.

I got a relatively new 32in 1440p/144Hz monitor and would like to utilize it properly.
Yeah, you're going to hurt at 1440p, never mind 144 FPS.
 
View attachment 1816697 View attachment 1816702
There is a news story gaining a bit of traction claiming that 500,000 GeForce RTX 30 series graphics cards had gone missing during transit, and have now been found in an unlabeled shipping container in South Korea. There is even a supposed quote from Nvidia on the matter, explaining the situation to shareholders. The only problem is, none of it actually happened.

That is a bit of a buzzkill, because suddenly finding half a million missing Ampere GPUs could help alleviate the frustration related to the lack of supply (compared to demand). It's now been more than three months since the first consumer Ampere part released to retail, the GeForce RTX 3080, and it remains hard to find in stock by a first-party seller. Same goes for the cards that have come after it (GeForce RTX 3090, 3070, and 3060 Ti, in that order).

Original reporting on the supposedly missing GPUs traces back to Geeknetic.es, which according to a Google translation of the text, wrote that an employee at a Samsung subsidiary port warehouse in South Korea noticed that stacks of boxes with Nvidia's logo stamped on them were not listed on any of the port's records.

It goes on to state that the employee had himself been trying to buy a GeForce RTX 30 series graphics card to no avail, and seeing the boxes of cards "only added to my frustration of not getting one."

Pictures of the boxes in a shipping container made the article seem convincing, save for the fact that half a million GPUs is a rather large amount of inventory to go missing—I imagine if something like that ever happened, Nvidia would be ultra-diligent in tracking down the lost cards.

After the news started to spread, however, the article's author added an update saying, "This is a fake news story created and published on December 28 in celebration of the Holy Innocents Day in Spain. The content of it is false and has been created with a satirical humorous purpose. We hope you had fun reading it."

Apparently December 28 in Spain is somewhat akin to April Fools' Day, in which it is common for false news stories to be posted, usually with a humorous slant.

That said, if you happen to work at a port warehouse, do us DIY system builders a solid and see if any unmarked shipping containers are housing boxes of Ryzen 5000 series CPUs.
Bullshit. That is exactly what they would say so they don't have to stop price gouging.
 
So is upgrading from a GTX 970 to an RTX 3070 and from an i5-4460 to a Ryzen 5600X worth it? Not talking about scalped prices, just regular launch prices.

I got a relatively new 32in 1440p/144Hz monitor and would like to utilize it properly.
I would hold off on the GPU - like you have a choice right now anyway - since a 12 GB RTX 3060 has been announced.
As a current GTX 970 owner, you really don't want to get screwed on memory for a second time.
 
So more 3080 Ti stuff leaked. It's going to come out in February, after Chinese New Year, and will likely be announced January 12th, around when more stock of 3080s should be arriving. So it looks like I'll just wait and use EVGA's queue to reserve one. Though there's absolutely no reason to think this card will launch any better than the 3080 did. But now that the Christmas shopping season - which, along with miners and scalpers, fucking murdered hardware availability - is over, acting fast should work out mildly better. Mildly.

I wouldn't be shocked if NVIDIA sold most of the 3080 stock to miners in order to push people toward the more expensive (and actually available) 3080 Ti. So rather than leave the margin to scalpers, NVIDIA purposely sold off most of its stock to create a false shortage and make consumers buy the more expensive 3080 Ti. Miners likely aren't going to buy the 3080 Ti when they've already bought up the 3080s and 3090s. But miners are fucking retarded, so who knows.
 