GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Nvidia have really played themselves with the 3090. It's really not a gaming card: it's a card for ML and rendering. Why else would it need 24GB of VRAM whilst having so little gaming performance gain over the 3080? It really should be marketed as something for prosumer content creators, where it actually looks very promising, but no: their marketing team had to bill it as the ultimate in gaming. And now it's getting beaten in some benchmarks by the 6800 XT, and the 6900 XT, when it comes out, is going to batter it. And in either case, price-performance is easily won by Radeon.
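On the price-performance point, here's a quick fps-per-dollar sketch in C. The launch MSRPs are the real ones; the average-fps figures are placeholder assumptions, so swap in whichever benchmark numbers you trust.

```c
/* Rough perf-per-dollar comparison. MSRPs are the actual launch prices;
 * the fps numbers are HYPOTHETICAL placeholders, not benchmark data. */
#include <stdio.h>

struct card {
    const char *name;
    double msrp_usd;   /* launch MSRP (real) */
    double avg_fps;    /* assumed average 4K fps -- replace with real data */
};

int main(void) {
    struct card cards[] = {
        { "RTX 3080",   699.0, 100.0 },
        { "RTX 3090",  1499.0, 115.0 },
        { "RX 6800 XT", 649.0,  98.0 },
        { "RX 6900 XT", 999.0, 108.0 },
    };
    for (size_t i = 0; i < sizeof cards / sizeof cards[0]; i++) {
        printf("%-10s  $%7.0f  %5.1f fps  %.2f fps per $100\n",
               cards[i].name, cards[i].msrp_usd, cards[i].avg_fps,
               100.0 * cards[i].avg_fps / cards[i].msrp_usd);
    }
    return 0;
}
```

Even if you grant the 3090 a generous lead over the 3080, it craters once the $1,499 sticker lands in the denominator.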
 
I... I had one.

There I was at the Old Pigsty, trucking along nicely since 1998 with my Pentium 200 MMX which, despite the memes, was a good CPU right up until the year 2000. Could play Unreal Tournament, Tiberian Sun, all the Quakes, System Shock 2, and Baldur's Gate no problem. Then, we got an upgrade! We trucked out to PC World ('member PC World?) in mid-2001 and got ourselves a lovely new prebuilt from Packard Bell in a rather swish blue case. GeForce 2 GPU, 256 MB RAM, 40 GB hard drive... and a Pentium 4 at 1.5 GHz.

So, bragged about it the next day to school friends. One of them told me he had an Athlon 900. This was a CPU that was a year older and went up against the Pentium III at the time, but he reckoned it was faster. But me, being the brainless proto-consoomer I was, said, "is not! Yours is only 900 MHz while mine's 1,500 MHz! S'there!" So we agreed to go away and benchmark it.

It was ogre. The P4 was humiliated.

Turns out that the Pentium 4 was really, really fast at number crunching, performing loads of mathematical and logical operations, so long as it didn't have to rely upon other things or carry out too many if/then/else-type operations. This is because it had a yuge pipeline, which is how they got it to clock speeds that were dizzying for the time. But if it predicted the wrong branch, it had to stop, flush the pipeline, and start again.
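You can still demo this on any modern CPU. Below is a minimal sketch in C, assuming GCC or Clang: the same counting loop runs twice over the same values, once shuffled and once sorted, so the only thing that changes is how predictable the if is. Compile with something like gcc -O1; heavier optimisation may turn the branch into a branchless conditional move and hide the effect.

```c
/* Branch-misprediction demo: identical work, identical data, but the
 * sorted pass lets the branch predictor lock on while the random pass
 * forces constant pipeline flushes. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)

static int cmp_int(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

static long long count_big(const int *v, int n) {
    long long sum = 0;
    for (int i = 0; i < n; i++)
        if (v[i] >= 128)     /* this branch is what the predictor fights */
            sum += v[i];
    return sum;
}

int main(void) {
    int *v = malloc(N * sizeof *v);
    if (!v) return 1;
    for (int i = 0; i < N; i++) v[i] = rand() % 256;

    clock_t t0 = clock();
    long long a = count_big(v, N);       /* random order: ~50% mispredicts */
    clock_t t1 = clock();

    qsort(v, N, sizeof *v, cmp_int);     /* sorted: branch becomes predictable */
    clock_t t2 = clock();
    long long b = count_big(v, N);
    clock_t t3 = clock();

    printf("unsorted: %.2fs (sum %lld)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, a);
    printf("sorted:   %.2fs (sum %lld)\n", (double)(t3 - t2) / CLOCKS_PER_SEC, b);
    free(v);
    return 0;
}
```

On Willamette's 20-stage pipeline the misprediction penalty was far worse than on anything current, which is exactly why the P4 folded in branchy code.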

Despite this, when I went off to university I acquired a Pentium 4-based laptop (which oddly enough had a desktop-class P4 at 2.8 GHz) and so never learned. Its battery life was shit, and there was a torrent of hot air permanently emerging from the side vents.

But yeah, the Pentium 4 was an architecture that started out poorly, became adequate two years too late with the Northwood core, and then took a hard turn into awfulness again with Prescott and Cedar Mill, which had the added fun bonus of cooking themselves. Yet despite this, there was worse: the Pentium 4-based Celerons, for instance. In 2010 I started my training contract, and the PCs at the firm were all old XP-based boxes running Celeron D processors. Despite the name, these were single-core, gimped versions of the Prescott Pentium 4. They are possibly the worst CPUs I have ever, ever used. Literally having Word, Excel, our case management software, and Firefox open all at once would bog them down to rage-inducing levels, and having more than three Firefox tabs open at once would make them lock up totally for two or three minutes.

EDIT: If you want to experience the agony of the Celeron D, you can buy TWO for literally less than the cost of a pint.
We got a new Pentium 4 PC in 2002, around the time of my 14th birthday. It was a family PC and, looking back, it was slow as fuck: a 1.4 or 1.5 GHz CPU, 512 MB of RAM, and Windows XP. We planned to upgrade the RAM, but it was RDRAM and a prohibitively expensive upgrade at the time. After two years our PC died when the capacitors on the mainboard went bad and burst open; from what I have read, bad capacitors were a problem for early-to-mid-2000s PCs.
 
  • Feels
Reactions: Smaug's Smokey Hole
We got a new Pentium 4 PC in 2002, around the time of my 14th birthday. It was a family PC and, looking back, it was slow as fuck: a 1.4 or 1.5 GHz CPU, 512 MB of RAM, and Windows XP. We planned to upgrade the RAM, but it was RDRAM and a prohibitively expensive upgrade at the time. After two years our PC died when the capacitors on the mainboard went bad and burst open; from what I have read, bad capacitors were a problem for early-to-mid-2000s PCs.

Capacitor plague. Wasn't that the result of a Chinese industrial spy stealing a discarded formula for electrolyte from the bins at Japanese chemical conglomerate Rubycon and then spreading it around to make knockoffs with names like Rulycon?

Nvidia have really played themselves with the 3090. It's really not a gaming card: it's a card for ML and rendering. Why else would it need 24GB of VRAM whilst having so little gaming performance gain over the 3080? It really should be marketed as something for prosumer content creators, where it actually looks very promising, but no: their marketing team had to bill it as the ultimate in gaming. And now it's getting beaten in some benchmarks by the 6800 XT, and the 6900 XT, when it comes out, is going to batter it. And in either case, price-performance is easily won by Radeon.

Why it isn't the Titan RTX II, I have no idea. I suspect they were worried enough by Big Navi and the 6900XT in particular that they thought they'd best make it all GAMING.

Ah yes, the 6900XT. There's another card that we won't be able to buy because it'll be sold out in five minutes to scalpers using pre-paid credit cards at their work address or other ways around the one per customer limit.
 
I remember having a Chaintech board that let me set the CPU speed, multiplier, etc. in the BIOS (as opposed to jumpering); that was the first time I'd ever seen that. It didn't work well. That was a Pentium.

ALi chipsets were in a lot of embedded x86 systems and could often be pushed to the absolute limit. SiS used to be good value for the money, and they also made an x86 SoC, the Vortex86. Cyrix was also pretty good value in general if you needed a cheap upgrade for a system you already had; AMD based their later Geode SoCs on the MediaGX line. Intel chipsets could be surprisingly dodgy sometimes, and expensive, and often they weren't that good value for the price because Intel just loved to nickel-and-dime features (the Intel BX chipset was killer, though).

If you go further back, things get even more interesting. AMD used to be a big electronics parts supplier, programmable gate arrays for example. Western Digital used to make graphics chips, quite decent ones even. This stuff used to be very interesting, and you had a plethora of hardware that all had slightly different features; shopping around and reading up on it was sometimes a science in itself. The chipset could also totally affect your computer's speed at the same specs, sometimes quite dramatically.

Today the selection just isn't that big: you buy the bigger number, you get a faster computer. More reliable, but also kinda more boring. Does VIA even still make CPUs?

Anybody remember Intel's fuckery with the Coppermine P3s, which were actually unstable above 1 GHz? They were also the first to implement software-readable hardware serial numbers, on the P3 in 1999, which caused quite the uproar. (Funny and almost quaint considering what a nightmare that stuff is now.) Intel was always shady AF.
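For the curious, this is roughly how that serial was software-readable. A minimal C sketch, assuming GCC or Clang on x86 for <cpuid.h>: CPUID leaf 1 reports PSN support in EDX bit 18, and leaf 3 returned the middle and low dwords of the serial (the top dword was the processor signature from leaf 1). Anything post-P3, or a P3 with the PSN switched off in the BIOS, will just report the feature absent.

```c
/* Check for the Pentium III's infamous Processor Serial Number (PSN).
 * Feature flag: CPUID leaf 1, EDX bit 18. Serial: CPUID leaf 3,
 * middle/low 32-bit halves in EDX:ECX. Expect "absent" on any modern CPU,
 * since Intel dropped the feature after the backlash. */
#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 unavailable");
        return 1;
    }
    if (!(edx & (1u << 18))) {
        puts("No PSN: feature absent or disabled (expected on anything post-P3)");
        return 0;
    }
    unsigned int sig = eax;                 /* top 32 bits of the serial */
    __get_cpuid(3, &eax, &ebx, &ecx, &edx); /* leaf 3: serial middle/low */
    printf("PSN: %08X-%08X-%08X\n", sig, edx, ecx);
    return 0;
}
```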
Voodoo cards were aptly named: either a game worked or it didn't, back when I had to go into town to use the dialup at the library for troubleshooting.

My brother got a brand-spanking-new Athlon system with a Voodoo3 (I think, it's been a long time now), and when it played nice, it was great. If it didn't want to launch a game, it would let you know.
 
Capacitor plague. Wasn't that the result of a Chinese industrial spy stealing a discarded formula for electrolyte from the bins at Japanese chemical conglomerate Rubycon and then spreading it around to make knockoffs with names like Rulycon?
Someone copied the formula in a very Chinese way, so it wasn't entirely correct. The faulty formula was then copied/stolen yet again in China and ended up in use in Taiwan, and people generally trusted stuff coming out of Taiwan. Abit, previously my favorite IHV because of their BX boards, had a real bad problem with caps going bad, and that significantly hurt them in the eyes of enthusiasts.

Among the features Intel removed from their products was the ability to run two Celerons on the BX chipset. The BX could also use ECC RAM with parity, so it was possible to build a nice little server on the cheap.
Voodoo cards were aptly named: either a game worked or it didn't, back when I had to go into town to use the dialup at the library for troubleshooting.

My brother got a brand-spanking-new Athlon system with a Voodoo3 (I think, it's been a long time now), and when it played nice, it was great. If it didn't want to launch a game, it would let you know.
Buying four computer magazines and comparing the information on a part sucked; the internet really helped out there. I don't really remember how I ordered parts before the internet, now that I think about it. Maybe I sent a letter or called the company in the print ads...

Then a large and well stocked computer store opened up in the basement of an apartment building, cheap rent I guess, they had a no questions asked return policy that I abused primarily with graphics cards. The computer store went bust a couple of years later and I don't really understand how they could have mismanaged it that badly.
 
Fun fact: the P4 connector on motherboards isn't called the P4 because it has 4 pins; it's called the P4 because it's named after the Pentium 4 chips that necessitated its creation.
Nvidia have really played themselves with the 3090. It's really not a gaming card: it's a card for ML and rendering. Why else would it need 24GB of VRAM whilst having so little gaming performance gain over the 3080? It really should be marketed as something for prosumer content creators, where it actually looks very promising, but no: their marketing team had to bill it as the ultimate in gaming. And now it's getting beaten in some benchmarks by the 6800 XT, and the 6900 XT, when it comes out, is going to batter it. And in either case, price-performance is easily won by Radeon.
Watching Nvidia get rekt by AMD warms the damp creases of my heart. Not since the first Evergreen GPUs has AMD had this kind of advantage.
 
47 minutes 'til the AiB 6800 (XT) drops; anyone else really tempted to try and beat the bots? My 2060 Super just ain't enough to push 4K consistently except in old games. BTW, when did War Thunder add DLSS? It makes a big difference: 78°C @ 50 fps (swinging) vs 58°C @ 60 fps (flat).
 
47 minutes 'til the AiB 6800 (XT) drops; anyone else really tempted to try and beat the bots? My 2060 Super just ain't enough to push 4K consistently except in old games. BTW, when did War Thunder add DLSS? It makes a big difference: 78°C @ 50 fps (swinging) vs 58°C @ 60 fps (flat).
I believe they added it with the last big patch back on the 17th (I think)
 
  • Informative
Reactions: Allakazam223
I believe they added it with the last big patch back on the 17th (I think)
I was surprised. I have a 43" screen and I sit pretty far away, so I may miss more than someone closer with a smaller monitor, but I couldn't see any really bad fuzziness or texture popping. Smoother and quieter.

AMD AiB paper airplane launch edition, btw. No xmas GPU for me.
 
I was surprised. I have a 43" screen and I sit pretty far away, so I may miss more than someone closer with a smaller monitor, but I couldn't see any really bad fuzziness or texture popping. Smoother and quieter.

AMD AiB paper airplane launch edition, btw. No xmas GPU for me.
Still waiting on the eventual 6700. I don't know if I can really justify getting a 6800...though I do run 1440p@144.

But house stuff is more important anyways, so that's a good excuse to wait :D
 
  • Feels
Reactions: Allakazam223
I'm planning on getting myself a new card to replace my 750 Ti, preferably a budget option that could maybe do budget VR, like a used 1060 6GB. I'd go with something more powerful, but I'm limited by my 2200G, and also by the fact that I'm getting either a new monitor or a graphics tablet around December. Haven't decided which yet.
 
  • Like
Reactions: Allakazam223
I'm planning on getting myself a new card to replace my 750 Ti, preferably a budget option that could maybe do budget VR, like a used 1060 6GB. I'd go with something more powerful, but I'm limited by my 2200G, and also by the fact that I'm getting either a new monitor or a graphics tablet around December. Haven't decided which yet.
If there is actual AMD/Nvidia stock of the new GPUs, you may be better off JustWaiting™ until you nail down what monitor you're getting; lots of used GPUs Should™ be going up for sale, just don't pay a scalper or any retailer scalping. That being said, I'm already looking at getting $250 less for the GPU I bought in the spring vs what I paid for it, although it seems that prices on the RDNA1/Turing cards haven't changed much.

I think Nvidia die yields are low, which is why 3070s seem to keep getting restocked much faster: damaged parts get lasered off and binned lower. It could also be that the AiBs had to retool lines to match the FE power spec and are starting the lines back up.
AMD's problem is that they are splitting capacity between two new consoles and three new cards, and each of those dies is being split between six(?) AiB partners, which are all splitting those into two or more SKUs.

Maybe the opposite of Nvidia's situation is happening: so many RDNA2 chips are binning at 6900 XT quality that there aren't many that don't exceed 6800 XT specs, and there isn't any point in fusing off higher-quality silicon just to meet demand on the lower SKU. :optimistic:
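To put rough numbers on the yield/binning guesswork, here's a back-of-envelope sketch in C using the classic Poisson yield model, Y = exp(-A*D). The die areas are approximately the published figures (GA102 ~628 mm², Navi 21 ~520 mm²); the defect densities are pure assumptions for illustration, not real fab data. The point: on big dies, only a small fraction comes out defect-free for the top SKU, which is why salvage bins like the 3080 and 6800 exist at all.

```c
/* Back-of-envelope die yield, Poisson model: Y = exp(-A * D),
 * A = die area in cm^2, D = defect density in defects/cm^2.
 * Die areas are roughly the published figures; the defect densities
 * below are ASSUMED values for illustration only. Build: gcc yield.c -lm */
#include <stdio.h>
#include <math.h>

static double poisson_yield(double area_mm2, double defects_per_cm2) {
    /* fraction of dies with zero defects, i.e. candidates for the full SKU */
    return exp(-(area_mm2 / 100.0) * defects_per_cm2);
}

int main(void) {
    const double ga102_mm2  = 628.0;  /* RTX 3080/3090 die */
    const double navi21_mm2 = 520.0;  /* RX 6800/6800 XT/6900 XT die */
    const double densities[] = { 0.1, 0.3, 0.5 };  /* assumed defects/cm^2 */

    for (int i = 0; i < 3; i++) {
        double d = densities[i];
        printf("D=%.1f/cm^2: GA102 %4.1f%% defect-free, Navi 21 %4.1f%%\n",
               d, 100.0 * poisson_yield(ga102_mm2, d),
               100.0 * poisson_yield(navi21_mm2, d));
    }
    return 0;
}
```

Even at an optimistic 0.1 defects/cm², barely half of GA102 dies come out clean, so the cut-down bins carry the volume.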
 
  • Feels
Reactions: Smaug's Smokey Hole
Right now absolutely sucks for building a PC. I need a new one for mid-December (hard requirement), and it's looking like a 3600XT + 5700XT. UK availability on new Ryzen and Nvidia parts is fucked. The 6800 XT and 6800 may as well not exist!

Anyone got any ideas? I don't care about the platform; it just feels like I'm getting bad value for money!
 
  • Like
Reactions: Allakazam223
Right now absolutely sucks for building a PC. I need a new one for mid-December (hard requirement), and it's looking like a 3600XT + 5700XT. UK availability on new Ryzen and Nvidia parts is fucked. The 6800 XT and 6800 may as well not exist!

Anyone got any ideas? I don't care about the platform; it just feels like I'm getting bad value for money!
How much power do you need for your hard requirement? If you don't need a ton of power, you can pick up a business desktop that can host an expansion GPU for dirt cheap. You could put together, say, a Dell OptiPlex with a quad-core i7 and a 480 for $250 and save your money for when the modern parts are actually available.

If you need more powerful modern stuff right away, honestly I'd say pick up a good X570 motherboard and a Ryzen 3100. If you want six cores, just get a 4600 or the absolute cheapest six-core Ryzen 3000-series chip; there is literally no point to the XTs, they are no faster.

For graphics power, you can pick up a Vega 64 for $230-250 off of eBay, or a Vega 56 for $180, which can play at 1440p. I have one, absolutely love it, and will be sticking with it unless someone finally makes a 1600p monitor that's affordable or I find a 1440p monitor that can match my Asus ProArt in color reproduction.
 
Why it isn't the Titan RTX II, I have no idea. I suspect they were worried enough by Big Navi and the 6900XT in particular that they thought they'd best make it all GAMING.
It's not called a Titan RTX II because of its insane power usage: people tend to buy Titan cards for workstation performance that sips power at the same time. I believe the Ampere cards in general just guzzle power and have the worst performance per watt since the Fermi cards.
 
I think Nvidia die yields are low, which is why 3070s seem to keep getting restocked much faster: damaged parts get lasered off and binned lower. It could also be that the AiBs had to retool lines to match the FE power spec and are starting the lines back up.

This would explain quite a lot. But then again, here in Bongland you can order an RTX 3090 right now from Scan, CCL, or Novatech with next-day delivery. They have free stock of those ones. Problem is, there is absolutely no point in paying £1,700 for such a card for gaming when it's only 15-20 percent faster than the 3080. And they're still getting deliveries.

Sounds to me like Nvidia overestimated how many consoomers would be shameless enough to pay that much for a GPU.

Meanwhile, another online store states that there is no stock available of the AiB Radeon 6800s on launch day. I didn't think it was possible to produce thinner vaporware than the 3080, but somehow it has been done.

It's not called a Titan RTX II because of its insane power usage: people tend to buy Titan cards for workstation performance that sips power at the same time. I believe the Ampere cards in general just guzzle power and have the worst performance per watt since the Fermi cards.

So it's a failed Titan then.

I have to say that, by the sounds of it, AMD had an open goal here. Their competition put out vaporware that ate power, and they still didn't capitalise on it. They should have put the launch back a few weeks and stockpiled like gangbusters.
 
  • Agree
Reactions: Allakazam223
So it's a failed Titan then.
I wouldn't even call it that; Nvidia knew what they were doing with the marketing, getting people to call it a Titan-class card. But I will say this: I doubt we will get a Titan from Ampere; we're going to see them go crawling back to TSMC at this rate.
 
  • Like
Reactions: Allakazam223
Right now absolutely sucks for building a PC. I need a new one for mid-December (hard requirement), and it's looking like a 3600XT + 5700XT. UK availability on new Ryzen and Nvidia parts is fucked. The 6800 XT and 6800 may as well not exist!

Anyone got any ideas? I don't care about the platform; it just feels like I'm getting bad value for money!
If I were getting anything right now, which I wouldn't because you're right about everything sucking, Intel has shockingly become the good buy. They've lowered the 9900K to $320 on Newegg.

GPUs are absolute trash, though. All you can do is search for used stuff and hope. Really you just have to wait until Nvidia/AMD stop their silly shenanigans with the new cards so everything else can finally drop. I guess get one of the ol' reliable AMD cards like an RX 570/580 for $100-120, or a Vega, if you need something now.
 
If I were getting anything right now, which I wouldn't because you're right about everything sucking, Intel has shockingly become the good buy. They've lowered the 9900K to $320 on Newegg.
I still wouldn't recommend it, though, even in the US; you can get a 3700X for around 280 to 300 quid and couple it with a B450 board, which is a better purchase, and you can save the money for a better vidya card.
 
  • Agree
Reactions: Allakazam223