GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

A small rant: I'm irritated when people brag to me about their PC builds telling me how difficult it was to put it together. Like physically turning a dozen screws and clicking in a few cards. For the last 20+ years it's been like Legos. Now, if you tell me about your homebrew Z80 bare metal build where you hand-etched the backplane and designed your own display program, that's impressive and I could listen to you for hours.
 
The only time I hear people have a difficult time is when they do something stupid like cramming custom loops into cases that really weren't meant for it, chasing the SFF fad.
 
When people make building a PC sound difficult, I usually tell them that I assembled my most recent build at 2 AM while drunk.
 
Anyone who thinks building a PC is difficult obviously never built one back in the days before PCI-E, SATA, ZIF sockets, modular PSUs, ATX, metal standoff screws, auto-detection of frequency multipliers and bus speed, and cases with nicely machined edges that won't slice your finger open, back when chipset compatibility issues were a given. Not that it was all that difficult back then either; there were just a lot more things to keep track of and more that could go wrong.
 
My first computer had expansion slots where you could plug in cards the wrong way and that'd cause things to go up in smoke. No keying, just a 100-pin slot. Everything could fit either way.

With earlier LIF sockets, if you didn't have the right tools you had to push the CPU down with full force and then some, which wasn't great for the chip. (That's why you were an absolute troglodyte for not using the right tool, which was big, expensive, heavy, and looked a bit like a press.) It was nearly impossible to remove the chip again without the proper tools and not damage the chip, the socket, or both. Intel had a small field tool that resembled a metal comb, and I still have a few of them; they were okay but could be hit-or-miss if you got unlucky. Modern sockets are absolute engineering marvels, in more ways than just convenience, and I don't think many people can really appreciate that. I cannot stress it enough.

A lot of these computers also didn't have well-protected ports, so unplugging anything while the computer was running had a non-zero chance of damaging whatever chip was connected to the port. Add to that that they usually cost several times what even a high-end PC costs nowadays, and you were careful. Or not.

The shitty cases were almost exclusively an artifact of the "cheap PC clone" world, though, where bleeding-edge technology often met very low production values. (Why invest in quality if the part in question would be outdated into uselessness six months later anyway?) Mac, IBM and co. had some banger cases, some of them with completely screwless designs (yes, screwless, not even thumb screws) where you could take the entire computer apart and neatly arrange the parts next to each other in ten minutes with just your hands, if you knew the procedure. Beautiful stuff; I never saw anything quite like it again. I think they assume people would just be too dumb for it nowadays. They'd probably be right. The average case today is okay, I guess. For the price. Not great, not terrible. I haven't seen anything that impressed me in a long time, though. Also, a lot of stuff just looks embarrassing and I wouldn't want it in my living space.

On top of that there was a huge selection of different manufacturers (did you know Western Digital used to make graphics chips? They were even quite decent), all with their own quirks, plus some stuff that just never worked quite right, and it was really a science in itself. Nowadays it really isn't. I'm pretty sure most graphics card and mainboard companies copy+c, copy+v the reference designs of whatever chipset they use and put their own marketing/cost-saving spin on it (and probably break shit in the process, which makes the engineers of the original stuff dead inside). Don't get me wrong, they did that back then too, but the selection nowadays isn't really that big anyway, so it's pretty much all the same stuff in some areas.
 
The problem with those miner-preowned cards is that a lot of them are run into the ground and/or have custom miner BIOSes on them. To me, they are pretty much worthless once a miner gets his grubbies on them.
Same for me as others posted: I've bought around 20 used cards from miners over the years and never had any problems. YMMV, of course. But miners probably take better care of GPUs than regular gamers do, what with better temps, undervolting, and generally more careful handling.
The complaints are mostly the same as with the 6500 XT. It has design compromises from being a laptop-focused die, chiefly the 4x PCIe lanes, which hurt performance at PCIe 3.0, and the garbage video decode/encode compared to the other RDNA 2 GPUs. Then it has less performance than the 6500 XT, with 25% of the die cut. All this for $160 or higher.
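To put rough numbers on the lane complaint, here's a back-of-the-envelope sketch. It's just a hypothetical calculation using the standard per-lane PCIe rates, not anything measured from the actual card:

```python
# Rough PCIe link-bandwidth math for an x4 card like the RX 6400.
# Per-lane rates are the standard PCIe 3.0/4.0 figures (128b/130b encoding).
GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0}  # gigatransfers per second, per lane
ENCODING = 128 / 130                      # 128b/130b line-code overhead

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe link."""
    return GT_PER_LANE[gen] * ENCODING / 8 * lanes  # bits -> bytes

for gen, lanes in [("4.0", 4), ("3.0", 4), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth_gbs(gen, lanes):.2f} GB/s")
# PCIe 4.0 x4:  ~7.88 GB/s  (what the die was designed around)
# PCIe 3.0 x4:  ~3.94 GB/s  (half the bandwidth on an older board)
# PCIe 3.0 x16: ~15.75 GB/s (what a normal x16 card gets on that same board)
```

Halving the link bandwidth matters most once the 4 GB of VRAM overflows and assets start streaming over the bus, which is exactly the situation a card this small ends up in.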

Folx should have realistic expectations for a small form factor GPU under 75W. But Intel could come in and do it better with their low-end Alchemist GPUs.
People hate the $160 price on the 6400, which I understand, but single-slot, low-power cards come with a premium. Good luck finding new low-profile 1650s for under $200, so it makes sense. The 6500 XT is just a crappy, misleading product, though.
PC building isn't difficult, but I imagine it's stressful and overwhelming if you're new at it and especially don't have any spare parts. I've saved lots of frustration at least a handful of times just by having other PCs I can test with.
 
Fuckin' preach! Shit used to be so different back in the day.
 
You could have just bought a $400 card now and a new $400 card in 2-3 years and saved $400.
I don't know why people insist on "future-proofing" in a market where the high-end becomes midrange in no time.
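To spell out the arithmetic (made-up round numbers, assuming a $1,200 flagship as the alternative):

```python
# Toy upgrade-economics math: two staggered midrange cards vs. one flagship.
# All prices are illustrative round numbers, not real market quotes.
flagship_now   = 1200   # assumed flagship price today
midrange_now   = 400
midrange_later = 400    # assumes midrange pricing stays roughly flat

staggered_total = midrange_now + midrange_later
print(f"Staggered: ${staggered_total}, saving ${flagship_now - staggered_total}")
# -> Staggered: $800, saving $400
# Bonus: the card you buy in 2-3 years will likely match or beat
# the flagship you skipped, since high-end becomes midrange fast.
```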
Everyone else effectively summed it up, but for me specifically it was a combination of the market and my current build. For some perspective, I went to a 3080 from a 1060; I skipped the 2000 series (because it was shit compared to the 1000) with the intention of getting a 3080, and then Covid and crypto fucked everything up, putting that plan off almost two years. Given how ridiculous the market has gotten and the near-guarantee that the 4000 series will play out the same, I jumped the moment I could grab something in my price range with the power I was looking for.

While it's true that, say, the RTX 4080 could be significantly better than the 3080, what game could fully leverage it at time of purchase this year or next, and how long would I still be waiting to upgrade after the inevitable paper launch? Hell, I'm still rocking 1080p monitors because the resolution is good enough for me for the moment; once I'm ready to shift over to 4K, I've got a card that will fulfill my needs until games and software bottleneck either my CPU or GPU again and necessitate another upgrade. Fully agreed shit is overpriced, but the market is what it is, Nvidia and AMD have smelled blood, and hoping for improvement only gets you so far. I'd rather get enjoyment and satisfaction now than bank on a return to normal in a world seemingly set on going crazy.
 
Got a 5700 XT at MSRP (about $400), and I haven't found a game that it doesn't murder at 1440p. That said, I feel that even at MSRP, midrange graphics cards are not a good purchase.

One big issue with gamers is that sometimes we spend way too much on the specs and way too little on the outputs. If you have a $400 graphics card, please don't buy a $150 monitor.

Read the specs and maximum output of the display you plan to purchase, and buy a graphics card with a similar or lower output (in resolution and refresh rate). Also, if you prefer single-player games, avoid TN monitors; IPS monitors generally have a more accurate and vibrant color palette (although worse refresh rates). Another thing is that "gaming" monitors are mostly a scam, so when reading the specs of said monitors, compare them to business or regularly marketed monitors in the same and lower price ranges; sometimes an extra 50 Hz is not worth a $150 premium.
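If you want a crude way to sanity-check the pairing, compare raw pixel throughput (resolution times refresh rate). This is a hypothetical rule of thumb, not a benchmark:

```python
# Crude pixel-throughput comparison for pairing a GPU with a display.
# Just resolution x refresh rate; ignores settings, engine, everything else.
def megapixels_per_second(width: int, height: int, hz: int) -> float:
    return width * height * hz / 1e6

displays = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1080p @ 144 Hz": (1920, 1080, 144),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz":     (3840, 2160, 60),
}
for name, (w, h, hz) in displays.items():
    print(f"{name}: {megapixels_per_second(w, h, hz):.0f} MP/s")
# 1080p@60 needs ~124 MP/s; 4K@60 needs ~498 MP/s. A $400 GPU driving a
# $150 1080p/60 office panel leaves most of its performance unused.
```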

Finally, the most important thing is to be honest with yourself: if you don't need it, DON'T buy it. Entertainment is not worth your financial stability, even in the short term.
So you wouldn't be happy with my blurry VGA monitor?
 
Other manufacturers with beautiful, high-quality cases were SGI, NeXT... both made cases that a home (or small shop) PC builder would never, ever get their hands on to install a PC Chips motherboard into. Corsair wasn't around 30 years ago, so generic tin-can cases that sliced your fingers weren't uncommon. Cases often came with a power supply as well, because that part was about as generic as the cable that went into it.

The earliest company I can remember making "wow"-looking chassis for the enthusiast space was Lian Li, some 20 years ago.
This one was super nice looking in 2001, and looking at it now it is so ugly.
[Attached image: lianlipc50500.jpg]
 
It seems that AMD really likes to shoot themselves in the foot.

First they completely ignore the budget market with the release of the 3000 series.
Then Intel tears AMD a new one with the release of Alder Lake, where the i3-12100 and the i5-12400 become two of the best choices in the budget market.
Then AMD, feeling the fire under their asses, haphazardly releases sub-par "budget" CPUs for the AM4 platform, which become a laughing stock.
And now AMD is dropping AM4 in favor of AM5, which is DDR5-only, meaning it will be automatically disqualified from any budget setup.

What the fuck are you doing AMD?
 
I'm surprised at how good the low-end desktop i3s are. I was shopping around to upgrade a computer, and the APUs suck compared to the i3. We finally have a good 4c/8t i3 under $100.
 
AMD is dropping AM4 in favor of AM5 which is DDR5 only
That's typically how this goes. High end and mid range product lines get updated first because there's a higher demand for those at launch from enthusiasts and professionals. Grandma isn't waiting in line outside a Best Buy for the new Athlon chips so that her email machine can run 5% faster.
 
I don't get the surprise over AM5. Is there something I missed? AM4 has had support for how long now?

I think it's okay that we finally move to a new socket. The backwards compatibility is really starting to stretch the BIOS flash capacity on older boards.
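For anyone wondering what "stretching the BIOS flash" looks like in practice, here's a hypothetical budget for a 16 MiB SPI chip. Every size below is an illustrative guess, since real AGESA blob sizes aren't public:

```python
# Hypothetical SPI-flash budget for an AM4 board with a 16 MiB BIOS chip.
# All sizes are illustrative guesses, not real firmware figures.
FLASH_MIB = 16.0

components_mib = {
    "UEFI core, drivers, setup GUI":    7.0,
    "AGESA/microcode for Zen 1 + Zen+": 2.5,
    "AGESA/microcode for Zen 2":        2.0,
    "AGESA/microcode for Zen 3":        2.0,
    "video GOP, network stack, misc":   2.0,
}
used = sum(components_mib.values())
print(f"Used {used:.1f} of {FLASH_MIB:.0f} MiB; {FLASH_MIB - used:.1f} MiB free")
# -> Used 15.5 of 16 MiB; 0.5 MiB free
# Cramming 4+ CPU generations onto one chip is why some vendors shipped
# cut-down UEFIs, or dropped older CPU support, to make Zen 3 fit.
```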
 
The issue is not that AMD is moving to a new socket, but that they won't offer any DDR4 boards with the new socket. Given the current prices of DDR5 modules, they are basically locking people on a limited budget out of the new platform, and it's an area where Intel may pick up a lot of clientele thanks to AMD's irresponsibility with these recent launches.
 
What the fuck they're doing is making chips in Taiwan. Hence they're extremely supply-constrained right now, much more so than Intel, so they're focusing on the high-margin server-class EPYC and workstation-class Threadripper PRO, where every single unit they manage to make is money grabbed directly from the clutches of Intel, which hasn't had a performance-competitive product in this space since 2019. They're also selling 7003X-series CPUs to cloud services as fast as they can make them, and those are chips with a $10k price tag. Threadripper PRO has grabbed a 50% share of the workstation market. That's a $5,000 CPU.

Budget CPUs are low margin, so you need high volume to make money. Budget CPU customers don't care about heat or flops; they mostly just want the cheapest thing. AMD can't beat Intel in a price war on low-end chips; they just can't make enough of them. So it makes sense that they're picking and choosing their battles.
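A toy version of the wafer math makes the point. Every number here is invented for illustration; real die counts, yields, and prices vary wildly:

```python
# Toy wafer-economics sketch: why a supply-constrained chipmaker
# prioritizes big-ticket parts. Every number is an illustrative guess.
WAFER_COST_USD = 17_000  # assumed cost of one leading-edge wafer

products = {
    # name: (sellable CPUs per wafer, revenue per CPU in USD)
    "EPYC-class server CPU": (60, 5_000),
    "budget desktop CPU":    (600,  100),
}
for name, (chips, price) in products.items():
    print(f"{name}: ~${chips * price:,} revenue per ${WAFER_COST_USD:,} wafer")
# EPYC-class: ~$300,000 per wafer; budget CPU: ~$60,000 per wafer.
# With a fixed wafer allocation, every wafer spent on budget chips
# forgoes several times the revenue -- hence picking their battles.
```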
 
As much as I'm not a fan of Intel, I've had several gizmos with their low-end SoCs recently, and they're something else. Contrary to popular belief (among the 14-year-old "gamers" on reddit), they're actually pretty cool chips, right in that sweet spot of low power consumption and just enough performance to be useful for everything that isn't recent videogames. It's genuinely a pity they usually get stuck in very limited, shabby devices where the manufacturer shaved every cent. The higher-end devices with the better batteries, screens, dual-channel memory, etc. usually get a more powerful SoC that runs hotter and drains the battery faster. I understand why there's no market for an "entry-level" SoC in a mobile device with decent (read: more expensive) peripherals, but I'd like to see it at least once, because these chips rarely get to live up to their full potential. But I have a soft spot for entry-level stuff anyway. There's no art in getting the most out of a Zen $X or i7-some_lake.

I'd also be interested to see some of these low-end Intel devices on a smaller process.
 