GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Tech vtubers only care about the big numbers. I personally like APUs, because making cheap, functional small form factor computers is a niche that's barely touched; almost no one takes mini-ITX seriously.
I like to balance out the high-end hardware reviews ("hm, only 105fps at 1080p, this is the greatest 230-dollar disappointment of my life") with the YT channel Budget Builds. When building an ultra-cheap computer with parts from eBay, he's made a point of actually editing the video on that computer to get a feel for what it's actually like to use, not just to benchmark. He also stumbled upon some hush-hush YouTube censoring.
 
iirc that was the plan, that's why the first Arcs were stealth-launched in China without much fanfare, but alas the tech youtubers found them and started testing immediately
I mean, with the 'driver issues' you also have to bear in mind that Chinamen are not necessarily playing the same games as everyone else. I don't believe CS:GO and its DirectX 3.0 or whatever engine is a big thing over there. So using the shitty Microsoft translation layer isn't necessarily an actual issue.
 
There used to be a time when iGPUs couldn't reliably do even basic things like video decoding (I used to have an Atom that had a dedicated Broadcom chip for video decoding. Hot stuff! Also didn't work in Linux! Well, of course it didn't. Those were the dreaded 2.x kernel times.) or rendering two open windows at once without stuttering. Now that they're pretty much good for anything except higher-end gaming (and have been for many years), it makes no sense from a customer standpoint to buy a low-end card. Why would you pay a few hundred bucks for something that's only possibly, maybe, somewhat better than what's already in your computer? And if you're building a small, non-high-end gaming system, then for the, let's be incredibly optimistic, 100-200 bucks a low-end card would cost you, you could probably already get a higher-tier SoC than the one you planned on, or more RAM, or whatever. I also think it'll become incredibly rare to be able to buy any SoC (and that's what modern CPUs basically are) without integrated graphics, besides maybe the high-end server tier, and those either don't need a GPU to begin with or run half a dozen compute cards or something (which they won't use for video output). If there is to be a high-end enthusiast-but-not-quite-server tier that skips the iGPU to fit more or faster cores or whatever, the people who buy those would probably also not combine them with a low-end GPU.

Then it's actually OK for the high-end GPUs not to even try to save power anymore, because the usual composited rendering and video playback in the browser can be done on the power-efficient iGPU (which no low-end GPU could ever beat in power efficiency while still being worth existing, that's just simple physics) while the dGPU does the heavy lifting. I have such a setup and my dGPU is off most of the time.

So, who's the market for low-end cards? The only user group you could market them to would be "people with old computers who also can't afford a better GPU" but that's uh, not a good group to target and even they can probably get better alternatives on the second hand market. There's just no customer for these anymore. These low-end cards will probably go the way of the dodo, like sound cards did, and for similar reasons. Just like with sound cards, there will always be a novelty one for niche cases, made by somebody to connect a gazillion screens or something (e.g. Matrox), but they won't hold relevance for the wider market.
 
I mean, with the 'driver issues' you also have to bear in mind that Chinamen are not necessarily playing the same games as everyone else. I don't believe CS:GO and its DirectX 3.0 or whatever engine is a big thing over there. So using the shitty Microsoft translation layer isn't necessarily an actual issue.
I think Intel's take on that was to use the Chinese as beta testers, because they rarely play AAA games. There's a reason the first batch of Arcs couldn't be bought separately; you needed to buy a prebuilt that came with one, and I read somewhere that they were doing the classic "this PC runs League/Fortnite/Overwatch for cheap" as a selling point
 
So, who's the market for low-end cards? The only user group you could market them to would be "people with old computers who also can't afford a better GPU" but that's uh, not a good group to target and even they can probably get better alternatives on the second hand market. There's just no customer for these anymore.
The main use is going to be either small form factor builds, or second hand office computers converted into gaming PCs. I've been tempted to buy a cheap £100-£200 used PC and put a 6400 in it, but never had a practical use for doing so.

Assuming the price is right, a single-slot GPU with the power of a 1060 or other currently popular cards could be popular as a cheap upgrade. Better than a Switch/Steam Deck, but not so expensive that normies who want a "gaming PC" will run away from it. I don't think that will happen, but I can see a use for it in the immediate future if the costs can be kept low, which I doubt will happen these days.



On a completely different topic, I've been considering building a NAS or a home media server (not sure the exact difference) for years now but could never justify the expense. Recently, my family's increasing frustration with TV and streaming services has got me thinking about the idea again. As if on cue, YouTube has been dropping a bunch of sponsored videos about doing exactly that, so I don't know if this is a common thing right now or a coincidence.
 
I have such a setup and my dGPU is off most of the time.
How, PRIME? I wasn't aware it was available on desktops.

On a completely different topic, I've been considering building a NAS or a home media server (not sure the exact difference) for years now but could never justify the expense.
NAS is good. It became a thing when covid first hit. Building your own is cheaper, and if you have cheap electricity, it's fine to use old parts.
 
How, PRIME? I wasn't aware it was available on desktops.
I wasn't aware it was in any usable state on Linux, but yeah, it works fine. It's more of a "recent hardware" thing and less a specific extra-technology thing. It's very useful if you want to build a small system like I did, because you still get to play with a proper graphics card, but if you use it rarely you don't have to live with the downsides a proper graphics card causes in a small case during normal desktop operation (heat, fan noise, power consumption, etc.). At least with my hardware, the GPU in D3hot is pretty much off, and I can't really measure a meaningful difference between the card being present and not present.
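If anyone wants to check the same thing on their box, here's a rough Python sketch of how you could read the kernel's runtime power management status for the card over sysfs. Assumes Linux with PCIe runtime PM enabled; the PCI address below is just a placeholder, grab yours from lspci.

```python
# Minimal sketch: read the runtime PM status of a discrete GPU on Linux.
# The PCI address is hypothetical -- replace it with the one lspci shows.
from pathlib import Path

GPU_PCI_ADDR = "0000:01:00.0"  # placeholder address of the dGPU

def dgpu_power_state(pci_addr: str = GPU_PCI_ADDR) -> str:
    """Return the kernel's runtime PM status for the device:
    'suspended' when the card is powered down, 'active' when in use."""
    status_file = Path(f"/sys/bus/pci/devices/{pci_addr}/power/runtime_status")
    return status_file.read_text().strip()

if __name__ == "__main__":
    print(f"dGPU runtime PM status: {dgpu_power_state()}")
```

On my setup that file reads "suspended" whenever nothing is rendering on the dGPU, which lines up with the card being effectively off.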

Assuming the price is right, a single-slot GPU with the power of a 1060 or other currently popular cards could be popular as a cheap upgrade. Better than a Switch/Steam Deck, but not so expensive that normies who want a "gaming PC" will run away from it. I don't think that will happen, but I can see a use for it in the immediate future if the costs can be kept low, which I doubt will happen these days.

To be honest, I think before that happens the iGPUs will have that niche covered.
 
So, who's the market for low-end cards? The only user group you could market them to would be "people with old computers who also can't afford a better GPU" but that's uh, not a good group to target and even they can probably get better alternatives on the second hand market.
Have you looked at something like the Steam hardware survey? The 1060 is RIGHT NOW the most used card, followed in the top 5 by the 2060, 1650, 3060 and 1050 Ti. The entry level to lower mid-range is a huge market.

I see it as a blessing and a curse: entry-level up to low-mid PC cards will work just fine and keep decent parity with consoles. No one will make a 4090 or 4080 game where poor people need to turn down the settings and disable shadows and other effects; instead they will make a perfectly fine 1060/1070 game, but 4090 owners can go 4K, turn on more particles and ultra textures, and use raytracing instead of screen-space reflections and cube maps.

The new cards look fucking fantastic for prosumer production though, and these days everyone wants to make videos or be a Twitch streamer or YouTuber, and then there are people buying them to make that AI hentai (there's a relevant thread on this forum). Raytracing won't be a real thing or a requirement until it is universally supported with a decent enough frame rate on both consoles and PC, from the low end to the high end. Right now I'm starting to think that won't happen sooner than the PS7 and whatever they call the equivalent Xbox.
At the same time, mobile phone games I have never even heard of make billions of dollars, so maybe traditional gaming is dead in a way.
 
I see it as a blessing and a curse: entry-level up to low-mid PC cards will work just fine and keep decent parity with consoles. No one will make a 4090 or 4080 game where poor people need to turn down the settings and disable shadows and other effects; instead they will make a perfectly fine 1060/1070 game, but 4090 owners can go 4K, turn on more particles and ultra textures, and use raytracing instead of screen-space reflections and cube maps.
not gonna happen since no one is doing an AAA PC exclusive anymore that would require that level of power, and any multiplat is "held back" by consoles and parity clauses.
plus 4K for desktop is a bit of a meme (and depending on who you ask so is RTX, or RT in general), since most smoothbrains don't understand resolution is related to PPI, and for most people there is zero difference on a 24" screen (or have fun putting an 80" TV half a meter in front of your face).

which leaves a lot of people unwilling or unable to pay for shit they have no need of, hence the "low-end" (relative) segment.

On a completely different topic, I've been considering building a NAS or a home media server (not sure the exact difference) for years now but could never justify the expense. Recently, my family's increasing frustration with TV and streaming services has got me thinking about the idea again. As if on cue, YouTube has been dropping a bunch of sponsored videos about doing exactly that, so I don't know if this is a common thing right now or a coincidence.
I think one is basically just an external drive attached to a network, while the other does stuff like transcoding etc.; since some do both, that's probably where the overlap comes from.

Been procrastinating on doing the same, figuring out what to use (prolly FreeNAS, although I haven't looked at it in a while) and pondering if I really need ECC RAM (never had an issue with it in 20 years, kinda feels like the bitrot meme is a bit overblown; you'd wanna do backups in any case anyway...).
 
On a completely different topic, I've been considering building a NAS or a home media server (not sure the exact difference) for years now but could never justify the expense. Recently, my family's increasing frustration with TV and streaming services has got me thinking about the idea again. As if on cue, YouTube has been dropping a bunch of sponsored videos about doing exactly that, so I don't know if this is a common thing right now or a coincidence.
Your TV most likely supports DLNA, meaning it can find media shared on the network and play H.264 content without a problem. That just requires a drive accessible over the network, and many routers let you plug in a USB one that you can then reach as a network share to upload and delete things from. They're more secure now from what I've seen; some years back some routers opened an unprotected FTP server accessible from the internet, and it could be a fun spelunking adventure looking at people's work documents.
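If you want to see whether anything on your network is actually announcing itself over DLNA/UPnP before spending money, here's a rough Python sketch that fires off an SSDP M-SEARCH and prints whatever answers. The search target and timeout are just illustrative values.

```python
# Rough DLNA/UPnP discovery sketch: multicast an SSDP M-SEARCH and list
# the devices (TVs, router media shares, etc.) that respond.
import socket

SSDP_ADDR = ("239.255.255.250", 1900)
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:schemas-upnp-org:device:MediaServer:1",  # example search target
    "", "",
]).encode()

def discover(timeout: float = 3.0) -> list[str]:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.settimeout(timeout)
    sock.sendto(MSEARCH, SSDP_ADDR)
    found = []
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            found.append(f"{addr[0]}: {data.decode(errors='replace').splitlines()[0]}")
    except socket.timeout:
        pass
    return found

if __name__ == "__main__":
    for line in discover():
        print(line)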
 
Another "problem" is that IGPUs/APUs (or whatever people call them this week) are actually getting good. IGPUs can play most modern games at 1080p low or even 720p at 30-60fps.

Not great by any stretch, and tech YouTubers look down on them, but when the low end is barely providing more than what a modern CPU can, why bother?
AMD is using APU to refer to graphics-oriented models like the 5700G, and CPU/iGPU to refer to the tiny incidental graphics found in the 7950X, 7900X, 7700X, and 7600X. As far as we know, the APUs will continue to be monolithic while the CPUs use chiplets.

We might be near the point where someone can make a decent guess about Phoenix Point performance based on all the numbers AMD has thrown around for RDNA 3. I would like to see that become the debut desktop APU on AM5 as soon as possible in 2023. Maybe tie it to the launch of A620 motherboards.
 
not gonna happen since no one is doing an AAA PC exclusive anymore that would require that level of power, and any multiplat is "held back" by consoles and parity clauses.
I mean, if you want to annoy console babies: their consoles, barring the Switch, are mid-tier Ryzen computers, and if you want their performance you can just build with whatever moves an Xbox or a PlayStation. Microsoft noticed this; that's why they are merging their PC store with the Xbox one, to convert it into an affordable PC
 
The main use is going to be either small form factor builds, or second hand office computers converted into gaming PCs. I've been tempted to buy a cheap £100-£200 used PC and put a 6400 in it, but never had a practical use for doing so.

Assuming the price is right, a single-slot GPU with the power of a 1060 or other currently popular cards could be popular as a cheap upgrade. Better than a Switch/Steam Deck, but not so expensive that normies who want a "gaming PC" will run away from it. I don't think that will happen, but I can see a use for it in the immediate future if the costs can be kept low, which I doubt will happen these days.
That is basically what I did: used an older single-slot GPU to upgrade an office PC, and now I have a Dell SFF as a living room multimedia PC with decent gaming performance. It's about the size of a VCR and sits nicely in my TV stand. If a decent new GPU had been available I would have got one, so I guess I'm part of that potential market?

On the second point, I'm pretty annoyed at the manufacturers because we already have good options, but they are not consumer grade. Nvidia have the RTX A2000, which is low profile, fed entirely through the PCIe slot, and about the same as a 3050 performance-wise. They are not releasing a consumer variant though, so your only option is eBay for a second-hand one that was cannibalised from an office PC.
 
smoothbrains don't understand resolution is related to PPI
Yes. And it is making me literally angry irl. Webshit developer showing off his ultra-widescreen monstrosity. 86 DPI. For Text. And he sits right in front of it.

I bought a small mobile screen with a resolution of 2560x1600 for my desktop just because I wanted something 200+ DPI. The difference in eye comfort when reading lots of text is night and day. I tested a lot of monitors around that time; for reference, I am ancient as fuck but had one of my eyes lasered so I don't need glasses. The jump from 90 to 140 DPI is dramatic, and then it just sort of falls off: 190 DPI to 230 DPI is still noticeable, 230 DPI to 270 DPI a lot less so. The general opinion is that at 300+ the pixels are so small that the human eye can't see them individually anymore and sharpness won't really be distinguishable anymore. Most consumer monitors are ~100 DPI (no matter the resolution, because usually the resolution grows with the size of the screen; my uneducated guess is that the 100ish value is probably the sweet spot for cost-effective panel manufacturing), which is actually really shitty if you've ever used anything better. Not super noticeable in fast-moving games and video, but with vector text and static pictures it really is. Another advantage of these high pixel densities is that you can basically live without anti-aliasing, and slight scaling often won't even be noticeably blurrier than native resolution because there are more pixels to work with for interpolation.
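If you want to check a monitor yourself, the math is trivial: pixel diagonal divided by physical diagonal. Quick Python sketch below; the 13.3" size is just an assumed example for a 2560x1600 mobile panel, the formula is the point.

```python
# Pixels per inch = pixel diagonal / physical diagonal (in inches).
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

# A 2560x1600 panel at an assumed 13.3" lands well above 200 PPI,
# while a common 24" 1920x1080 desktop monitor sits near 92 PPI.
print(round(ppi(2560, 1600, 13.3)))  # ~227
print(round(ppi(1920, 1080, 24.0)))  # ~92
```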

In a sane world, 16:10/3:2 would be the default aspect ratios for gaming and work respectively and 200-300 DPI would be the absolute standard in sizes up to and including 15". 4k and such do make sense then. We do not live in a sane world. (It does look like laptop manufacturers are picking up that people who try it usually end up loving it though, a thing smartphone/tablet manufacturers knew all along)

I really need ECC RAM

I always try to go with it if it is an option. Memory corruption can be really evil because it can be silent and can even propagate into your backups. There's also the lesser-known risk of corruption by faulty memory/controllers/flash chips in storage devices, which is just as bad and can just as easily corrupt stuff without you noticing. There are filesystems like btrfs and zfs that have built-in protection against that via checksumming though.
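If you're not on a checksumming filesystem, a poor man's version of the same idea is to hash everything once and re-verify before trusting a backup. A rough Python sketch; the paths and manifest name are just examples.

```python
# Record SHA-256 hashes for every file, then re-check them later to catch
# silent corruption before it propagates into backups (a manual stand-in
# for what btrfs/zfs checksumming does automatically).
import hashlib
import json
from pathlib import Path

MANIFEST = Path("checksums.json")  # hypothetical manifest location

def hash_file(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> None:
    manifest = {str(p): hash_file(p) for p in root.rglob("*") if p.is_file()}
    MANIFEST.write_text(json.dumps(manifest, indent=2))

def verify_manifest() -> None:
    manifest = json.loads(MANIFEST.read_text())
    for name, digest in manifest.items():
        p = Path(name)
        if not p.exists() or hash_file(p) != digest:
            print(f"MISMATCH or missing: {name}")

# Usage (paths hypothetical):
#   build_manifest(Path("/srv/media"))  # run once
#   verify_manifest()                   # re-run before trusting a backup
```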

If you want ECC it's basically AMD or bust though, as Intel doesn't do it in consumer-grade hardware. AMD does, and basically almost all consumer-grade AMD boards support it too, even if it's sometimes not officially advertised. I'd check to make sure though; some boards "accept" ECC RAM but don't have the wiring to actually implement the ECC feature, and your processor/OS reporting that it works doesn't mean it does. The easiest, no-lasting-damage way to provoke bit-flip errors is to overheat the RAM until ECC errors pop up. I have ECC RAM in my desktop. Be aware that it's by definition slower than normal RAM though.
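One way to sanity-check it on Linux (a rough sketch, assuming the EDAC drivers for your platform are loaded; paths can differ): if the kernel exposes memory controllers with error counters under the EDAC sysfs tree, ECC reporting is actually wired up rather than the DIMMs merely being accepted.

```python
# Check whether the Linux EDAC subsystem sees any ECC memory controllers
# and print their corrected/uncorrected error counters.
from pathlib import Path

def ecc_status() -> None:
    controllers = sorted(Path("/sys/devices/system/edac/mc").glob("mc*"))
    if not controllers:
        print("No EDAC memory controllers found -- ECC likely not active.")
        return
    for mc in controllers:
        ce = (mc / "ce_count").read_text().strip()  # corrected errors
        ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
        print(f"{mc.name}: corrected={ce} uncorrected={ue}")

if __name__ == "__main__":
    ecc_status()
```

Combine that with the overheating trick above: if the corrected-error counter ticks up instead of the system just crashing or silently corrupting, ECC is really doing its job.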
 
I barely play games now and only older ones. The only reason I would buy a high-end card is to try my hand at Stable Diffusion and other AI stuff. So it is nice to see AMD increasing VRAM, if only to force Nvidia to increase VRAM above 24 GB later. I expected AMD to stay at 16 GB, maybe allowing a 32 GB AIB model to exist.
Wouldn't it be cheaper to use cloud services for that? How much AI "art"/pr0n are you gonna make that buying a card beats paying for a service?
There were rumors about Nvidia stopping 3000 series production to create artificial scarcity which would drive the prices up and make them the new normal. The MSRP(RIP) for 3000 series cards might have been a little too good considering their performance.
You know things are bad when a card's price not being a complete ripoff means it's "too good".
Bad drivers. When it performs well, it's as good as a 3070/6800 XT; and in some games it's slower than a 6600. Chips are a multi-year investment, and Intel was greedy as usual and wanted everything. They should've stayed in the datacenter for a few years, and consolidated their designs from AI/ML to desktop IGP before making a consumer card.
Intel being greedy is to be expected; as for the drivers, I didn't expect them to fumble in that aspect of all things. Frankly their whole GPU effort seems a bit half-assed, like it's not getting the funding it would need to actually get things done, and with Microsoft ramping up on ARM, Intel should seriously focus on new sources of revenue like GPUs, because x86, their meal ticket, is not gonna last.
Part of why AV1 is moving faster with hardware encoding is probably that it isn't mired in licensing issues like h.264. When it comes to encoding, Bink continued to be used in vidya because it was a flat fee and not a fee per disc pressed (as in manufactured, not sold). I think that was the reason why FF13 on Xbox 360 used Bink; it was three DVDs and not one Blu-ray.
The 360 didn't have Blu-ray; it used regular DVDs. It had an add-on for HD DVD for a while, but nobody used that format anyway.
 
Wouldn't it be cheaper to use cloud services for that? How much AI "art"/pr0n are you gonna make that buying a card beats paying for a service?
You can't use the model you want with the managed services, and it's not that easy to set up your own compute on the cloud if it's the first time you're doing cloud work. And cloud services have AUPs too; if they get a whiff of anything they don't like, the account gets sent to the naughty room.
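For reference, running your own model locally is only a few lines with something like the Hugging Face diffusers library. A minimal sketch, assuming a CUDA card with enough VRAM; the checkpoint name and fp16 settings are just examples.

```python
# Minimal local Stable Diffusion sketch via the diffusers library.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint, swap in your own
    torch_dtype=torch.float16,         # fp16 halves VRAM use on consumer cards
).to("cuda")

image = pipe("a test prompt").images[0]
image.save("out.png")
```

No AUP, no account to get banned, and VRAM is the main thing deciding which checkpoints and resolutions you can run, which is why the VRAM bumps on the new cards matter for this crowd.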
Frankly their whole GPU effort seems a bit half-assed, like it's not getting the funding it would need to actually get things done, and with Microsoft ramping up on ARM, Intel should seriously focus on new sources of revenue like GPUs, because x86, their meal ticket, is not gonna last.
x86 isn't going anywhere, yet. Even in the worst case, Intel is an ARM licensee with deep pockets.
 
ARM won't replace x86, I simply don't believe it. I didn't believe it many years ago when I still believed in ARM being good, and I still don't now.

Stable Diffusion and other AI stuff
It's pretty much only Stable Diffusion and some of the smaller stuff like image scaling. If you wanted to do stuff like text inference, these cards would still be way too small and slow with currently available models to do anything interesting. Also, AMD's ML software support is notoriously poor. If one could get their hands on Google's TPUs though...

I bit the bullet and actually bought a Q616 out of curiosity and in the hopes of replacing my low-end Celeron netbook. The used market for low-end devices is kinda strange. My Celeron goes for the same price used as the Q616, even though the Q616 has (at least on paper, we'll see soon) a faster CPU, 2x the RAM, 4x the storage (which is also a normal SSD vs. my netbook's soldered-on eMMC), a much better screen and a lot of other quality features like a touch screen, Wacom pen support, etc. Well, it also had a new price of $1,499 vs. ~$200, so I guess it's no wonder the Q616 is the fancier device. It's basically just a much older device than the Celeron, but actually better. The used lower-end market is a lot like this: tons of very different devices basically going for the same price. (The only exception is the HP I'll try to flip; looks like I could get about 300 bucks for it, which is about double what these other two devices are worth.)

Fujitsu makes a newer model with an N5000 that otherwise largely stayed the same and also costs $1000+. I guess the high-end-features, low-end-SoC devices do exist after all, but only in the enterprise and business market, not really targeted at normal consumers. The back of the Q616 pops off, like with old phones, and then you get access to all parts including the battery. No screwdriver required. If the battery is in bad condition I might be able to refresh it with new cells, though it's a bit of a bother. (A lot of battery packs like that brick themselves if you remove power from the controller completely, so you have to connect the new cells while the old ones are still connected too, which means you first have to charge the new cells to about the same voltage as the existing ones, otherwise they'll start trying to charge each other and things can get heated very quickly.)

(Also, if you're bored, look at the recent notebooks, convertibles, etc. from Japanese companies. Many of them look like they're straight out of the 90s design-wise.)
 
That's why it used three dvds instead of one blu-ray like on PS3.
I know, but it wasn't about licenses; it's because Blu-ray drives were really expensive back then, the same reason the Dreamcast didn't have a DVD drive either.

Sony could put that tech in their consoles because they owned it and could recoup their losses by using the PlayStation to push those formats.
You can't use the model you want with the managed services, and it's not that easy to set up your own compute on the cloud if it's the first time you're doing cloud work. And cloud services have AUPs too; if they get a whiff of anything they don't like, the account gets sent to the naughty room.
Ok got it.
x86 isn't going anywhere, yet. Even in the worst case, Intel is an ARM licensee with deep pockets.
Are they making ARM chips for desktops? I know they are working on their Horse Creek platform, but that uses RISC-V tech from SiFive, not ARM. I can't find anything about Intel-branded ARM processors being made, and ARM is going hard into the server space, which is Intel's bread and butter.

The Arc thing reminds me of when they got into the phone space, because I was dumb enough to buy an x86 Android phone which, while it ran really fast, had shit battery life and next to no support, and the build quality was meh at best.
 
I know, but it wasn't about licenses; it's because Blu-ray drives were really expensive back then, the same reason the Dreamcast didn't have a DVD drive either.
Sony could put that tech in their consoles because they owned it and could recoup their losses by using the PlayStation to push those formats.
I'm not talking about the drive; it's about the video format used in the two versions of that game. People have long wondered why game devs continue to use Bink, and it's because of the cost associated with it vs. h.264.
 