GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

@AmpleApricots had an older VIA/S3 system, right?
"Newest" VIA I have is an EPIA M board with a VIA C3 chip. That's from 2001 and, uh, not state of the art anymore, though the newest ones are more distant relatives of the VIA Nano and are already made by the Chinese. The Russians have their own CPU too, although whether that thing exists anywhere except on paper is the question. (AFAIK the Linux kernel has support for it, or at least had at some point, I didn't follow it) Some security researcher stumbled over undocumented instructions in that VIA C3 CPU that give you ring 0 access no matter where they're run from, bypassing everything. The video's up somewhere on YouTube. Making connections from the C3 to that very distant Chinese relative is probably moot though. I'll leave the implications of China making a CPU for the domestic market with Taiwan's help up to you, too.

Driver support, who knows? These authoritarian governments come out of the woodwork every few years claiming they've had enough of western imperialism and they'll make their own electronics (with blackjack and hookers), then some engineering samples show up that are less than impressive, and then you never really hear from them again until the next one. (Not saying they're never actually made, they just never leave their domestic market, if, you know, they're actually made) I'd be surprised if AMD and Intel chips aren't bugged though. The fears they have are grounded in reality IMHO. (You'll never hear of it because the NSA/CIA isn't gonna make hardware backdoors public knowledge just to e.g. catch a few pedos)

I also have a Cyrix MediaGX. Small industrial SBC board, slightly larger than a Pi and passively cooled, with an onboard slot for a CF card. Speed is around a mid-range Pentium MMX, and it has VGA and Sound Blaster/OPL3 support. The core is a die-shrunk and slightly improved version of the Cyrix 5x86. It has a hidden processor flag with which you can activate branch prediction; that was the first time I saw that in a CPU. Makes it faster but can be slightly buggy with 32-bit code. It's a very cool DOS machine. After Cyrix, this design went first to National Semiconductor, which produced and slightly improved the GX for a bit, before going to AMD, which die-shrunk it a few times and created a few successors with different CPU cores for use in thin clients and such, becoming the base for the AMD SoCs we now all know and love.
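For the curious, toggling those hidden flags from DOS is surprisingly mundane. Below is a minimal sketch of how Cyrix configuration registers are accessed (index via port 0x22, data via port 0x23); the specific register index (PCR0 = 0x20) and the branch-prediction bit are assumptions from memory, so check a 5x86/MediaGX datasheet before poking real hardware.

```c
/* Minimal sketch: poking a Cyrix configuration register from DOS.
 * Cyrix CPUs expose config registers through I/O ports 0x22 (index)
 * and 0x23 (data); every access to 0x23 must be preceded by a write
 * to 0x22. PCR0 = 0x20 and the branch-prediction bit (0x02) are
 * ASSUMPTIONS from memory -- verify against the datasheet first.
 * Builds with a DOS compiler like Turbo C (outportb/inportb, dos.h). */
#include <dos.h>

static unsigned char cyrix_read(unsigned char idx)
{
    outportb(0x22, idx);
    return inportb(0x23);
}

static void cyrix_write(unsigned char idx, unsigned char val)
{
    outportb(0x22, idx);
    outportb(0x23, val);
}

int main(void)
{
    unsigned char ccr3, pcr0;

    disable();                               /* no interrupts mid-sequence */
    ccr3 = cyrix_read(0xC3);
    cyrix_write(0xC3, (ccr3 & 0x0F) | 0x10); /* MAPEN: unlock extended regs */

    pcr0 = cyrix_read(0x20);
    cyrix_write(0x20, pcr0 | 0x02);          /* hypothetical BTB-enable bit */

    cyrix_write(0xC3, ccr3);                 /* restore MAPEN */
    enable();
    return 0;
}
```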
 
I also have a Cyrix MediaGX. Small industrial SBC board, slightly larger than a Pi and passively cooled, with an onboard slot for a CF card. Speed is around a mid-range Pentium MMX, and it has VGA and Sound Blaster/OPL3 support. The core is a die-shrunk and slightly improved version of the Cyrix 5x86. It has a hidden processor flag with which you can activate branch prediction; that was the first time I saw that in a CPU. Makes it faster but can be slightly buggy with 32-bit code. It's a very cool DOS machine.
How's it on Win 3.11 compatibility? I want to rebuild a pure DOS machine so bad, but my life situation absolutely demands my tech stuff take up less space, and a bulky beige-box 486 is really hard to justify.
 
So what's the difference between 32GB and 64GB of RAM? How significant is the change, and which is worth pursuing? I'm kinda tired of 8GB/16GB shit.
 
So what's the difference between 32GB and 64GB of RAM? How significant is the change, and which is worth pursuing? I'm kinda tired of 8GB/16GB shit.
I have 32 and I imagine that 64 is somewhat overkill for the foreseeable future, unless you already know you would get use out of that much RAM.
With 16 I needed to babysit my browser and regularly check that there isn't some retarded monkeycoded YouTube or Twitch tab open consuming several gigabytes of RAM.
With 32 I can mostly just ignore that. The only time I ever ran out of RAM on 32 was when I played a game with an autistic mod-loader that requires every one of my 300+ installed mods to be loaded into memory at the same time, while Chrome felt like wasting several gigabytes on a few 10 min videos and a livestream on top of that.
32 is also nice if you are running something like a 3D modeling tool, a 3D texturing tool and a game engine all at the same time because you are a lazy bum that does not feel like starting them one after another.
All in all I like 32 and would not want to go back, but I'm not sure if I would really get any benefit from 64 outside of one or two extreme cases.

If you have 16 and regularly run out of RAM, then I would go for 32. That will make the problem go away outside of the most extreme situations.
64 only if you don't care about the extra cost or are planning to do/play something really, really crazy. I am kinda wasteful with my RAM and even I would not really get much use out of that.
Both 32 and 64 only make sense though if you generally keep a lot of stuff running at the same time or want to go beyond normal levels of modding.
More than 16 is obviously not really going to get you any extra FPS or anything in an unmodded game that only needs 8GB.
Really depends on how you use your computer at the end of the day.

Also keep in mind that if it's DDR4, that platform is pretty much EOL, so 64GB is a (somewhat) big investment that can't be carried over to future systems.
If it's DDR5, I would get 32 and then see later on (when prices have dropped) if I really need twice that much.
 
How's it on Win 3.11 compatibility? I want to rebuild a pure DOS machine so bad, but my life situation absolutely demands my tech stuff take up less space, and a bulky beige-box 486 is really hard to justify.
You might wanna look into industrial boards in general. Some of them are like my MediaGX; others look like expansion cards (to be plugged into a passive backplane, basically just a PCB with a bunch of slots), but usually they don't actually need that backplane and can just be powered as-is, and if they have onboard graphics they're complete computers. They usually have very thick PCBs (to make all that routing possible; I have e.g. 386 and 486 boards I took home from work back in the day that are no larger than the smaller ISA cards, yet the CPU and FPU are still socketed on many) and they're very rugged and well built. (I could clock that 486 to insane speeds)

You also might wanna look into old thin clients; then you get a tiny computer with case, power supply and everything at the ready. Many are designed to boot via network, but they usually also have at least one internal IDE slot, and with modern flash memory it's not hard to get plenty of storage into that tiny case. Many carried chips like that MediaGX. Another one to look out for is the Vortex86. (although that one is very rare) Compatibility of the MediaGX is superb, shit just works. Surprisingly authentic OPL3 synth. Sound works without any drivers in DOS. Graphics and audio drivers exist from 3.11 to, I think, WinXP; might have to dig a little though. Early 9x stuff still runs well on it too, for example Fallout 1&2 play fine.

When checking eBay, aim for the National Semiconductor-produced ones. The Cyrix-branded ones are rare, and the later AMD ones are really different chips. I'd check Wikipedia to make sure what I'm actually buying. There are many different generations with different speeds, but the later chips drop all the DOS compatibility features. The Cyrix ones also exist as socketed ceramic packages; these will not work on normal boards, they need the MediaGX chipset.
 
Doubleposting but this thread isn't moving that fast so I'll just do it. I got the Q616, used from a commercial seller. To preface, these buys are low risk here because Germany doesn't have "as-is" in that form for private persons buying from a business. You cannot waive your consumer rights as a private buyer, so the seller has to give the warranty defined by law, even on a used item. On top of that, there's a law that allows you to send back anything you bought online from a commercial seller within 14 days of receiving it for a refund. Kinda like Amazon, just every business is obliged by law to honor it. This makes used-item sales kinda risky for businesses, but the few businesses that specialize in it tend not to sell complete junk as a result. It's amazing for a private buyer, because you can make some good deals on used stuff without risking getting stuck with complete garbage some tweaker found in a dumpster.

My first impression is very good: the m7 is faster than both the Celeron N4020 and the i5-7200U. It's not faster than the latter in theory, it just has more thermal headroom to run past it in practice. That the HP has active cooling really was the first sign of trouble here; it's just impossible to cool such a TDP properly in the case layout I saw online. In synthetic benchmarks the m7 lands roughly in the area of the i5, just a bit slower (but eventually overtakes it as the i5 heats up), and is nearly twice as fast as the Celeron. I blame the shitty RAM and cooling solution the Celeron probably has. Most of all, it also *feels* much faster, which is probably in no small part due to a proper Samsung SSD vs. onboard eMMC.

Power consumption is about 4-5W idle, 11-14W under heavier load depending on what exactly the load is; this is with a medium-bright screen. In practice when browsing, you end up somewhere in between. Mathematically, I think the 10 hours Fujitsu claims are probably a bit optimistic, but 6-8 hours should be in it if you run a slim Linux and don't compile the entire time. Cleverly, it charges the battery to 100%, then lets it drop to 90% and refuses to charge it further until it drops under 90%, which will only happen if you disconnect it from AC. This will keep the battery healthy. (the logic is sketched below) The device is in very good condition, one barely noticeable scratch at the bottom; the battery reports 13 charge cycles, so I guess they replaced it before selling with a more viable one from some stock. (see first paragraph) It also came with a docking cradle, which basically turns it into a tiny desktop and which wasn't advertised. The keyboard keys are nice and shiny from previous use. The screen is slightly anti-reflective, not quite mirror-y but also not matte.
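In code, that charging behavior is just a hysteresis loop. Here's a toy sketch of the logic as described; the 90/100% thresholds are what I observed, everything else (names, how it's called) is made up:

```c
/* Toy model of the Q616's charge hysteresis: charge to 100%, then
 * refuse to top up until the level sags below 90%. Thresholds are the
 * observed behavior; the function and its caller are hypothetical. */
#include <stdbool.h>

#define RESUME_BELOW 90    /* percent */
#define STOP_AT      100

static bool charging = true;

/* Called periodically with the battery level while on AC. */
bool should_charge(int percent)
{
    if (charging && percent >= STOP_AT)
        charging = false;          /* full: stop, let it sag */
    else if (!charging && percent < RESUME_BELOW)
        charging = true;           /* sagged below 90%: top up again */
    return charging;
}
```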

Now the big Q: is it worth keeping for 150 bucks? I don't know. It is an old device after all, and I only bought it out of curiosity about that one CPU that's lowest on the list of all Intel x86 TDPs. You could get something newer for some more money; around ~400-500ish bucks you can already get a cheap China 3:2 OLED with a Celeron (the Chuwi Freebook) (I mixed this up with a newer device they plan to put out, the price became more and more suspicious the more I thought about it), but that has finest Chinese build quality, with a glossy screen that isn't even scratch resistant, so you'll probably run into the same shortcuts you run into with all these low-end devices. Some of the newer devices in this price class also already look flimsy in their pictures; this Q616 feels robust. These crazy Japanese even put a cover over the SD card slot, can you believe it? Then again, old. Decisions, decisions.
 
It's pretty nutty to launch GPUs that are a negative value proposition compared to their previous generation, but this is nVidia after all.
From what I can see, AMD dabbed on them by not releasing first and taking the time to refine the new Radeon. The only thing AMD had to do was make an upgrade appealing at a good price; instead Nvidia derped and released something that is just a 3090 Ti v2. How are they going to justify a 4070 with those numbers?

The only thing AMD has to do is keep their promise of the 7900 being slightly better than the 4080 at that price, and Nvidia will start to reconsider releasing overpriced pieces of garbage that explode if you look at them wrong.
 
The benchmarks of the 4080 are so disappointing: 1200 bucks for a card that barely gets like 20 frames more or so compared to a 3090 that is 800-900 at retail.
Rumor is that NVIDIA was expecting miners to snap all these up again, hence so much die space being spent on tensor cores and driving that flops-per-watt as high as they could go without starting a mini-supernova.
 
Small review collection for the 4080. Overall reception is along the lines of "decent GPU but shit value".
Pretty much seems like its price is supposed to make the other (also expensive) cards seem better value by comparison. Including the 4090, lol.
I also sure wonder who exactly this card is supposed to appeal to.
I'd imagine the kind of folks that are still able/willing to shell out $1400+ for a bad-value high-end GPU would instead just get a 4090.

Gamers Nexus

Hardware Unboxed

Paul's Hardware

Moore's Law is Dead

Linus Tech Tips

Daniel Owen (40 Minutes of Benchmarks against 3080, 3090ti and 4090)
 
Now the big Q: is it worth keeping for 150 bucks? I don't know. It is an old device after all, and I only bought it out of curiosity about that one CPU that's lowest on the list of all Intel x86 TDPs.
Somewhat intrigued by your post about the low-end CPU market, I snagged a cheap sub-$100 Asus laptop with an N4050 and popped in an NVMe drive. The little ripper rips for basic things. eMMC is a no-go with Linux due to some driver issue, but I wasn't intending to use it anyway. It barely gets warm under load and can run a Win 7 virtual machine, which is what I need it for, quite well. It's crazy how much computer you can get for so little these days. The thing even has an IPS screen, albeit only 720p.
 
eMMC is a no-go with Linux due to some driver issue, but I wasn't intending to use it anyway.
Probably either the way it's talked to or the way it implements power-saving modes. I got the impression that in these cheap devices the engineers like to go wild with GPIOs to control things (yes, your "fully-grown" Intel/AMD SoC has GPIOs just like a Pi; they're just usually either used for board-internal things by firmware, like letting you switch things on and off in the BIOS, or not wired out at all), often in quite custom ways the Linux drivers of course know nothing about and never will, as no developer is gonna sit down and implement them for some cheap Celeron netbook. As good as hardware support has gotten for Linux in recent years, if you buy mobile devices like this you can quickly end up in no man's land regarding specific features if you get unlucky.
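To make the GPIO point concrete: flipping a line from userspace can be as little as the sketch below (legacy sysfs interface; deprecated on newer kernels in favor of the gpiochip character device, but it shows the idea). Line 42 is purely hypothetical, and that's exactly the problem with these boards: nobody outside the vendor knows which line gates what.

```c
/* Sketch: driving a GPIO line via the legacy sysfs interface.
 * The line number (42) is hypothetical; on a real board the firmware
 * might gate something like an eMMC power rail off a line like this. */
#include <stdio.h>

static int write_str(const char *path, const char *s)
{
    FILE *f = fopen(path, "w");
    if (!f) return -1;
    fputs(s, f);
    return fclose(f);
}

int main(void)
{
    write_str("/sys/class/gpio/export", "42");
    write_str("/sys/class/gpio/gpio42/direction", "out");
    write_str("/sys/class/gpio/gpio42/value", "1");   /* e.g. rail on */
    return 0;
}
```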

I decided to keep the Fujitsu Q616, as the thing has a robustness and kind of old-school-y feel about it you don't find in other devices today. It's also well built and one of the few mobile x86 devices I know of that actually comes with appropriate passive cooling. Under continuous load the CPU hits at most 50-55C and immediately drops ~10C as soon as that load is removed. It barely gets warm on the case. This is an often-overlooked thing in these x86 mobile devices, especially in the push to make them as thin, light and elegant as their ARM brethren. (which this one doesn't even try to be) Some of them advertise CPUs that are fast in theory but in practice will never reach their top speeds because of inadequate cooling, ending up slower than devices with slower CPUs. They might reach higher bursts for short times, but depending on how you use them that might not make much of a difference. The heat they sit at is also often hell on the batteries (Li-Ion doesn't like heat at all, especially while charging or sitting at high voltage) and my theory for why some devices are just notorious for failing batteries, as on the inside they're all using the same cells anyway.
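If you want to check that kind of thermal behavior yourself, a once-a-second temperature logger is all it takes. Rough sketch below; thermal_zone0 being the CPU package is an assumption, so check /sys/class/thermal/thermal_zone*/type on your machine first.

```c
/* Crude logger for claims like "55C under load, drops ~10C at idle":
 * samples the first thermal zone once per second. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    for (;;) {
        FILE *f = fopen("/sys/class/thermal/thermal_zone0/temp", "r");
        long milli_c;

        if (!f || fscanf(f, "%ld", &milli_c) != 1) {
            perror("thermal_zone0");
            return 1;
        }
        fclose(f);
        printf("%.1f C\n", milli_c / 1000.0);  /* kernel reports milli-degrees */
        fflush(stdout);
        sleep(1);
    }
}
```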

The 1080p screen at 11.6" isn't quite HiDPI and is my biggest critique (16:9 is kinda the pits in general, and I fell absolutely in love with that HP screen), but it's serviceable, still quite high-density, and generally a good screen with good color coverage, contrast, etc., so really nothing too shabby. Interestingly, the SD card reader is not a USB device but connected via PCIe, and quite snappy with one of the good cards I have as a result. (one of the small details that shows this was originally a ~$1.5k device) I also got the 4G modem working already and plan to get a SIM card for it so I can do some SMS scripting, and the gigabit ethernet adapter is actually in the device; the ethernet port in the cradle just wires it out.

The BIOS hasn't locked undervolting, so I've been doing that amongst other tweaking. I seem to have won the silicon lottery, since I could push it quite far before it became unstable. The result is that I managed to push the device down to 2.5W idle when the screen is off, which is in the Raspberry Pi 4 range if Google is to be trusted. (the probably better PCB layout and components play a role, as does my slim Linux setup with very few background tasks. I used to have a Pi 2; I was NOT impressed with the build quality for the price) Playing back video with hardware acceleration and/or normal browsing with script blocking now puts it somewhere at 5-6W with the screen at 60% brightness, which might be a bit more than a Pi; DOSBox emulation (playing System Shock) puts it at 9W average. To make the comparison with the Pi fair, you'd need to subtract the 2-4W that go to the screen (the Pi measurements you find online are all without a screen), which are included in all my figures here, as the Intel chip does fancy things with the display refresh like framebuffer compression, self-refresh and partial refresh, which makes it kinda hard to get a constant figure for the screen itself. I got curious and actually turned the screen off while running around blindly in System Shock as an increased-difficulty mode; power consumption dropped to 5W.
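For reference, figures like that 2.5W idle come straight from the kernel: on battery, the discharge rate is exposed under /sys/class/power_supply. Whether your battery reports power_now directly or only voltage_now/current_now (all in micro-units) depends on the firmware; BAT0 and those file names are the common case, not a guarantee.

```c
/* Sketch: read the current battery discharge rate in watts. */
#include <stdio.h>

static long read_long(const char *path)
{
    FILE *f = fopen(path, "r");
    long v = -1;

    if (f) {
        if (fscanf(f, "%ld", &v) != 1)
            v = -1;
        fclose(f);
    }
    return v;
}

int main(void)
{
    long uw = read_long("/sys/class/power_supply/BAT0/power_now");

    if (uw >= 0) {
        printf("%.2f W\n", uw / 1e6);                 /* uW -> W */
    } else {
        /* some firmwares only report voltage and current */
        long uv = read_long("/sys/class/power_supply/BAT0/voltage_now");
        long ua = read_long("/sys/class/power_supply/BAT0/current_now");
        if (uv < 0 || ua < 0) {
            fprintf(stderr, "no usable battery readings\n");
            return 1;
        }
        printf("%.2f W\n", (uv / 1e6) * (ua / 1e6));  /* uV * uA -> W */
    }
    return 0;
}
```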

Peak is still at around 11-14W, but that happens rarely outside of benchmarks. When it comes down to it, the unit sits in a power consumption bracket very close to, or the same as, ARM boards like the Pi. While doing that, it also comes with a case with an appropriate cooling solution, a battery, a keyboard, a cradle (with USB 3.0 ports and both a DisplayPort and a VGA port; the unit itself has a mini-HDMI port), an inbuilt high-quality screen, 8 GB of RAM, 256 GB of fast, replaceable SSD, and WLAN/Bluetooth/4G, not to mention extras like an inbuilt Wacom tablet with 2048 pressure levels, a touch screen, a card reader, a GPU that actually does the full OpenGL spec and is software-supported for all the decoding algorithms it offers, excellent hardware support that won't be dropped anytime soon by Linux, and compatibility with common software. How people still like these expensive ARM boards while shitting on used computers like this is beyond me. I don't know current prices, but I'm pretty sure if you buy a Pi 4 with hardware peripherals you're already way past 150 bucks. It's just not good value. Also, buying used devices like this is better for the environment than buying new, and even if they consume a bit more than newer devices, you overall end up saving energy. (You'd often have to use the slightly less power-consuming devices for decades to make them worth it in their energy cost re: production, transport etc., as the rough numbers below show.) Speed-wise, they're very different hardware platforms and difficult to compare, but the 7z benchmarks I could find for the Pi 4 put the Core M at roughly double the speed with its two cores and HT disabled vs. the four hardware cores of the Pi. (enabling HT here could actually help for once, since it would improve throughput; I just couldn't be arsed) Since a lot of programs are still heavily single-threaded, like the aforementioned emulation, this would probably make the Core M a lot faster in practice.
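The "decades" claim is easy to ballpark. Assuming a new device saves 2W over this one and that laptop production costs on the order of 200kWh (a commonly cited ballpark, not a measured figure), the break-even math looks like this:

```c
/* Back-of-envelope: hours of use until a new, more efficient device
 * has saved the energy its own production cost. Both inputs are
 * assumptions for illustration, not measured figures. */
#include <stdio.h>

int main(void)
{
    double embodied_kwh = 200.0;  /* assumed production energy of a laptop */
    double savings_w    = 2.0;    /* assumed power advantage of new device */
    double hours        = embodied_kwh * 1000.0 / savings_w;

    printf("break-even after %.0f hours (~%.1f years at 8 h/day)\n",
           hours, hours / (8.0 * 365.0));
    return 0;
}
```

With these inputs that lands at roughly 34 years of daily use, hence "decades".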

I could find only little info about the Q616 besides that it was first released in Japan and is apparently targeted more at bulk-buying enterprise customers, not end users. It's also marketed as a "semi-rugged" device; Fujitsu also makes a full-blown rugged version with water resistance and all the bells and whistles. Fujitsu has quite a history when it comes to tablet computers, going all the way back to the 90s, so my guess is they know their stuff. Really like the build quality on this one. Also really like the idea of convertibles, since I hate notebook keyboards.

EDIT: Because this post is a fervent advert for using "outdated" computers, I do want to point out a downside - the lack of some newer technologies. For example, the HD 515 GPU in my chip isn't really VP9/HEVC capable. (it can do HEVC *somewhat*, but not the really interesting profiles) The HP Celeron netbook I have might be a piece of shit and much slower otherwise, but it can do those. For 1080p video (the only resolution that makes sense on the screen) it comes down to a difference of about 1W between decoding a "supported" H264 video in hardware and an H265 10-bit video in software. This is still much more power-saving than my desktop's iGPU, which just decides to ramp up by 10W to do either in hardware. It's still a thing to be considered and would have an effect on battery lifetime in my case, even if minor. (rough math below) The impact would also be much larger if I wanted to e.g. watch a 4K H265 10-bit video; the N4020 would fare better there.
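To put that ~1W in perspective: on a battery of, say, 34Wh (an assumed capacity, I don't have the real spec in front of me), the runtime difference between a 6W hardware-decode baseline and 7W in software comes out to well under an hour:

```c
/* Rough runtime impact of the ~1W hw-vs-sw decode difference.
 * Battery capacity and baseline draws are assumptions. */
#include <stdio.h>

int main(void)
{
    double battery_wh = 34.0;  /* assumed battery capacity */

    printf("hw decode (6 W): %.1f h\n", battery_wh / 6.0);
    printf("sw decode (7 W): %.1f h\n", battery_wh / 7.0);
    return 0;
}
```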
 
Apparently the main issue with the Nvidia cables is that the big-brain engineers forgot to add a way to tell the user when the cable is actually fully plugged into the card. If it's not fully inserted, it melts like butter in a frying pan. When Tech Jesus did a demo on how to plug it in, the cable SOMETIMES made a click telling you it's fully inserted, and sometimes even when it is fully inserted it doesn't click, so there is no actual way to know except to try several times until you hear the click. Worst of all, on the FE editions you have to push very hard to make it click.
 