GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I think it's embarrassing that we went from the 10 series being loved for their efficiency to "you're getting crypto power bills no matter what"
It's a shame. It was always interesting to see what a new architecture could do when drawing power only from the PCIe slot; that's a weight class of sorts.
 
PC sales are declining again after a brief pandemic-related boom:

Traditionally, chip makers have always made the most powerful thing they could in a given package and just expected you to draw as much power from the wall as you needed. But now that gaming PCs are reaching the power draws of major home appliances, I wonder how long this can last.
 
PC sales are declining again after a brief pandemic-related boom:

Traditionally, chip makers have always made the most powerful thing they could in a given package and just expected you to draw as much power from the wall as you needed. But now that gaming PCs are reaching the power draws of major home appliances, I wonder how long this can last.
There have been some mad sales recently on last gen parts. Good buy for anyone wanting to build or upgrade.
 
I ended up with another hand-me-down device because somehow I'm apparently the electronics recycler of the family (???) (this happens when people know you collect old computers)

It's an x86 tablet, an HP Elite X2 1012 G2. Besides its truly obnoxious naming scheme it has an i5-7200U, 8 GB of RAM and a 256 GB SSD, plus a bunch of "business" features like a fingerprint reader, a 4G modem, that kind of stuff. The battery runtime is truly atrocious at around 5 hours; I thought this was because the battery is nearing death, but no, apparently this is normal. (It also has one of those "battery saver" features where you can leave the device plugged in all the time and it won't try to charge the battery past 80%, because you're going to leave this thing plugged in most of the time is my guess.) The screen is absolutely brilliant at ~12" 2736x1824 (3:2). Sadly it's actively cooled, which is very 00s and not exactly the best feature in a mobile device. To be fair though, the fans are pretty quiet and only turn on at higher load, meaning normal browsing etc. can be done in silence. (Haven't tried undervolting yet and not sure it's possible; if it is, there's probably headroom.) Apparently there are faster CPUs in that line, but in my tests with extreme loads the CPU runs into thermal limits, so they're probably kinda pointless.
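On Linux, some machines expose that firmware charge cap through the kernel's power-supply class. Whether this particular HP does is anyone's guess; the battery name (BAT0) and the whole path are assumptions, so this is just a guarded sketch:

```shell
# Hypothetical sketch: set a firmware charge cap from Linux via the
# power_supply sysfs class. The battery name (BAT0) is an assumption,
# and many vendors (often including HP) don't expose this knob at all.
T=/sys/class/power_supply/BAT0/charge_control_end_threshold
if [ -w "$T" ]; then
    echo 80 > "$T"    # stop charging at 80%
else
    echo "charge threshold not exposed on this machine"
fi
```

If the attribute isn't there, the cap lives purely in firmware and you're stuck with whatever the BIOS setting offers.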

Linux has absolutely nothing on the HP firmware and you can't even control the fans from the OS. Not exactly news, as HP hates Linux. Funnily enough, the touchscreen is supported just fine, and according to search engines the Wacom drawing function works too. (Can't test; there's no pen.)

My main reason for even talking about this here is this, though:


There absolutely ought to be a law requiring all mobile devices to be built like this. I tried replacing a broken USB port on a Samsung tablet for a family member a while ago and basically destroyed the device in the process, because they glued and plastic-melted it together.

Not sure what to even do with this thing but hey!
 
There absolutely ought to be a law requiring all mobile devices to be built like this. I tried replacing a broken USB port on a Samsung tablet for a family member a while ago and basically destroyed the device in the process, because they glued and plastic-melted it together.
Absolutely agree. Not being able to swap batteries should also be made illegal. The greens constantly screech about reducing "MUH carbon footprint" and companies simply react by screwing over their customers, skimping on packaging quality and leaving out essential accessories, such as the fucking charger block. How about not gluing the internals together and allowing people to easily service their devices? Maybe, just maybe, we would generate less electronic waste if we didn't have to throw away a few-year-old phone just because the shitty battery gave out.
 
It's all but confirmed that the Radeon RX 7900 XTX will try to take on the 4090 with 24 GB of VRAM, while the 7900 XT will take on the 4080 16 GB with 20 GB of VRAM.

There is room for AMD to double Infinity Cache from 96 to 192 MB using 3D stacking, increase memory bandwidth, clocks, etc. if it wants to take on a 4090 Ti in the future.

It's an x86 tablet, an HP Elite X2 1012 G2. Besides its truly obnoxious naming scheme it has an i5-7200U, 8 GB of RAM and a 256 GB SSD, plus a bunch of "business" features like a fingerprint reader, a 4G modem, that kind of stuff. The battery runtime is truly atrocious at around 5 hours; I thought this was because the battery is nearing death, but no, apparently this is normal. (It also has one of those "battery saver" features where you can leave the device plugged in all the time and it won't try to charge the battery past 80%, because you're going to leave this thing plugged in most of the time is my guess.) The screen is absolutely brilliant at ~12" 2736x1824 (3:2).
I'd try to use that tablet. i5-7200U looks decent for a dual-core, faster than chips like the N4020 that are still being sold. 8 GB of RAM and 256 GB SSD are more than you get at the extreme low-end. I remember the days of 2-3 hours of battery life being normal so 5 hours is OK for me too.

Given the large screen, maybe you could keep it plugged in and hang it on a wall.
 
Not being able to switch batteries also should be rendered illegal
Interestingly enough, I kinda researched business tablets like that one (or "convertibles" as they like to be called, because you can attach a keyboard), and them being maintainable apparently is par for the course. Surprised me too. I didn't know because I was never interested in such devices.

I'd try to use that tablet. i5-7200U looks decent for a dual-core, faster than chips like the N4020 that are still being sold. 8 GB of RAM and 256 GB SSD are more than you get at the low-end. I remember the days of 2-3 hours of battery life being normal so 5 hours is OK for me too.

The 7200U with its 15 W TDP is too "big" for that small case. Sacrificing some speed and fashioning a better passive cooling solution would maybe allow it to go fanless even in a case like this, but limiting the TDP envelope via Intel's RAPL hits it very hard: do that and it's soon slower than even my N4020 in benchmarks (and even in subjective performance), and yet the fans will still turn on eventually, just maybe two minutes later. While a better SoC on paper, its age is showing in the efficiency department. This is also in the best fall weather; my guess is that in summer those fans just run permanently and won't stop the CPU from hitting hard temperature limits. I don't know about any of you, but a fan in a mobile device that's probably right in front of your face is just an absolute no-go. I'd never. Not nowadays. Absolutely gorgeous screen though, and it would look nice on a wall. I was low-key hoping it could replace my cheap "net"book but it can't really.
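For anyone wanting to try the RAPL limiting mentioned above: on Linux it's usually done through the powercap sysfs interface. A hedged sketch, assuming the common intel-rapl layout; the domain path varies per machine (and the 7 W figure is just an arbitrary example), and vendor firmware can clamp or ignore whatever you set:

```shell
# Sketch: cap the sustained package power limit (PL1) via intel-rapl's
# powercap sysfs. Values are in microwatts, so 7 W = 7000000 uW.
# Domain numbering differs per machine; firmware may override the value.
PL1=/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw
LIMIT_UW=$((7 * 1000000))
if [ -w "$PL1" ]; then
    echo "$LIMIT_UW" > "$PL1"
else
    echo "powercap not writable here (need root / RAPL support)"
fi
```

Reading the same file back shows the currently active limit, which is a quick way to check whether the firmware silently rejected the write.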

My HP Stream 11 with the aforementioned Celeron is otherwise decent (really, the small Celeron is enough for what I do with it), although the 1366x768 "90 degree viewing angle" TN screen is an atrocity and *will* eventually give you a headache. Funnily enough, even though it's a low-end device, HP has lengthy videos up on YouTube on how to take it apart and maintain it. I was thinking about trying to replace the panel with a decent IPS panel, but even after lengthy research I couldn't really find anything I could say with confidence would fit.

This is generally a problem with the low-end SoCs: they're put at a disadvantage. They're often decent chips if you don't run winbloat but a lean Linux you built yourself, like yours truly, and often even a toss-up in power consumption and speed with high-end ARMs (while being 100000% more compatible). But the hardware they come with is always the cheapest, most bottom-of-the-barrel stuff that's humanly possible (too little RAM, too-slow RAM, single-channel, shitty connection options, terrible screen, weirdly small batteries, slow, small soldered-on eMMC etc. etc.), pulling these chips down and making them perform much worse in the day-to-day than they would with decent peripheral hardware. Also it looks like COVID just kinda killed the low-end market. Seems like everyone is focusing on premium products.

Then on the other side you have devices with all the fixings but usually a CPU (and sometimes even a dedicated GPU) that's terribly oversized for a mobile device of that type, leaving you to deal with loud fans, chips hitting temperature limits and performing worse, hot cases, empty batteries, and laptops that are only as mobile as the extension cord allows. Then people throw their hands up and say x86 can't do mobile.

Then you have weird niche products "designed for Linux" that may or may not be vaporware: slow Celerons with actually decent screens for 900 bucks.

It's like the market does not want to have decent devices.

/rant

I kinda got interested in small "convertibles" (I feel combining them with a small mech keyboard could be cool) and I might actually end up flipping this thing on eBay and looking for a small 10" device with an insanely low TDP chip (e.g. an m7-6Y75?). I don't want to pay a lot though.
 
It's all but confirmed that the Radeon RX 7900 XTX will try to take on the 4090 with 24 GB of VRAM, while the 7900 XT will take on the 4080 16 GB with 20 GB of VRAM.

There is room for AMD to double Infinity Cache from 96 to 192 MB using 3D stacking, increase memory bandwidth, clocks, etc. if it wants to take on a 4090 Ti in the future.
I really want this to be true, real "competition" across the entire range.
 
I just wanted to say that I enjoy your posts about low-power computing even though I'm a God Damned Burger with cheap electricity on-tap.
Thanks, I also enjoy collecting the information here. Tbh, the energy costs are not necessarily always the primary concern. Energy is expensive here, but more in the sense of a 3080 Ti running 24/7 affecting my power bill; the theoretical difference between a 4.5 W TDP chip and a 15 W TDP chip of course doesn't really matter. It's also about minimalism and the idea of doing the maximum with the minimum. There's something pleasing in having the system be "exactly enough" and getting the most out of cheap, low-end hardware, which usually also is low-power hardware. At least for my usage, things don't really shift a whole lot over the years in how much performance I need, as I try to keep all my systems trim, handwoven and free from bloat, so even a computer that's sub $100 used can still be useful to me. Also, I really like small computers and silent computers, and it's all connected to low power consumption. You usually can't do a small, quiet computer with a 95 W TDP CPU; that energy has to go somewhere.
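To put a number on why the 4.5 W vs. 15 W difference barely matters on the bill, here's a back-of-the-envelope calculation. The usage pattern (8 hours a day, flat out) and the price (0.40 EUR/kWh) are made-up assumptions, not anyone's actual figures:

```python
# Back-of-the-envelope: yearly electricity cost difference between a
# 4.5 W and a 15 W TDP chip, assuming (hypothetically) both run at
# full TDP 8 hours a day at 0.40 EUR/kWh.
def yearly_cost(watts, hours_per_day=8, price_per_kwh=0.40):
    kwh = watts / 1000 * hours_per_day * 365   # energy used per year
    return kwh * price_per_kwh

delta = yearly_cost(15) - yearly_cost(4.5)
print(f"{delta:.2f} EUR/year")  # -> 12.26 EUR/year
```

And real loads idle far below TDP most of the time, so the real-world gap is even smaller than that worst case.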

I had no luck finding a used 10" device, and that probably would be a bit of a wild size anyway. The next best convertible I could find that is also reasonably available on the local market is 11.2" with an m7-6Y75, 8 GB of RAM and a 256 GB SSD. It's got a decent 1080p IPS screen, so that would fix my biggest complaint with my Celeron, while also being a bit more of a standard screen format for media and not so high-resolution as to be too hard on the small iGPU. It's the Stylistic Q616 from Fujitsu; it has a weird retro charm to its design and is also seemingly a very oddball device, since there's basically zero information on it anywhere. I can only guess it's popular in Japan, as this is the only video I could find:


Note the shape and thickness of the tablet, the keyboard with its touchpad with dedicated buttons, and the "neck" the screen sits on. There's some weird 80s vibe in that design. I could find out that the back basically just clips in and the SSD etc. are replaceable. Price-wise, if I buy this and sell the Elite X2, I should come out with money on top.
 
Well, taking this thread from the admirable use-only-what-you-need ultra low-end to the shameful there's-no-way-this-is-a-necessity end: AMD are currently holding their live reveal event for their 7000 series / RDNA3 GPUs.

They're announcing a 54% performance-per-watt gain, which is pretty awesome and great news for mobile devices as well as our electric bills. And it's confirmed there's a card with 24 GB. Still waiting on solid pricing... Are they going to do an Nvidia, or will they try to grab market share and go lower...? Going to watch a bit more and post back. This is slightly exciting for me because I haven't upgraded my card in over six years and I've been waiting on this event to decide what to do. And whilst Nvidia currently has the edge in AI graphics, I'm actually interested in trying my hand at this area, so I finally have a reason to upgrade.
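Taking that 54% claim at face value (it's a marketing number, so treat it as best-case), it's easy to see what it implies whichever way you spend it:

```python
# If perf/watt really is 1.54x, then at equal board power you'd get
# 1.54x the frames, and at equal frames you'd need roughly 65% of
# the power. Marketing figure, so best-case on a cherry-picked load.
gain = 1.54
equal_power_speedup = gain       # same watts, more frames
equal_perf_power = 1 / gain      # same frames, fewer watts
print(f"speedup at same power: {equal_power_speedup:.2f}x")
print(f"power at same perf:    {equal_perf_power:.0%}")  # -> 65%
```

Either way it's good news for laptops, where the power budget is fixed and perf/watt is the whole game.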

EDIT: Display Port 2.1. Suck that Nvidia! :biggrin:

EDIT EDIT: AV1 ENCODING! MF'ing yes!

EDIT EDIT EDIT: Streaming here if anybody wants to watch:

EDIT EDIT EDIT EDIT: AMDude just showed their top card (7900XTX) running Assassin's Creed: Valhalla at 96 FPS at 8K. Someone in the crowd audibly exclaimed "Oh my God!" in response, provoking laughs. Note, he wasn't recommending you play at that, just showing off that you could. He did however announce that a range of display manufacturers are launching 8K TVs and monitors next year that use DisplayPort 2.1, and he's getting some mileage in the talk out of AMD being on the latest technology when Nvidia is not. Also a sly little dig about not needing to upgrade your cables.
 
A different fetch/decode speed? Old games are going to break for sure.
1667508709855.png


Biggest news of the day: FSR 3
1667508778862.png

Top-end for $1000 makes this a winner!
1667508873962.png
 
I understand I should chill my tits, it's true. I am a little caught up in the excitement of this, but... this really is pretty nice stuff. For a start, I wasn't at all sure we would get AV1 encoding. Not only do we have that, but it appears to be pretty fast based on what they said (7x speed over CPU), and it has some weird, neat way of distributing encoding work with the CPU, which I'm going to have to look into more.
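For anyone wondering how that encoder would actually get used: recent ffmpeg builds expose VA-API AV1 encoding via the `av1_vaapi` encoder, which is presumably how RDNA3's block would be driven on Linux. The exact flags and the quality setting below are assumptions, just a sketch of the shape of the command:

```shell
# Hypothetical invocation: hardware AV1 encode through VA-API.
# Needs an ffmpeg built with av1_vaapi and a Mesa driver that
# exposes AV1 encode; -qp 30 is an arbitrary starting point.
ffmpeg -hwaccel vaapi -hwaccel_output_format vaapi \
       -i input.mkv \
       -c:v av1_vaapi -qp 30 \
       -c:a copy \
       output.webm
```

`ffmpeg -encoders | grep av1` is the quick way to check whether your build has any hardware AV1 encoder at all.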

Yes, the demo might have been with FSR 3 but that in itself is also pretty cool because apparently FSR 3 has double the performance of its predecessor. I haven't used this in person yet (ancient card, like I said) but from what others have said it's actually very good. I have a very large monitor so FSR technology is a very big deal for me. I don't want it for massive and largely useless frame rates. I want it so that I can run a game full screen at normal frame rates (60fps is totally fine for me).

As to the pricing? Compared to Nvidia it's way better. Nvidia are probably going to have to drop their prices in response, which will hurt them a lot more than AMD, because they have monolithic dies, which means their yields have to be a lot worse than AMD's.

$899 for the 7900 plus not needing to upgrade my PSU like I would with the Nvidia high end is doable for me. And with 20GB it's going to be a very effective performer for machine learning. Even with the AMD penalty it should be a beast with Stable Diffusion which I really want to try (no - not for porn!).

This is looking really good. I'm trying to chill my tits, I promise. But they're just so hot right now!
 

Tech Jesus gives his take. Barring all the corpo bullshit: it's not a freaking overdesigned block, you don't need a new adapter, and it's 600 dollars less than the 4090 for what, 15% less performance? This is a huge deal.

People in the chat were praising AMD for not doing an Nvidia and just ramping power consumption to get numbers.
 