GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Are they making ARM for desktops? I know they are working on their Horse Creek platform, but that uses RISC-V tech from SiFive, not ARM. I can't find anything about Intel-branded ARM processors being made, and ARM is going hard into the server space, which is Intel's bread and butter.

The Arc thing reminds me of when they got into the phone space, because I was dumb enough to buy an x86 Android phone which, while it ran really fast, had shit battery life and next to no support, and the build quality was meh at best.
I believe the closest we have to that in the consumer space is the Apple M1 MacBooks, which have an ARM CPU in them.
 
I believe the closest we have to that in the consumer space is the Apple M1 MacBooks, which have an ARM CPU in them.
They do have the M1 Mac Mini as well.

There are a ton of random single-board computers (like a Raspberry Pi, but fast) out there with ARM, but not many people outside geekdom are using them as desktops. For instance: https://arstechnica.com/gadgets/202...h-a-new-dev-kit-and-arm-native-visual-studio/

Windows now supports ARM desktops, but before that your only options were really Linux and Linux-related stuff (ChromeOS).
 
  • Like
Reactions: The Ghost of Kviv
Are they making ARM for desktops? I know they are working on their Horse Creek platform, but that uses RISC-V tech from SiFive, not ARM. I can't find anything about Intel-branded ARM processors being made, and ARM is going hard into the server space, which is Intel's bread and butter.
No current ARM projects publicly known. RISC-V and ARM are used in small ICs all over, though. Nvidia uses RISC-V for power control on FE cards, IIRC. Intel has those deep pockets to throw billions at new projects, and some notable pockets in Washington. Even with the lead AMD/Nvidia had, Intel was at the top of the list when it came to handing out gibs to silicon companies. If it means Intel switches to ARM to build some new DoD project, both the American and Israeli teams can hammer out good designs when prodded.

The Arc thing reminds me of when they got into the phone space, because I was dumb enough to buy an x86 Android phone which, while it ran really fast, had shit battery life and next to no support, and the build quality was meh at best.
Atom has improved significantly since the netbook days; Alder Lake's Gracemont is a good example of that.

I believe the closest we have to that in the consumer space is the Apple M1 MacBooks, which have an ARM CPU in them.
As @davids877 mentioned, ARM is everywhere now; Microsoft's new "dev kit" is ARM: https://learn.microsoft.com/en-us/windows/arm/dev-kit/. Chromebooks are often ARM, along with ARM's new dedicated client-side designs, the X-series: https://en.wikipedia.org/wiki/ARM_Cortex-X1.
 
  • Informative
Reactions: Post Reply
I can say from experience that the current-gen Celerons/Atoms are really decent chips speed-wise. They are pretty much able to do anything you'd want that's not gaming related (light indie games and emulation of the older 16-bit systems are fine, though). If you're not a retard and install things like uBlock in your browser, you'll get a snappy experience. In power consumption, they're a toss-up with the high-end ARMs and sometimes might even consume less.

The driver support is absolutely atrocious, and pretty much all current ARM SoC manufacturers don't really care about anything besides shoveling their chips onto disposable Android devices, because that's where the real money is. Almost all Linux driver support for ARM SoCs is the result of reverse engineering, and usually, when some volunteer developers make headway on an SoC, a new one comes out that everyone flocks to and development is basically abandoned.

That list of fancy features the SBC seller is promising on his premium board? Forget about all of them. If you're very lucky, you might get 3D acceleration (that'll forever be subpar to even Intel iGPUs, or to the same SoC's performance on Android) and hardware video decoding, usually not even for all the codecs the SoC supports. If you're lucky.

All the ARM desktop nerds have a serious case of Stockholm syndrome. They celebrate things about their ARM SBCs as huge breakthroughs that are completely standard and not worth mentioning on every sub-$100 Celeron ITX board. Most of all, with x86, shit just works. The driver support is excellent and done by the actual manufacturer (even in Linux), and you're compatible with all x86 binaries. As a desktop user, there's literally no reason to go with ARM outside Apple's stuff, if you can live with it.

I can't imagine Intel's gonna change the game there.
 
Wasn't China messing around with ARM computers for domestic and government use as a way to get away from Intel? I know Huawei created something like that a couple of years back, and they're awfully tight with the government.

The ARM motherboard and CPU for desktop use.
 
  • Informative
Reactions: Judge Dredd
No current ARM projects publicly known. RISC-V and ARM are used in small ICs all over, though. Nvidia uses RISC-V for power control on FE cards, IIRC. Intel has those deep pockets to throw billions at new projects, and some notable pockets in Washington. Even with the lead AMD/Nvidia had, Intel was at the top of the list when it came to handing out gibs to silicon companies. If it means Intel switches to ARM to build some new DoD project, both the American and Israeli teams can hammer out good designs when prodded.


Atom has improved significantly since the netbook days; Alder Lake's Gracemont is a good example of that.
Idk if funding RISC-V is the right path for Intel, given that the one thing keeping them on top is their x86 duopoly; without that they wouldn't even be ARM, they'd probably have gotten bought out ages ago by other companies doing better x86 silicon. And RISC-V is FOSH: literally any half-assed company with access to a foundry can churn out a design, just like everybody can make a SPARC clone, except the latter doesn't have anywhere near as much support as RISC-V has/will continue to have.

I think Intel should really, really focus more on GPUs and related fields just like Nvidia has, and pray that x86's legacy baggage will keep the market from flipping to ARM or RISC-V completely. Like I said, the datacenters are already going ARM because of energy costs.

As for Atom, that phone was already over half a decade after netbooks; again, not a bad chip, but much like Arc they didn't bother to put in enough funding or sweeten the deal for devs to support it, and then abandoned it like Google abandons every product worth caring about.
 
Wasn't China messing around with ARM computers for domestic and government use as a way to get away from Intel? I know Huawei created something like that a couple of years back, and they're awfully tight with the government.

The ARM motherboard and CPU for desktop use.
View attachment 3809544
China is schizo-tier paranoid about American spy circuits in desktop chips. That's why they bent over backwards to get AMD to sell them Zen 1 designs through some subsidiary pipeline from Taiwan.

Idk if funding RISC-V is the right path for Intel, given that the one thing keeping them on top is their x86 duopoly; without that they wouldn't even be ARM, they'd probably have gotten bought out ages ago by other companies doing better x86 silicon. And RISC-V is FOSH: literally any half-assed company with access to a foundry can churn out a design, just like everybody can make a SPARC clone, except the latter doesn't have anywhere near as much support as RISC-V has/will continue to have.
Intel was commanded by the USA to allow licensing of x86; off the top of my head, it was i686/Pentium Pro. To combat this, Intel created Itanium, but that was ignored by everyone in favour of amd64. Intel has been fairly successful ($$$) by adding SIMD extensions to x86, but a lot of that success has come from paying for dev time to build the extensions into applications.

I think Intel should really, really focus more on GPUs and related fields just like Nvidia has, and pray that x86's legacy baggage will keep the market from flipping to ARM or RISC-V completely. Like I said, the datacenters are already going ARM because of energy costs.
Being lazy has been Intel's bane over the last decade; Pat Gelsinger noted as much when he became CEO. Praying gave us Skylake; working gave us Alder Lake.

As for Atom, that phone was already over half a decade after netbooks; again, not a bad chip, but much like Arc they didn't bother to put in enough funding or sweeten the deal for devs to support it, and then abandoned it like Google abandons every product worth caring about.
I'm sure it was more Google's usual schizo teams that caused that. Maybe a mix of TSMC being cheap and being tied to Intel's roadmap?
 
I can say from experience that the current-gen Celerons/Atoms are really decent chips speed-wise. They are pretty much able to do anything you'd want that's not gaming related (light indie games and emulation of the older 16-bit systems are fine, though). If you're not a retard and install things like uBlock in your browser, you'll get a snappy experience. In power consumption, they're a toss-up with the high-end ARMs and sometimes might even consume less.

The driver support is absolutely atrocious, and pretty much all current ARM SoC manufacturers don't really care about anything besides shoveling their chips onto disposable Android devices, because that's where the real money is. Almost all Linux driver support for ARM SoCs is the result of reverse engineering, and usually, when some volunteer developers make headway on an SoC, a new one comes out that everyone flocks to and development is basically abandoned.

That list of fancy features the SBC seller is promising on his premium board? Forget about all of them. If you're very lucky, you might get 3D acceleration (that'll forever be subpar to even Intel iGPUs, or to the same SoC's performance on Android) and hardware video decoding, usually not even for all the codecs the SoC supports. If you're lucky.

All the ARM desktop nerds have a serious case of Stockholm syndrome. They celebrate things about their ARM SBCs as huge breakthroughs that are completely standard and not worth mentioning on every sub-$100 Celeron ITX board. Most of all, with x86, shit just works. The driver support is excellent and done by the actual manufacturer (even in Linux), and you're compatible with all x86 binaries. As a desktop user, there's literally no reason to go with ARM outside Apple's stuff, if you can live with it.

I can't imagine Intel's gonna change the game there.

The 8-core Atoms will be legit. The article author seems to think they will only support single-channel memory, which is doubtful, since even chips like your N4020 support dual-channel.

AdoredTV released a video talking about RDNA3:
It's all about the chiplets going forward. Also, maybe they sandbagged the reference design so that AIBs could put out 450 Watt models with higher clocks, using 4090 coolers.
 
  • Like
Reactions: The Ghost of Kviv
China is schizo-tier paranoid about American spy circuits in desktop chips. That's why they bent over backwards to get AMD to sell them Zen 1 designs through some subsidiary pipeline from Taiwan.
They could've bought VIA instead; they still have the Cyrix patents.

And if China really cared about security, they would move to another architecture. Instead they are making the same mistake the Soviets did cloning IBM 360s so they could use/pirate the existing software, which let the CIA go and plant a bunch of bugs and backdoors and wreak havoc on Soviet systems.
Being lazy has been Intel's bane over the last decade; Pat Gelsinger noted as much when he became CEO. Praying gave us Skylake; working gave us Alder Lake.
They were lazy before that too; remember NetBurst? The Pentium M architecture basically saved them from being destroyed by the Athlon 64.
I'm sure it was more Google's usual schizo teams that caused that. Maybe a mix of TSMC being cheap and being tied to Intel's roadmap?
Google has a tendency to half-ass stuff and abandon products even when there's clear market interest and tons of hype; see the Ara phone.
 
  • Like
Reactions: The Ghost of Kviv
The 8-core Atoms will be legit. The article author seems to think they will only support single-channel memory, which is doubtful, since even chips like your N4020 support dual-channel.
Since they often end up in cheap devices where every penny is pinched, single channel is usually all they get in practice (which, together with often sub-par cooling, really hits them right in the performance). I do agree, though, that it would be a strange place to save on the chip design. Then again, they know their customers, and if you build devices on a budget as, say, a low-end notebook designer, that's a very good place to fudge things and keep costs down, because those specs don't go into the advertising for the device, nor will most people buying them know to look for them. If you can stick more cores into something, or a higher model number, that draws in the buyers. It doesn't even matter if your shitty cooling solution won't be able to sustain the theoretical speeds.
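A quick back-of-the-envelope on why the missing second channel hurts: peak memory bandwidth scales linearly with populated channels. A minimal sketch, assuming DDR4-3200 and the standard 64-bit (8-byte) channel width; the config is hypothetical, just to put numbers on it:

```python
def peak_bandwidth_gbs(mt_per_s: int, channels: int, bytes_per_transfer: int = 8) -> float:
    """Theoretical peak DRAM bandwidth in GB/s.

    mt_per_s: transfer rate in MT/s (e.g. 3200 for DDR4-3200)
    channels: number of populated memory channels
    bytes_per_transfer: bytes per transfer per channel (64-bit channel = 8 bytes)
    """
    return mt_per_s * bytes_per_transfer * channels / 1000

print(peak_bandwidth_gbs(3200, 1))  # single channel -> 25.6 GB/s
print(peak_bandwidth_gbs(3200, 2))  # dual channel   -> 51.2 GB/s
```

And on these chips the iGPU eats out of the same pool, so halving it is very noticeable.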

Google has a tendency to half-ass stuff and abandon products even when there's clear market interest and tons of hype; see the Ara phone.
It usually goes as follows:
1. offer an almost suspiciously good product
2. cripple that product until everyone stops using it, usually completely missing the point
3. abandon it citing lack of market interest
 
  • Informative
Reactions: Brain Problems
The 8-core Atoms will be legit. The article author seems to think they will only support single-channel memory, which is doubtful, since even chips like your N4020 support dual-channel.
Finally, the magical all-E-core chip I was waiting for. If it works on W680 with the full feature set, it would mean it's possible to build a power-sipping system with ECC.
 
Finally, the magical all-E-core chip I was waiting for. If it works on W680 with the full feature set, it would mean it's possible to build a power-sipping system with ECC.
I assume you need the Elkhart Lake equivalent to get features like ECC.

Honestly, I was initially hesitant on the 7900 XTX, but if the chiplet design proves to be viable, then I might snag it.
Wait for the reviews, but it certainly looks viable. It looks like it can smash the RTX 4080 16 GB in raster while losing in raytracing. And in compute, unless they can fix that up.

I think they got around certain issues by including only one graphics chiplet. There were rumors and patents about including two of them, and they will have to do that eventually. With chiplets, you can go far above the reticle limit.
 
  • Thunk-Provoking
Reactions: The Ghost of Kviv
Yes. And it is making me literally angry irl. Webshit developer showing off his ultra-widescreen monstrosity. 86 DPI. For Text. And he sits right in front of it.

I bought a small mobile screen with a resolution of 2560x1600 for my desktop just because I wanted something 200+ DPI. The difference in eye comfort when reading lots of text is night and day. I tested a lot of monitors around that time; for reference, I am ancient as fuck but had one of my eyes lasered, so I don't need glasses. The jump from 90 to 140 DPI is dramatic, and then it just sort of falls off: 190 DPI to 230 DPI is still noticeable, 230 DPI to 270 DPI a lot less so. The general opinion is that anything 300+ has pixels so small that they can't be individually seen by the human eye anymore, so differences in sharpness stop being distinguishable. Most consumer monitors are ~100 DPI no matter the resolution, because the resolution usually grows with the size of the screen (my uneducated guess is that the 100-ish value is the sweet spot for cost-effective panel manufacturing), which is actually really shitty once you've used anything better. It's not really noticeable in fast-moving games and video, but with vector text and static pictures it really is. Another advantage of these high pixel densities is that you can basically live without anti-aliasing, and slight scaling often won't even be noticeably blurrier than native resolution because there are more pixels to work with for interpolation.
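For reference, DPI is just the diagonal resolution divided by the diagonal size in inches. A quick sketch; the panel sizes below are assumptions, just to put numbers on the ranges mentioned above:

```python
import math

def dpi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(2560, 1440, 27)))    # typical 27" 1440p desktop monitor -> ~109 DPI
print(round(dpi(2560, 1600, 13.3)))  # a 13.3" 2560x1600 portable panel  -> ~227 DPI
print(round(dpi(3840, 2160, 15.6)))  # a 15.6" 4K laptop screen          -> ~282 DPI
```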

In a sane world, 16:10 and 3:2 would be the default aspect ratios for gaming and work respectively, and 200-300 DPI would be the absolute standard in sizes up to and including 15". 4K and such do make sense then. We do not live in a sane world. (It does look like laptop manufacturers are picking up on the fact that people who try it usually end up loving it, though; a thing smartphone/tablet manufacturers knew all along.)
For fonts, Microsoft has been tuning that shit for ages tho, so even lower DPI should still be somewhat acceptable (compared to the non-smoothed rendering back then).

tbh since I got an ebook reader I hardly use the screen anymore for serious reading.

I always try to go with it if it is an option. Memory corruption can be really evil because it can be silent and can even propagate into your backups. There's also the lesser-known risk of corruption from faulty memory/controllers/flash chips in storage devices, which is just as bad and can corrupt stuff just as silently without you noticing. There are filesystems like btrfs and ZFS that have built-in protection against that via checksumming, though.

If you want ECC, it's basically AMD or bust, as Intel doesn't do it in consumer-grade hardware. AMD does, and almost all consumer-grade AMD boards support it too, even if it's sometimes not officially advertised. I'd check to make sure, though; some boards "accept" ECC RAM but don't have the wiring to actually implement the ECC feature, and your processor/OS reporting that it works doesn't mean it does. The easiest way to provoke bit-flip errors without lasting damage is to overheat the RAM until ECC errors pop up. I have ECC RAM in my desktop. Be aware that it's by definition slower than normal RAM, though.
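On Linux, one way to sanity-check that ECC is actually live (rather than just tolerated by the board) is to see whether the kernel's EDAC subsystem registers a memory controller and then watch its error counters while stressing the RAM. A minimal sketch, assuming the standard EDAC sysfs layout and that the right edac driver is loaded:

```python
from pathlib import Path

EDAC_MC = Path("/sys/devices/system/edac/mc")

def print_ecc_counters() -> None:
    """Print corrected/uncorrected error counts per EDAC memory controller."""
    if not EDAC_MC.exists():
        print("No EDAC memory controllers registered - ECC is probably not active.")
        return
    for mc in sorted(EDAC_MC.glob("mc*")):
        ce = (mc / "ce_count").read_text().strip()  # corrected (single-bit) errors
        ue = (mc / "ue_count").read_text().strip()  # uncorrected (multi-bit) errors
        print(f"{mc.name}: corrected={ce} uncorrected={ue}")

if __name__ == "__main__":
    print_ecc_counters()
```

If the directory is missing, or the counters never move even while provoking errors, the "ECC support" is probably decorative.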
Reason I'm not so settled on ECC is that I expect it to only be an issue for me if files get corrupted while writing (and then you're stuck with it). For most stuff I archive/store long-term, I generate checksums anyway and do a check after a copy to make sure the file itself is identical.

The only thing I can hardly find any info about is whether md5/crc32 could catch memory corruption this way, because for me, once the file is written correctly, that's fine; it usually only ever gets read afterwards.
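A minimal sketch of that copy-then-verify routine, using SHA-256 instead of md5/crc32 (the paths are hypothetical). It also partly answers the question: any decent checksum will catch corruption that happens between hashing the source and re-reading the copy, but it can't vouch for data that was already flipped in RAM before the first hash was taken, and the re-read may be served from the page cache rather than from the disk itself.

```python
import hashlib
import shutil
from pathlib import Path

def file_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """SHA-256 of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def copy_verified(src: Path, dst: Path) -> None:
    """Copy src to dst, then re-read dst and compare digests."""
    before = file_digest(src)
    shutil.copy2(src, dst)
    after = file_digest(dst)
    if before != after:
        raise IOError(f"checksum mismatch copying {src} -> {dst}")

# copy_verified(Path("archive.tar"), Path("/mnt/backup/archive.tar"))  # hypothetical paths
```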
 
For fonts, Microsoft has been tuning that shit for ages tho, so even lower DPI should still be somewhat acceptable (compared to the non-smoothed rendering back then).
There's a huge history behind these technologies and their patents, but basically, on low-DPI screens there's some cheating going on: the subpixels (the red and blue cells, in the standard alignment) of the neighboring pixels around a letter's pixels are used to artificially triple the horizontal resolution for fonts. Some people, especially on monitors with a poor gamma curve, can see this as a colored/rainbow fringe around letters. (This also naturally only works properly if the monitor's rotation isn't changed.) To make fonts appear sharper and less blurry, there's also font hinting happening, basically realigning the font's shape to fit better to the pixel grid (and losing some of the font's original shape and making some fonts look godawful in the process). Yeah, it works, but it ain't pretty. Windows ClearType is configured for shitty screens and is pretty aggressive by default.

Apple goes yolo and just puts the letters on their screens however they look, without any hinting. There's also no subpixel anti-aliasing; all Apple does is grayscale AA, giving them that soft Apple look. They can afford it because Apple screens are usually all very high DPI. This makes the font rendering pipeline a lot simpler, and there are sometimes problems with subpixel RGB font rendering and 3D-accelerated stuff; if anyone has ever wondered why text seems really blurry on some websites and apps, this is the reason.

Linux's default font rendering can go either way, and it's up to how you configure it.
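To make the difference concrete, here's a toy sketch of the two approaches described above. This is not any vendor's actual pipeline (real ClearType also filters across neighbouring subpixels and gamma-corrects to tame the fringing), just the core idea: render glyph coverage at 3x horizontal resolution, then either spread the samples over the R/G/B stripes or average them into one gray value.

```python
import numpy as np

def subpixel_aa(coverage_3x: np.ndarray) -> np.ndarray:
    """(H, 3*W) glyph coverage in 0..1 -> (H, W, 3) RGB, one sample per subpixel stripe."""
    h, w3 = coverage_3x.shape
    return coverage_3x.reshape(h, w3 // 3, 3)

def grayscale_aa(coverage_3x: np.ndarray) -> np.ndarray:
    """Same input -> (H, W) single gray coverage per pixel (the soft 'Apple' look)."""
    h, w3 = coverage_3x.shape
    return coverage_3x.reshape(h, w3 // 3, 3).mean(axis=2)

# A glyph edge that covers two-thirds of one pixel:
edge = np.array([[1.0, 1.0, 0.0]])
print(subpixel_aa(edge))   # [[[1. 1. 0.]]] -> red+green stripes lit, blue dark (the colour fringe)
print(grayscale_aa(edge))  # ~0.667 -> one uniform gray value
```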

tbh since I got an ebook reader I hardly use the screen anymore for serious reading.
There are e-ink monitors that even go up to about 40 fps, but they cost around a thousand bucks for 13". Having light reflected off the surface you read instead of shining directly into your eyes will always be more comfortable, but I haven't been crazy enough yet to spend $1000 on such a screen, and they're also not really all-rounders. The next best thing is OLED.

Reason I'm not so settled on ECC is that I expect it to only be an issue for me if files get corrupted while writing (and then you're stuck with it). For most stuff I archive/store long-term, I generate checksums anyway and do a check after a copy to make sure the file itself is identical.

This has nothing really to do with the checksumming method. The fundamental problem is that when your RAM is defective, you can't trust your computer. Say you have a program check the checksums of the files on your hard drive and it reports about 4% of them as mismatches. Is that because they actually got corrupted, or because your computer's RAM is faulty and the data the computer shoveled into RAM to calculate the checksums got corrupted along the way, making them only appear corrupted? Or maybe all checksums match but the data doesn't, because the checksumming function itself failed at some random point but returned 0? Who knows. A computer with corrupt RAM might also just trash the filesystem that data is on, or make encrypted files unrecoverable because keys don't match, etc.; with silently defective RAM, anything can happen. Granted, a lot of these things are edge cases and rare, and it's more likely than not that corrupt RAM is very noticeable, but stranger things have happened.
 
It usually goes as follows:
1. offer an almost suspiciously good product
2. cripple that product until everyone stops using it, usually completely missing the point
3. abandon it citing lack of market interest
Stadia is a good example; it actually ran "well enough" for any non-PC gamer and would have been amazing for people who couldn't afford a PC or a PS5 during the scarcity of the pandemic.

But they crippled it by forcing you to rebuy all your stuff instead of letting you use your Steam library like Nvidia does, or better yet offering something like Xbox Game Pass or a Netflix-style subscription; the latter would've had people waiting in line for a Stadia account.

And to top it off, they only launched it in really wealthy markets where anyone could pay 3x MSRP to scalpers for a 3080 or a PS5.
 
  • Informative
Reactions: Brain Problems