GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Any calculator nerds here? I was looking to get a nice graphing calculator, as my trusty Ti-84 Plus met a bad fate. But it has been probably 20 years since I've even looked into this stuff and now I see that they have "AI" built into them and all sorts of features. The full-color screens are definitely nifty. But I'm sort of lost. Any of you have opinions on calculators?
I think the last calculators I owned were TI-84+ (Silver Edition?), TI-89 Titanium, and TI-Nspire, and I haven't touched any of them in years. I recall Nspire had some new functions in TI-BASIC that made it much easier to build a sprite-based game. The newer TI-84 calculators with color screens have increased resolution (beyond the good ol' 96x64) and more memory/storage, but have clearly not kept up with the pace of technology. They print money with these monopolistic turds.

I remember hearing about this dustup but not the jailbreak. I haven't followed the scene for a long time:
On 20 May 2020, Texas Instruments revealed that support for assembly and C programming would be removed starting in OS version 5.5.1 for the TI-84 Plus CE and TI-83 Premium CE. Four months later, a jailbreak called arTIfiCE, which exploits the Cabri Jr. application to run arbitrary code, was released and restored compatibility.

It's sound advice to run away from TI and examine any alternative, such as HP, or a fucking phone, and I bet there's some crowdfunded hacker-friendly alternative out there using an ESP32 or something. I remember seeing a discounted TI-84 variant at a b&m retailer, maybe 50-75% off the usual price. Deals like that show up at stores trying to clear them out, or outside the back-to-school and exam seasons when they're usually bought, but I haven't paid much attention lately. I don't know that I would pay $38 for the latest TI-84 color graphing calculator when TI is trying to nerf them and I could put that money towards other, more useful hardware.
 
Last edited:
Ironically it'd be a much safer connector than 12VHPWR
watch it melt anyway :story:

Somewhat related, but since there is no thread for it: does anyone have any insight into dumb TVs, or how far you can neuter a smart TV (as in options that go beyond "don't hook up the wifi" or "rip out the antenna")?
 
  • Optimistic
Reactions: Betonhaus
Intel Arc B580 GPU has TDP lower than 225W and PCIe 5.0×8 interface
AMD Threadripper “Shimada Peak” 96-Core & 16-Core CPUs Based On Zen 5 Architecture Spotted

Somewhat related, but since there is no thread for it: does anyone have any insight into dumb TVs, or how far you can neuter a smart TV (as in options that go beyond "don't hook up the wifi" or "rip out the antenna")?
Unless you are schizophrenic about the smart TV connecting to a cellular network or unprotected Wi-Fi, skipping the Wi-Fi in the setup process and hooking up your own computer is all that matters. That is the insight. I've hooked up a cheap Intel quad-core system to one to handle all TV duties, which for me means YouTube, pirate streaming sites, etc.

Gayming monitors are the dumbass (you!) alternative (as far as I know, maybe they are sneaking those smart features in), and they can get large enough to be considered a TV, e.g. Samsung Odyssey Neo G7 43".

Edit: The good, the bad, and the ugly behind the push for more smart displays (archive)
 
Last edited:
It's rather amusing to see people discussing Trump's tariffs, their repercussions on prices, and Americans stacking and hoarding components, just for me as a European to go and check out the 9800X3D and find this kind of price being the only thing available for me.
(attached screenshot of the local 9800X3D listing price)
 
It's rather amusing to see people discussing Trump's tariffs, their repercussions on prices, and Americans stacking and hoarding components, just for me as a European to go and check out the 9800X3D and find this kind of price being the only thing available for me.
View attachment 6677017
Like I said, if the tech distribution channel is sending them to the American market as fast as they can to beat the supposed tariffs, Europe gets fucked yet again. Temporarily at least. God Bless America.

AMD Ryzen 9 9950X3D & Ryzen 9 9900X3D 3D V-Cache CPUs Launching In Late January 2025
 
Last edited:
A friend of mine (not the one that told me to buy a prebuilt desktop PC) sent me this after I told him I'm now at the stage of deciding which GPU to go with. View attachment 6675592
I can only contemplate in horror the possibility (or eventuality) of future GPUs needing their own power supply, separate from the one that feeds the motherboard.
We're so back, baby!
(attached images: voodoo.jpg, 3dfx_voodoo_5_6000_original.jpeg)
 
I remember hearing about this dustup but not the jailbreak. I haven't followed the scene for a long time:
https://en.wikipedia.org/wiki/TI-84_Plus_series#Programming
I know this goes off topic but what the fuck, TI? Assembly programming was the only way to get good software running on the thing. Hell, I remember TI openly touting the new functionality on the TI-83+ et al. to make asm easier to access. z80 assembly on the TI-84 was one of my first experiences with really low-level programming.

What the fuck were they thinking
 
I know this goes off topic but what the fuck, TI? Assembly programming was the only way to get good software running on the thing. Hell, I remember TI openly touting the new functionality on the TI-83+ et al. to make asm easier to access. z80 assembly on the TI-84 was one of my first experiences with really low-level programming.

What the fuck were they thinking
I think they were thinking of their lucrative exam-taking market, and wanted to lock down the calculators more so students couldn't cheat as easily. Maybe other intellectual property concerns. I didn't see much discussion on KF but I might have used the wrong search terms. Folx were very pissed off about this back when it happened.
 
  • Agree
Reactions: IrishGuy088
"its your fault for buying a graphics card older than 4 years old goy"
I don't understand why anybody buys a GPU with the expectation that in the next 4-5 years, nobody will ever make a game with optional texture resolution and effects quality that exceeds what his card can currently process. You can run the game on an 8GB card, you just can't run it at maximum settings...but you can't run it at max settings on ANY card other than a 4090, regardless of VRAM.
 
I remember hearing about this dustup but not the jailbreak. I haven't followed the scene for a long time:
https://en.wikipedia.org/wiki/TI-84_Plus_series#Programming
I know this goes off topic but what the fuck, TI? Assembly programming was the only way to get good software running on the thing. Hell, I remember TI openly touting the new functionality on the TI-83+ et al. to make asm easier to access. z80 assembly on the TI-84 was one of my first experiences with really low-level programming.

What the fuck were they thinking
Thank you for the information. I've always wondered about getting an HP model; a coworker has one, and several others use Casios. Intentionally removing features though... that's beyond the pale. What a terrible idea. It has at least cost them my sale. I'll look into HP, those things sure are nice.
 
I think they were thinking of their lucrative exam-taking market, and wanted to lock down the calculators more so students couldn't cheat as easily. Maybe other intellectual property concerns. I didn't see much discussion on KF but I might have used the wrong search terms. Folx were very pissed off about this back when it happened.
Textbooks and stuff for teachers are written for that model. Press this button, do this, do that. Teachers refuse to learn on their own, and they're trained in how to tell students to use that specific model. They're just following the curriculum they were handed.
 
I like how it gets erect when the power supply is plugged in. I remember seeing a Voodoo 5 for the first time, and the whole idea of an onboard Molex connector to plug in additional power seemed ridiculous, almost like some kind of marketing gag. I still have such a card somewhere. Or two. It's probably worth a trillion bucks now.

Why do people even care about powerful GPUs to run the latest AAAAAAAA+++++ slop at 200 fps? These games suck anyways. They won't suck less at 144 FPS. The only modern game I finished lately was that Robocop game, and it ran well enough on my ancient RX580 (and then dragged on for way too long, to the point I got kinda bored but wanted to see it through). After lowering the graphics settings somewhat and limiting my FPS to 30, it ran fine. I don't think 60 FPS and somewhat higher graphics settings would have made my experience better. I always think "this is the month I'm finally going to buy a current-day graphics card" and money isn't the issue, but man, I go all the way to the shopping cart button and then just cannot be arsed for what's on offer nowadays. I feel ripped off. It's the principle. I wanted to build a GPU server for medium-sized LLMs but ended up getting rid of it again because it's a lot easier and cheaper to just rent the server time now, and I don't want to reward this RAM-gimping behavior. No use case left for high-end GPUs for me beyond that point. Congrats, gaming industry.
 
I wanted to build a GPU server for medium-sized LLMs but ended up getting rid of it again because it's a lot easier and cheaper to just rent the server time now, and I don't want to reward this RAM-gimping behavior. No use case left for high-end GPUs for me beyond that point. Congrats, gaming industry.
I'm excited to see if Strix Halo with 64-128 GB of LPDDR5 will be good enough for LLMs and other AI models (since it will be marketed as such: Ryzen AI MAX+ 395).
 
  • Like
Reactions: AmpleApricots
good enough
That will really depend on the price. It'll have to beat the high-end Apple systems price-wise (I think the M4 would still be faster for LLMs, and the M2 Ultra is definitely faster), but a row of GPUs might still be both cheaper and even faster. LLMs are all about memory bandwidth. The biggest upside will be power consumption. Even if you can afford the initial investment in a row of GPUs, the power consumption is insane and will make things pretty expensive pretty quickly, depending on where you live and whether you really want "on demand" AI, which is honestly the only practical thing IMO. As it stands, the hardware would need to be pretty cheap and low-power to make renting server time economically unattractive.

That's not even factoring in AMD's software stack problems with AI workloads, because I have no idea what the situation there is right now.

E: Server time is also the safer bet investment-wise. The tech is still changing rapidly enough that you might end up wasting your money on hardware that becomes either outdated really quickly or not needed at all.
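
To put rough numbers on the rent-versus-own point, here's a back-of-envelope sketch in Python. Every figure in it (GPU count, wattage, electricity price, rental rate, hardware cost, amortization period) is a made-up placeholder, not a real quote:

# Back-of-envelope: owning a local multi-GPU rig vs. renting server time.
# All numbers below are hypothetical placeholders, not real quotes.
GPU_COUNT = 4               # hypothetical "row of GPUs"
WATTS_PER_GPU = 350         # assumed draw under load, per card
PRICE_PER_KWH = 0.40        # assumed European electricity price, EUR/kWh
HOURS_PER_MONTH = 100       # assumed "on demand" inference hours, not 24/7
RENTAL_PER_HOUR = 2.50      # assumed cloud rental rate for a comparable box, EUR/hour
HARDWARE_COST = 8000        # assumed purchase price for the GPUs + server
AMORTIZE_MONTHS = 24        # assumed useful life before the tech moves on

power = GPU_COUNT * WATTS_PER_GPU / 1000 * HOURS_PER_MONTH * PRICE_PER_KWH
owning = power + HARDWARE_COST / AMORTIZE_MONTHS
renting = HOURS_PER_MONTH * RENTAL_PER_HOUR

print(f"Owning:  ~{owning:.0f} EUR/month ({power:.0f} EUR electricity + amortized hardware)")
print(f"Renting: ~{renting:.0f} EUR/month for the same hours")

With those made-up inputs renting comes out ahead; crank the hours up or the rental rate up and owning wins, which is exactly the "depends where you live and how you use it" point.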
 
Last edited:
That will really depend on the price. It'll have to beat the high-end Apple systems price-wise (I think the M4 would still be faster for LLMs), but a row of GPUs might still be both cheaper and even faster. LLMs are all about memory bandwidth. The biggest upside will be power consumption. Even if you can afford the initial investment in a row of GPUs, the power consumption is insane and will make things pretty expensive pretty quickly, depending on where you live and whether you really want "on demand" AI, which is honestly the only practical thing IMO. As it stands, the hardware would need to be pretty cheap and low-power to make renting server time economically unattractive.

That's not even factoring in AMD's software stack problems with AI workloads, because I have no idea what the situation there is right now.
The tradeoffs for Strix Halo: It has a 256-bit memory bus for up to 128 GB of LPDDR5-8533 memory, with a maximum 273 GB/s of memory bandwidth. Also 32 MiB of Infinity Cache but I doubt AI workloads can utilize that, and definitely not the CPU cores.
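
For what it's worth, that 273 GB/s figure falls straight out of the bus width and transfer rate; a quick Python sketch of the arithmetic:

# Peak memory bandwidth = (bus width in bits / 8) * transfer rate.
# Strix Halo per the post above: 256-bit bus, LPDDR5-8533 (8533 MT/s).
bus_bits = 256
transfers_per_s = 8533e6            # 8533 MT/s
bandwidth = bus_bits / 8 * transfers_per_s / 1e9
print(f"~{bandwidth:.0f} GB/s")     # ~273 GB/s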

40 RDNA3.5 compute units for the 16-core and the 12-core, while the 8-core goes down to 32 CUs (according to the latest rumors anyway). So the 8-core should retain most of the performance while slimming the price. If it's the gaming-focused SKU, it may not be paired with more than 32 GB as often.

All models come with a ~50 TOPS XDNA2 NPU, but for anything like LLMs the iGPU is going to be faster.

We may only see models accompanied with 32 GB, 64 GB, or 128 GB (not sure about intermediate amounts like 48 GB). At least 75% of that can be allocated as "VRAM" with AMD's software, making 24/48/96 GB available for AI. If that's an artificial limitation, more than 100 GB of RAM could be used for really big models. Supposedly, we will not see 128 GB at launch, but since it's all external and not on-package, I don't see why that would be. It's unlikely we'll see user-upgradeable memory of any type, although it might be theoretically possible with LPDDR5-based LPCAMM2.

So the hope for Strix Halo is that with lots of memory and relatively large memory bandwidth (for a CPU/APU), it could outperform graphics cards with limited memory when the model size is too big. But the performance is obviously much less than an RTX 4090/5090, maybe around the 7600 XT/7700 XT level, and you have to deal with AMD's software stack. Meanwhile, Nvidia is going from 384-bit GDDR6X and 24 GB with the 4090, to 512-bit GDDR7 and 32 GB with the 5090.

Comparing to Apple, the M4 Pro has a 20-core GPU with a 256-bit bus and the same 273 GB/s as Strix Halo, but a lower maximum of 64 GB of memory. That's really expensive but Strix Halo will be too. The M4 Max doubles the iGPU to 40 cores, with up to 128 GB on a 512-bit bus (546 GB/s).
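
A rough way to translate those bandwidth numbers into LLM terms: batch-1 token generation is mostly memory-bound, so tokens/s tops out around bandwidth divided by the bytes read per token, which for a dense model is roughly the size of its quantized weights. A Python sketch using the figures in this thread; the 4090/5090 bandwidths and the 40 GB model size are my own assumptions, and it ignores whether the model even fits on a 24/32 GB card:

# Crude upper bound: tokens/s ≈ memory bandwidth / bytes touched per token,
# roughly the size of the quantized weights for a dense model.
# Bandwidths in GB/s; the 4090/5090 values are assumptions, the rest come from the thread.
platforms = {
    "Strix Halo (256-bit LPDDR5-8533)": 273,
    "Apple M4 Pro":                     273,
    "Apple M4 Max":                     546,
    "RTX 4090 (assumed)":               1008,
    "RTX 5090 (assumed)":               1792,
}
model_gb = 40   # hypothetical ~70B-class model at ~4-bit quantization

for name, bw in platforms.items():
    print(f"{name}: ~{bw / model_gb:.0f} tokens/s upper bound")

So the pitch for Strix Halo isn't speed, it's that a model that size runs at all in local memory, whereas the much faster cards simply can't hold it.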
 
Apple M series could have it all if the machines supported external PCIe graphics cards. The support has always been there (Nvidia even had macOS drivers for a long time).

I see myself buying an M3 Air because you can't find that mix of performance, battery life, and price (and a RISC processor) anywhere else, but I am disappointed that the option to connect a graphics card over Thunderbolt has been taken away compared to the previous x86 Macs.
 
Mmmm, monkey...
(attached images: 14-137-901-04.jpg, 14-137-901-01.jpeg)
Sadly it's an RTX 4070 Super Gaming Slim from MSI but it is cheaper than a 4070 Ti Super by a fair amount.

That said, I think I've ironed out how I'm going to build my new PC. I'll either go with Micro Center's $450 AMD Ryzen 5 7600X3D/ASUS B650-A ROG Strix Gaming WiFi motherboard/G.Skill Flare X5 Series 32GB DDR5-6000 kit bundle and an RTX 4070 Ti Super, or I'll go with their $650 AMD Ryzen 7 7800X3D/ASUS B650-A ROG Strix Gaming WiFi motherboard/G.Skill Flare X5 Series 32GB DDR5-6000 kit bundle and an RTX 4070 Super. The only thing I'm a bit suspicious of are the ASUS mobos, due to ASUS's customer service being utter crap lately.
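
For what it's worth, the math mostly comes down to whether the $200 bundle gap outweighs the GPU price gap; a quick Python sketch where the bundle prices come from the post above but the GPU street prices are placeholders I made up:

# Two Micro Center options from the post; the GPU prices below are assumed placeholders.
option_a = 450 + 850   # 7600X3D bundle + assumed RTX 4070 Ti Super street price
option_b = 650 + 600   # 7800X3D bundle + assumed RTX 4070 Super street price
print(f"Option A (7600X3D + 4070 Ti Super): ~${option_a}")
print(f"Option B (7800X3D + 4070 Super):    ~${option_b}")
print(f"Delta: ~${abs(option_a - option_b)}")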
 
Somewhat related, but since there is no thread for it: does anyone have any insight into dumb TVs, or how far you can neuter a smart TV (as in options that go beyond "don't hook up the wifi" or "rip out the antenna")?
There is some discussion in this TV thread. Smart TVs annoy me and I plan to go with a "commercial display" if I ever buy a new TV.
 