ChatGPT - If Stack Overflow and Reddit had a child

  • 🐕 I am attempting to get the site runnning as fast as possible. If you are experiencing slow page load times, please report it.

An old video from 1984 I stumbled across about AI. These were about Stallman's (yes, that Stallman) original domain: expert systems and symbolic AI. Symbolic AI (or apparently "classical AI" as some have started to call it) showed it's limitations fairly quickly, especially in it's inherent brittleness, difficulty of scalability (especially with the hardware of the time, even though special architectures were proposed, like this japanese AI accelerator from '90) and it's static-ness and difficulty to actually extract information from an expert system in an articulate way. The optimism and true-AI-is-around-the-corner talk you still catch in this video eventually had reality catch up with it and we had our second AI winter. The talk now is "we just need more data" the talk back then was "we just need to find a way to break up the world into it's smallest set of rules". Expert systems were actually not useless, survived the AI winter in a few forms and were abandoned way too quickly, IMO.

There's some talk recently about a hybrid approach, leveraging machine learning and the reasoning and deduction powers of symbolic systems. Neuro-symbolic AI. You see applications of this in LLMs using tools.

If you want to play around with an interesting AI program from that era, I can recommend SHRDLU. It's written in Lisp, which was *the* Language of AI. You can download it here.
 
It's funny to me how the memories shit makes it hallucinate so hard
 
hey guys i'm trying to run my own personal AI and its acting pretty weird, can you tell me what i should do?

1752862452481.webp
 
I found this interesting github with leaks of the system prompts of a few different models.


The full claude one is ~25k tokens. Insane. I learned that Deepseek R1, just like claude, really likes these faux XML tags and will pay special attention to everything that's between them.
 
Last edited:
Has any of you fine faggots tried out an AI Aggregator like Poe?
That's from quora right? I remember it being a bit popular at launch because it was near free but last I checked it was more expensive than the APIs, is it cheap again or what?
difficulty of scalability (especially with the hardware of the time
People don't get how insanely powerful hardware is now, for example a snapdragon SoC from 5 years ago has 4 TFLOPs, meanwhile ASCI Red the supercomputer with over 9200 pentium pro CPUs from 1997 had......1.6 TFLOPs. The 1999 upgrade to pentium II got it up to 3.1 TFLOPs. This for a system that consumed 850 kW of power, occupied 1,600 sq ft, and cost $100+ million.
The on-chip weight storage is like what TPUs do now, crazy how Japan kept going thru the AI winter then completely missed the current AI train, then again that thing happened right after the asset bubble thing.

Can't find anything about that IP704 chip besides that link.
 
People don't get how insanely powerful hardware is now
What really drives this home to me every time is seeing small microcontrollers for 5-10 bucks who performance-wise run circles around the first few generations of "serious" computers I owned which cost thousands and were state of the art. I know my smartphone etc. is also more powerful than them but somehow, these MCUs really drive that home for me. They're even multicore now! The things we would've made if we had them back then. One of the first microcontrollers I programmed had 256 bytes (not kb) of RAM. These powerful MCUs are in such mundane everyday items, often doing simple tasks that are way below of what they're capable but making them more primitive or cutting features simply isn't worth it. Somehow that's even more crazy to me than smartphones.

Can't find anything about that IP704 chip besides that link.
It probably never left the concept/prototype stage is my guess. A lot of stuff also disappeared off the internet. All the big players stopped financing AI and things sorta fizzled out in the 90s with holdovers into the 00s. When you read about the history of AI you get the impression that people shouted "AI is over! Stop your work now!" from the rooftops on 31. Dec 1989, but it was really more of a gradual decline in interest and financing that already started in the 80s. The funny thing is that the goals in which these companies were financed by investors were often a very vague "make computers intelligent". Sounds familiar? That said, expert systems never really disappeared. They're still all over scientific papers and probably are still to this day represented in a lot of internal software tooling of corporations.

There were also offerings like the the TI Explorer, which basically was an AI/Lisp (these terms were interchangeable at that time, really) CPU card by TI.
maccardx.webp
(The picture is from some article about a guy putting together such a machine IIRC. I don't have a link on hand but I'm pretty sure you can find it easily when googling for these words, talk about this stuff is very rare, although this was a product that was actually delivered and people still own now, you can find pictures)

The reason this stuff is rare is because it didn't sell. Non-specialized systems got cheaper and faster. Even if you wanted to do expert systems, you simply didn't need this stuff.

The current AI iteration got much farther than that one ever did. We made a HUGE jump in NLP which has been the holy grail of computing since computers conceptually existed and had seen pretty much no usable progress until very recently. I can tell an LLM a story or a poem I wrote (so completely original text) and not only can it process it and summarize it, it can interpret meaning and subtext too and probably do a better job than the majority of humans on earth. It can then give me it's results in natural human language and even clarify or answer questions about it. That is huge. It's crazy how blasé people are about it or try to argue semantics while the actual results stare them in the eye.

crazy how Japan kept going thru the AI winter then completely missed the current AI train
This is pretty much true for most of the west. Nobody wants to hear it but all the good stuff and papers come out of China right now.

EDIT: Also found this on my harddrive, probably from the same article:
nonoftsker.webp
 
Last edited:
It's crazy how blasé people are about it or try to argue semantics while the actual results stare them in the eye.
Its a cope, they don't want to admit it, just like artists after seeing what SD with a proper workflow can do.
What really drives this home to me every time is seeing small microcontrollers for 5-10 bucks who performance-wise run circles around the first few generations of "serious" computers I owned which cost thousands and were state of the art. I know my smartphone etc. is also more powerful than them but somehow, these MCUs really drive that home for me. They're even multicore now! The things we would've made if we had them back then. One of the first microcontrollers I programmed had 256 bytes (not kb) of RAM. These powerful MCUs are in such mundane everyday items, often doing simple tasks that are way below of what they're capable but making them more primitive or cutting features simply isn't worth it. Somehow that's even more crazy to me than smartphones.
The things I seen people doing with an ESP32 its nuts. Still when most people think obsolete they think a pentium 1, but nowadays even a lowend phone SoC wipes the floor with a PS3, and one from 3 years ago easily surpasses a PS4 (2.3 TFLOPS vs 1.84) tho that power goes mostly to browse instagram and tiktok, stuff even a X360 could do. Sucks nobody is porting even 7th gen games to phones, but there's gatchashit everywhere.

BTW what are your thoughts on those 1bit LLMs?
A lot of stuff also disappeared off the internet
Its called the splinternet for a reason, sad when you consider most of the 90's internet could fit on a home NAS now. Don't worry tho, there are 3000 backups of some random cat's pic that was taken yesterday, now that's valuable information!.
All the big players stopped financing AI and things sorta fizzled out in the 90s with holdovers into the 00s.
I take the internet caught the investors' attention, it was nuts seeing even as a kid back then how everybody was jumping in into something they barely understood. Most just wanted to capitalize on the hype, SEGA had these internet-ready Saturn bundles for sale:
1753111911360.webp

This is pretty much true for most of the west. Nobody wants to hear it but all the good stuff and papers come out of China right now.
The west is trying to forget deepseek happened but there's no going back. Question is if the Chinese can pull a deepseek-tier leap on hardware too. I got to test some chinese cars recently and they also got their shit together there, I know because I seen what their cars used to be a decade ago. No amount of tiktoks of chinese EVs on fire is going to convince me they aren't beating us there, plus they are really into automation, their solution to rising labor costs and worker scarcity isn't to import a billion muslims but to make robots that can assemble a car in almost total darkness because IR cams are cheaper than lights for an entire factory.
 
Last edited:
  • Like
Reactions: whitekyurem
Still when most people think obsolete they think a pentium 1
I don't know man, I used to think that until I realized that a lot of people working in the programming industry weren't even alive when the P1 was released. For them core2duos are ancient things they connect their first childhood memories to. If you talk to a young person who is computer-interested but only has a passing interest in computer history, it's kinda funny how they just mix these two early decades together in their mind, imagining a pentium running win98, a C64 and an Amiga somehow coexisting on a competitive market for the computer professional. I blame the retrocomputing community for this misconception. I loved the Amiga and it was my first system, but if we are being honest, it was usable maybe for around four years for most users until you moved on to something bigger and better. If you still used an Amiga in the 90s, it was either because you absolutely had to, fell to sunk cost fallacy and/or you just weren't interested in new technology that much. I still used my Amiga past that and it had it's niche use, but I had no illusions. A four year old computer now is e.g. some really recent and decent ryzen system that's more than good enough for pretty much all you wanna do as average user including gaming, a four year old computer in the 90s was a doorstopper. There was the odd 486 that survived until 2003 as grandmas email machine, but that was the exception, not the rule. I noticed young people do not always understand this and I think it's because the retrocomputing people play the significance of their special interest systems up. Most of these systems were stepping stones. Not much more.

BTW what are your thoughts on those 1bit LLMs?
I have no opinion. I kinda lost all interest in running things locally or self-hosting when APIs became so cheap. From the smart models I can run off APIs I couldn't even pay the electricity bill for running locally or fees for self-hosting. For what it's worth, my last view on the matter is that any level of quantization damages current gen models too noticeably. You could get away with it on the earlier, more undertrained models, the MoEs and modern dense model just suffer too much.
everybody was jumping in into something they barely understood
People do this with current wave AI too IMO. Seeing a lot of "groundbreaking frameworks" written by people who obviously don't understand how LLMs work. There's a lot of "AI powered" stuff coming out that's complete garbage, but that's just how progress and the markets work, I guess.

got their shit together
They absolutely do. Something I used to keep repeating to coworkers more than 20 years ago already was "don't underestimate the chinese". They have a lot of brainpower and the infrastructure to use it now while we in the west completely cannibalized everything for various ideological or financial reasons. This is not quickly fixable and also not a problem you can solve by simply throwing money at it. It takes patience and a serious, long-term commitment and in western leadership, I see neither. It's going to get a lot worse before it gets better, if it ever does.
 
I kinda lost all interest in running things locally or self-hosting when APIs became so cheap. From the smart models I can run off APIs I couldn't even pay the electricity bill for running locally or fees for self-hosting.

But you can't trust those niggers to tell the truth and not narc on you like you could if it were running locally. That's worth more than its weight in gold.
 
Back