Anyone here into retro computing?

Adrian Black recently cleaned up a portable CRT TV that was caked in nicotine. He got some good results with Windex and a piece of magic eraser.

Now that is very interesting. Anyone have any suggestions for systems that are internally covered in nicotine tar? I couldn't come up with any solvents that wouldn't also destroy the components, especially since they are 30+ years old and vulnerable.
 
  • Thunk-Provoking
Reactions: Pee Cola
Now that is very interesting. Anyone have any suggestions for systems that are internally covered in nicotine tar? I couldn't come up with any solvents that wouldn't also destroy the components, especially since they are 30+ years old and vulnerable.
My first thought is isopropyl alcohol, but maybe it's not strong enough to move the tar? I've seen videos where old boards have been cleaned with warm soapy water and a scrubbing brush, but my biggest concern would be solder mask coming off due to scrubbing followed shortly by trace damage.

It's a pity that ultrasonic cleaners are so expensive and are one of those items that most people don't get a lot of use out of, as that would be the best way to attempt cleaning this board.
 
You can throw some PCBs into the dishwasher and it's actually a very gentle way to clean them, as long as there's no place that can trap water and you dry them immediately afterwards. I would be more careful with multi-layer boards, with the chance of moisture getting trapped between the layers, but it shouldn't be a problem if they're left to dry out thoroughly. I also would not necessarily do that with a dishwasher that also handles the dishes I eat and drink from (or at most, only after a few empty cycles).

We did this at a workplace and PCBs used to come out factory-new and worked just fine. These were boards that had spent a prolonged time in hostile, dusty/dirty environments. It's very counterintuitive, as you learn from a young age that electricity + water = bad, but when a normal PCB (not stuff like paper-based boards etc.) is not powered, it's just a hunk of plastic, fiberglass and copper/lead/tin; all in all not really super susceptible to a bit of water. Scrubbing them under running water like in some of the videos linked is fine too, as long as you don't break off any components and don't use super hard bristles. If traces and solder mask come off that way, they were damaged and wonky to begin with (e.g. battery damage). If you have a wonky board like that, a radical approach is, in my experience, the best way to really remove all the battery damage so you're not stuck with intermittent problems. You might scratch some silkscreening though; a lot of the older silkscreening is, in my experience, surprisingly easy to damage. Newer PCBs are basically unbreakable in every way by comparison.

Also probably not a good idea to do it every Sunday, if you catch my drift.
 
  • Thunk-Provoking
  • Informative
Reactions: XYZpdq and Pee Cola
Scrubbing them under running water like in some of the videos linked is fine too, as long as you don't break off any components and don't use super hard bristles. If traces and solder mask come off that way, they were damaged and wonky to begin with (e.g. battery damage).
Some boards are more likely to have poor quality solder mask than others; Sinclair boards are notoriously bad, but I've seen it on other 8-bit main boards from machines such as the Commodore 64 and Acorn Electron. A telltale sign is solder mask that looks crinkly.
 
Some boards are more likely to have poor quality solder mask than others; Sinclair boards are notoriously bad, but I've seen it on other 8-bit main boards from machines such as the Commodore 64 and Acorn Electron. A telltale sign is solder mask that looks crinkly.
If you mean that crinkly "bloated" solder mask some of these old boards have on the backside: that's because during manufacturing there are globs of solder on the copper where the solder mask is supposed to go directly onto the copper. It's true, the solder mask on such boards has no footing and just falls off at the slightest provocation in these places. It doesn't have much of a function there to begin with though, as the copper is already appropriately protected by the tin and the manufacturing process is long over. Of course, if you want to keep the board as original as possible, aggressively washing such boards is not optimal, as that stuff will come right off. It won't affect the function though. If you're paranoid about shorts because of wide areas of exposed tin, there's sprayable and paintable solder mask.

The last time I seriously followed the developments in these communities, people were developing new PCBs for e.g. the C64 and Amiga too. I can only guess that they're done and buyable at this point. What always bothered me with these designs was that they are straight-up copies of the originals, even though it would make sense to do a few design changes in many cases, as the original boards were often not that optimal and don't take into account small and useful changes you could make without getting too far away from the original. I'm not talking about plugging some hyper-advanced FPGA core onto them; more like small and useful things, like using more common connectors that are actually still manufactured, or accepting more widely available modern SRAM with the appropriate support circuitry, or I/O protection on the external connectors to protect some of the old ICs better, and such. Especially the C64 is very vulnerable there.
 
  • Thunk-Provoking
Reactions: Pee Cola
I'm not sure where else to put this, but what do you enjoy most about this old computer crap? For me it's mostly the simplicity many of these systems had, in the way of their implementation, that modern systems don't really have. You get what's written on the tin: your shell program doesn't have an inbuilt webserver, your OS kernel doesn't have a billion moving parts that literally change every week, the hardware design is easily understandable, that kind of thing. Also, it was easy to implement software. You didn't need a dozen libraries and some bizarre number of lines of boilerplate code to please the GUI standardization gods just to e.g. draw some boxes on the screen. You also didn't have to deal with the cruft of 40 years of software development heaped on top of itself, as this technology just wasn't that old.
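To put a number on how little ceremony that took, here's a rough sketch of "draw some boxes on the screen" the DOS way: VGA mode 13h and nothing but direct memory pokes. It assumes a 16-bit real-mode compiler along the lines of Turbo C (dos.h/conio.h, far pointers, MK_FP); treat it as an illustration, not a drop-in program for any specific setup.

```c
/* A minimal sketch of "just draw some boxes": DOS VGA mode 13h (320x200, 256 colors).
   Assumes a 16-bit real-mode compiler such as Turbo C; MK_FP, int86, far and getch
   come from dos.h/conio.h there. Illustrative only. */
#include <dos.h>
#include <conio.h>

static unsigned char far *vga = (unsigned char far *) MK_FP(0xA000, 0);

static void set_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;              /* BIOS int 10h, function 00h: set video mode */
    r.h.al = mode;
    int86(0x10, &r, &r);
}

static void fill_box(int x, int y, int w, int h, unsigned char color)
{
    int row, col;
    for (row = 0; row < h; row++)
        for (col = 0; col < w; col++)
            vga[(y + row) * 320 + (x + col)] = color;   /* poke the pixel directly */
}

int main(void)
{
    set_mode(0x13);                     /* 320x200x256, linear framebuffer at A000:0000 */
    fill_box(60, 40, 200, 120, 4);      /* a red box, no libraries required */
    fill_box(90, 60, 140, 80, 14);      /* a yellow box on top */
    getch();                            /* wait for a key */
    set_mode(0x03);                     /* back to text mode */
    return 0;
}
```

The whole "API" is one BIOS call and writing a byte per pixel into the framebuffer; that's the kind of simplicity I mean.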

There are quite a few modern developments, as in "retro-tech"-inspired setups that use an ATmega, ARM or similar modern processor running at more "modern" speeds. (Even the tiniest of the ATmegas are quite a bit faster than e.g. the reasonably fast Z180 I used in a design of my own.) It's kind of a new development which is probably helped along largely by the ease of designing such electronics. I'm not talking about the FPGA implementations of old computers necessarily, since those are largely just copies of old platforms and not something entirely new. I'm more talking about something like the Colour Maximite 2, which is a turbocharged BASIC machine on an M7 ARM core and has just enough memory, CPU horsepower and multimedia capabilities that you can do attractive-looking stuff with it. It kinda feels like its own thing which has its roots in this retro hobby but is something a bit else. I wonder if it can manage to stand on its own feet or will forever be attached to nostalgia.
 
Short answer is yes. I have an active Win 2000 machine. I play games on it, I can use my printers on it, and I can also do my content creation on it.

The definition of what is obsolete is made by the person, NOT the company. There is a difference between going retro for a fad/nostalgia... and using the equipment for its purpose.

It is a proven fact that code back then had to be tight, and you have to marvel at what was put on a floppy to make things work.

I finished up my first comic using a Mac SE/30. Wonderful machine. But the software was HIDEOUSLY EXPENSIVE. So expensive that I completely went Wintel. It's kind of funny: hardware-wise the prices were close, and at times the Macs were cheaper... believe it or not.

But the problem was proprietary hardware and expensive proprietary software.
Back then the learning curve for PCs was harder than for the Apple OS. But the Windows 3.0/3.11 environment was rock solid and almost idiot-proof.

I'm seeing a pattern here. In my early days I would keep a computer for at least 5 or more years before upgrading, saving my money in the process. It wasn't until 2017 that I was able to make a full upgrade using top-tier quality components. 2017 was a new rig. 2019 was a new rig. 2021, however, was an update: just replacing the CPU.

It looks like I will be making small updates or going to a new computer every 5 or so years, like I used to do. Even though I am a wealthy man, I am not fucking stupid. I cannot justify buying a new top-tier computer when I can buy a used car for the same price.

So this all comes down to this... what is obsolete? Is it what the corporations, who spend millions of dollars each year, tell you you need, or is it what you can use and reuse to make things helpful in your life?
 
  • Feels
Reactions: Pee Cola
To be honest, the loud, garbage-disposal sound of the Apple II drives just reminds me of the promise of the 80s computer revolution. Then that satisfying hmmhmmhmmhmm sound as it finished loading. As kids we all got used to that grinding just to load an 8K program at school. That, and a lot of the games I enjoyed then never got ported anywhere else. Sure, Oregon Trail and the Carmen Sandiego games did, but some of the more bizarre shit I had pirate copies of in 1987 never will.

Don't get me wrong, I don't miss having to use floppies or DVDs or anything else on a modern computer, but they were nice for what they were.
 
  • Feels
Reactions: Pee Cola
Anyone here know roughly what's the most recent USB mouse that will still work with a PS/2 adapter? I'm reading that those green adapters that used to come with every mouse don't work anymore once people stopped designing mouse circuitry for that use case. Does anyone still support it? Or for that matter, are there any maniacs still making new PS/2 mice?
It would be nice to have something that's not a ball mouse or janky first-generation optical.
 
Anyone here know roughly what's the most recent USB mouse that will still work with a PS/2 adapter? I'm reading that those green adapters that used to come with every mouse don't work anymore once people stopped designing mouse circuitry for that use case. Does anyone still support it? Or for that matter, are there any maniacs still making new PS/2 mice?
It would be nice to have something that's not a ball mouse or janky first-generation optical.
Yes, the adapter is entirely passive; it basically just does the wiring to the port. The protocol is in whatever IC is in the mouse, and it detects PS/2 vs USB automatically. A few years ago I would've told you to buy the cheapest possible, but even that isn't a safe bet anymore.

The old-school Cherry (yes, the switch company) M-5400 still supports PS/2 and in my corner of the world is buyable new for less than ten euros.
 
A few years ago I would've told you to buy the cheapest possible, but even that isn't a safe bet anymore.
Actually that seems to be precisely the solution. I didn't think to check, but if you look up Microsoft's cheapest business mouse, lo and behold, they even have a picture of it with a PS/2 adapter.
 
  • Like
Reactions: Smaug's Smokey Hole
Actually that seems to be precisely the solution. I didn't think to check, but if you look up Microsoft's cheapest business mouse, lo and behold, they even have a picture of it with a PS/2 adapter.
The cheapest mouse Logitech has (or had), the B100 or B200 or something, did not work with PS/2 anymore when I bought one about a year ago. Sadly it's really not a universal rule anymore, and researching first is necessary, like you did. My guess is these basic mice don't get chipset refreshes often (why would they?), but if a component in the supply chain changes (e.g. a different sensor) it might be necessary to update whatever controller they bake in, and that might be the point where PS/2 gets kicked out of the design.

PS/2 is generally much nicer for tinkering too, which is why it sucks that these simple ports are disappearing. At least in the keyboard world it seems to be taking longer, and there will always be these sperg keyboards with a microcontroller where it'd be trivial to have the controller do PS/2.
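"Trivial" really isn't an exaggeration: the device side of PS/2 is just an 11-bit frame clocked out on two open-collector lines. Here's a rough sketch of what a keyboard controller would do to send one byte. The CLK_*/DAT_*/delay_us helpers are hypothetical stand-ins for whatever GPIO and timing code the real firmware uses, and real firmware would also check whether the host is holding the clock low to inhibit transmission.

```c
/* Sketch of the device-to-host side of PS/2: start bit, 8 data bits LSB first,
   odd parity, stop bit, with the device driving the clock. The helpers below are
   hypothetical stand-ins for real GPIO/timing code. */
#include <stdint.h>

static void CLK_HIGH(void) { /* release clock line (open collector) */ }
static void CLK_LOW(void)  { /* pull clock line low */ }
static void DAT_HIGH(void) { /* release data line */ }
static void DAT_LOW(void)  { /* pull data line low */ }
static void delay_us(unsigned us) { (void)us; /* busy-wait */ }

static void clock_out_bit(int bit)
{
    if (bit) DAT_HIGH(); else DAT_LOW();   /* data changes while the clock is high */
    delay_us(20);
    CLK_LOW();                             /* host samples on the falling edge */
    delay_us(40);
    CLK_HIGH();
    delay_us(20);
}

static void ps2_send_byte(uint8_t b)
{
    int i, ones = 0;

    clock_out_bit(0);                      /* start bit */
    for (i = 0; i < 8; i++) {
        int bit = (b >> i) & 1;            /* LSB first */
        ones += bit;
        clock_out_bit(bit);
    }
    clock_out_bit(!(ones & 1));            /* odd parity over the 8 data bits */
    clock_out_bit(1);                      /* stop bit */
}

int main(void)
{
    ps2_send_byte(0x1C);                   /* e.g. the set-2 make code for 'A' */
    return 0;
}
```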
 
  • Like
Reactions: Joe Swanson
I used to do that a lot, but I had to sell all my old shit as I was living with my parents and didn't really have the space. Now, when I have all the space in the world for my autistic shit, the old PCs are too expensive ;-;
 
Today I found my Creative 3D Blaster (with 12 MB of VRAM!) Voodoo 2 card and my Matrox Mystique (complete with the "Rainbow Runner" addon for MPEG-accelerated video en- and decoding; I think I digitized some VHS cassettes with it?). Good times. I thought I gave them away, but apparently not. Back then I bet totally on the wrong horse with Matrox and kept holding on up until the Parhelia (have one of those here somewhere too); I did end up having Voodoo cards because that was the only way to play some games "correctly", though. Now, was the 12 MB of the 3D Blaster some marketing wank, or were there actually games that did anything with that? I genuinely don't remember.
 
Today I found my Creative 3D Blaster (with 12 MB of VRAM!) Voodoo 2 card and my Matrox Mystique (complete with the "Rainbow Runner" addon for MPEG-accelerated video en- and decoding; I think I digitized some VHS cassettes with it?). Good times. I thought I gave them away, but apparently not. Back then I bet totally on the wrong horse with Matrox and kept holding on up until the Parhelia (have one of those here somewhere too); I did end up having Voodoo cards because that was the only way to play some games "correctly", though. Now, was the 12 MB of the 3D Blaster some marketing wank, or were there actually games that did anything with that? I genuinely don't remember.
It made some games run faster due to having more texture memory; their setup meant that the 12MB card had twice as much texture memory as the 8MB one, because the memory reserved for the frame buffer was 4MB on both cards. It made some difference in later games, but I have no idea to what extent; you actually bring up a great question with that one.
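Just to lay the arithmetic out, here's a tiny sketch assuming the usual Voodoo 2 split of a fixed 4MB frame buffer plus two texture units; the numbers are illustrative, not vendor specs.

```c
/* Back-of-the-envelope version of the memory split: a fixed frame buffer reserved
   on both cards, with the remainder going to texture memory across two TMUs. */
#include <stdio.h>

int main(void)
{
    int totals[] = { 8, 12 };                 /* total card memory in MB */
    int framebuffer_mb = 4;                   /* assumed reserved on both cards */

    for (int i = 0; i < 2; i++) {
        int texture_mb = totals[i] - framebuffer_mb;
        printf("%2d MB card -> %d MB frame buffer + %d MB texture memory (%d MB per TMU)\n",
               totals[i], framebuffer_mb, texture_mb, texture_mb / 2);
    }
    return 0;
}
```

So the card with 50% more total memory ends up with double the texture memory, which is why it could matter in texture-heavy later games.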

I really liked Matrox for some reason; I had their PowerVR accelerator, a G200 (for my Voodoo 2s) and later the G400 Max. For a company targeting the professional segment, you'd think they would have had a working OpenGL driver at some point.
 
I really liked Matrox for some reason; I had their PowerVR accelerator, a G200 (for my Voodoo 2s) and later the G400 Max. For a company targeting the professional segment, you'd think they would have had a working OpenGL driver at some point.
I have an honest memory that the general picture quality was better on Matrox cards for a long time, although I have absolutely zero proof to back that up. They were some of the first cards to have DVI ports, from what I remember. Good stuff. Although yes, it's an absolute joke that their 3D stuff was so crippled, and it all started so promisingly with the Mystique too. MechWarrior 2 was pretty cool on it.

I have an older AMD K6-III system I could set up to test these theories, but I gotta admit I use the small embedded MediaGX board I have for all my PC retro needs these days, because having a Pentium-era-speed DOS/Win9x system roughly a bit larger than a Pi that takes CF cards as hard drives (without an adapter, just directly) is just that practical. I don't really have much of a soft spot for the early 3D-accelerated games either, although there were some very good ones.
 
Neat hobby.

I wish I could get involved with it; it looks fun. But I think it's the aesthetics and marketing of old computing that really interest me. It had a particular optimism and clean hope, especially in the 90s when it appeared as though computing, networking and the early internet were taking off.

If you've got any neat material to share from that end, I would love to see it. Otherwise, back to scouring through old magazines at archive.org.
 
  • Feels
Reactions: Pee Cola
I have an honest memory that the general picture quality was better on Matrox cards for a long time, although I have absolutely zero proof to back that up.
They did! They used high-end RAMDACs and tons of filtering to make sure that the digital signal could pass as flawlessly as possible over the VGA cable in analogue form. I don't know the technical terms; their signal just didn't get as crusty on the way to the monitor, and they spent money to make sure of that. The Matrox Millennium marketed itself as having something like a 250MHz RAMDAC and a bunch of other crap to make sure 1600x1200@24bpp over an analogue cable looked as it should.
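For a rough feel of why that 250MHz number was worth marketing: the pixel clock a mode needs is roughly width x height x refresh rate, plus blanking overhead. Here's a quick sketch; the ~1.3 blanking factor is an assumption based on typical CRT timings, not a Matrox spec.

```c
/* Rough sanity check of the RAMDAC figure: required pixel clock is roughly
   width * height * refresh * blanking overhead. The 1.30 factor is assumed. */
#include <stdio.h>

int main(void)
{
    double width = 1600, height = 1200;
    double blanking = 1.30;                   /* assumed sync/blanking overhead */
    double refresh_rates[] = { 60, 75, 85 };

    for (int i = 0; i < 3; i++) {
        double pixclk_mhz = width * height * refresh_rates[i] * blanking / 1e6;
        printf("1600x1200 @ %.0f Hz needs roughly %.0f MHz of pixel clock\n",
               refresh_rates[i], pixclk_mhz);
    }
    return 0;
}
```

That lands around 210MHz for 1600x1200 at 85Hz, so a 250MHz RAMDAC clears it with headroom, while a cheaper DAC would be the limiting factor long before the GPU was.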

There were other, cheaper cards that had really good image quality, but only at lower resolutions and at a normal refresh rate where their RAMDAC could keep up. It all started going downhill after that (imo it could be noticeable going over 800x600 sometimes, even if the card supported resolutions above that).

The RAMDAC/analogue image problem was something most people didn't think about back then, and now that connections are digital it doesn't matter. You could get cheap video cards with the big-name GPUs like a Riva TNT or a GeForce, but the cheap cards were inexpensive because they cut corners, often on the thing processing the output sent to the monitor. Save some money and you could run the latest game with the prettiest graphics, except everything is a bit flat colorwise. The worst I ever got my hands on was a really cheap GeForce 2 from an obscure board vendor; I want to say it was Tseng Labs, but they were long defunct at that time. It benchmarked exactly like it should, but everything looked off, as if the colorspace had been crushed a bit, and there was no true black or true white. What it rendered matched the reference renderer, so the image became shit during the output process.

This is just a question for the old fags: have you ever been at an old LAN party or equivalent and noticed that the game you were all playing looked better on some machines? Richer colors, no crushed blacks etc. That could have been the difference between a cheaper card and more expensive cards, and not just the monitor.
 
They did! They used high-end RAMDACs and tons of filtering to make sure that the digital signal could pass as flawlessly as possible over the VGA cable in analogue form. I don't know the technical terms; their signal just didn't get as crusty on the way to the monitor, and they spent money to make sure of that. The Matrox Millennium marketed itself as having something like a 250MHz RAMDAC and a bunch of other crap to make sure 1600x1200@24bpp over an analogue cable looked as it should.
Ah yes. Earlier, in the early 90s, Cirrus Logic had pretty much the only chips that had the RAMDAC integrated on the die of the main graphics chip, and not only was that cheaper, it really helped quality too. There were actually really wide disparities in output quality, now that I think of it. Before it became standard, and in ISA times, the VGA cards that could do more than 256 colors were also pretty expensive. The graphics chip usually could always do it, but they were limited by their RAMDAC, and only the expensive ones would do truecolor.

I wouldn't blame it all on a faster/slower RAMDAC necessarily though; the layout of the board itself played a role too. When you have higher-speed signals like this you can ruin a lot with very little, for example by picking the wrong ferrites on the output. How tolerant and well made the monitor is plays a role too. You usually get a pretty shitty picture out of modern monitors that still have a VGA port, and not only because of the scaling, but also because the signal processing for VGA is probably more of an afterthought in modern LCDs and they're probably just not as tolerant towards slightly askew signals anymore. I noticed that a while ago when I was playing around with a 286 and various ISA graphics cards: I actually had pretty wild variations in picture quality between the different cards. I then used the OSSC I bought and these variations just magically disappeared. There's just a lot that can go wrong with such relatively high-speed analog signals. I didn't quite remember it being such a problem for these later generation PCI graphics cards at lower resolutions, but tbh it probably still was.

Apropos of nothing, I just (well, "just", a few months ago) had to replace the DAC on my A1200 because the old one somehow died. Never had that happen before and it's kinda worrisome. (Yes, the capacitors are all new, and I had to remove one of them to get at the old DAC.)

Attachments: pic1.jpg, pic2.jpg
 
I didn't quite remember it being such a problem for these later generation PCI graphics cards at lower resolutions, but tbh it probably still was.
Thanks, that's informative; I just remember "RAMDAC!" at this point. It was the marketable hook in Matrox advertising. It had MHz in the 90s.

When it comes to graphics cards and how different the output could look (though maybe not picked up on by most people), it's probably further muddled by different accelerators rendering things differently. Plus the monitor itself, of course.
This was eventually highlighted in benchmarking, where part of the test suite on some sites included Nvidia/ATI/S3/Matrox frame dumps that were compared to the exact same frame rendered correctly in software using the DX reference rasterizer. There could be some pretty wild deviations compared to the reference, not even mentioning 3dfx here, and that was before the frame started its way out to whatever crummy 14" monitor people were playing Quake on.

I think it was with DX8 or 8.1 that Microsoft put their foot down and decreed that there would be zero deviation between any pixels in the reference and the render output, or else the GPU would not be considered DX8 (or 8.1) capable. I remember thinking that it seemed like a tall order at the time, given the state of things.
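For what it's worth, the comparison itself is conceptually simple. Here's a sketch of the kind of check those test suites boiled down to: diffing a hardware frame dump against the reference rasterizer's output pixel by pixel. The raw RGB buffer layout is an assumption for illustration; the real tools and file formats varied.

```c
/* Sketch: compare a hardware-rendered frame against the reference frame as raw
   RGB byte buffers and report the worst per-channel deviation. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* returns the largest absolute difference over all channels of all pixels */
static int max_deviation(const uint8_t *hw, const uint8_t *ref, size_t bytes)
{
    int worst = 0;
    for (size_t i = 0; i < bytes; i++) {
        int d = abs((int)hw[i] - (int)ref[i]);
        if (d > worst) worst = d;
    }
    return worst;
}

int main(void)
{
    enum { W = 640, H = 480, CH = 3 };        /* illustrative frame size */
    static uint8_t hw_frame[W * H * CH];      /* would come from a frame dump */
    static uint8_t ref_frame[W * H * CH];     /* would come from the reference rasterizer */

    int worst = max_deviation(hw_frame, ref_frame, sizeof hw_frame);
    printf("worst per-channel deviation: %d %s\n",
           worst, worst == 0 ? "(pixel-exact)" : "(differs from reference)");
    return 0;
}
```

"Zero deviation" in that sense just means this worst-case difference has to come out as 0 for every tested frame.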
 