I intercepted an SE/30 that was set to be discarded (in the late 90s) due to a video problem -- the image was a single line across the center. The solder for the vertical control had cracked, and it turned out all I needed to do was reflow it, and it was good to go. Since this was my first 'spare' computer, I promptly installed NetBSD on it and learned Unix there. I eventually got a modem attached to it, set up PPP, and got a serial console to another 8086 laptop someone else gave me, which my brother and I used to run https://en.wikipedia.org/wiki/TAC_(software) to chat into the evening. I got a NuBus network card on there too, and figured out how to wire up our main family computer as well!
A year or so later, I learned about flyback transformers and precisely how large a capacitor was in the vicinity, and thanked my lucky stars I hadn't accidentally discharged it through myself.
The SE/30 will always be a special machine for me :)
This is fantastic. I own a Macintosh Classic that still worked fine the last time I booted it up (about a year ago). I've been afraid to keep it powered on for very long lest the capacitors leak. When I get some time (probably after school) I will look at replacing all of the capacitors with tantalum. Then see about finding a way to connect it to the network.
There's so much classic Mac software out there to be used. It's actually pretty amazing how productive you can be on a machine so many orders of magnitude slower than today's computers. The low latency of the classic Mac UI is a testament to the genius and hard work of Bill Atkinson. [1]
I've been thinking for several years now that practically nothing I do at work on a computer can't be done on a computer from 1995. The tools have improved a bit in some places and gotten worse in others, but the basic work, if you're not in a compute-heavy field, is pretty much the same.
That was sorta what fueled the smartphone revolution - everything useful could be done on a computer from 1995, and then a computer from 1995 could fit in your pocket. Then they kept on increasing in power, such that smartphones today are good enough (modulo screen space) to compete with PCs from c. 2013.
Now we're just waiting for the next form factor you can shrink 1995-era processing power into. Brain implants?
Microcontrollers are already there, at least in terms of processing power, and they open up all kinds of possible form factors. The Cortex M0-based Arduinos can be made extremely tiny (see the Adafruit Trinket M0) and run at 48MHz. RISC vs CISC means it might not be as fast as a 33MHz Intel from 1995, but the new M4-based Arduinos are coming out running at 120MHz with the ATSAMD51 chips. When you have a 32-bit microcontroller at 120MHz in just 49mm^2 and 65µA/MHz power draw [1], all kinds of form factors are possible at speeds meeting or exceeding your average 1995 PC.
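A quick back-of-the-envelope sketch (in Python) of what that power figure implies; the 3.3V supply voltage is my own assumption for illustration, not part of the figures quoted above:

    # Rough active power estimate from the ATSAMD51 numbers quoted above.
    current_per_mhz_ua = 65      # µA/MHz, as quoted
    clock_mhz = 120              # M4 clock speed, as quoted
    supply_v = 3.3               # assumed typical supply voltage

    current_ma = current_per_mhz_ua * clock_mhz / 1000  # ~7.8 mA
    power_mw = current_ma * supply_v                     # ~26 mW
    print(f"~{current_ma:.1f} mA, ~{power_mw:.0f} mW at {clock_mhz} MHz")

So the whole core runs on a few tens of milliwatts, which is what makes those tiny form factors plausible.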
Lack of support for modern ciphers and secure protocols becomes a deal-breaker pretty quickly if you need to access the internet much (including third-party email servers).
NetBSD can run on most Amigas with an MMU and 24MB or better [1]. That's either an A3000 (1990) with memory expansion or pretty much any model with a 68020 + MMU + RAM accelerator board. I don't know if you could get it to run on an Amiga 1000, but certainly on a 500 or 2000. So a 1987 computer... Though most of the accelerator boards that are easy to obtain today are newly manufactured FPGA-based ones...
(NetBSD will probably run on older hardware than that too - it's just that I remember people running it on Amigas, so that's the one I'm aware of)
I've always wanted to try NetBSD on Amiga but I sold my MMU-equipped accelerator long ago. It'd be symbolically nice: a Unix clone running on the Amiga and a Workbench clone running on the PPC Mac...
It's something I gripe about a lot. Why is Windows 10 on modern hardware so much less responsive than Windows 95 on a system that has an order of magnitude fewer resources to work with?
Sure, it can't play half the games the modern machine will, but for most other tasks Windows 10 isn't providing me much, if any, extra functionality and it is taking a hell of a lot longer to do it. Why? Are software developers just that much more terrible? Have we put too many abstraction layers in the way? Put too much extra code in place in the name of nebulously effective security? Why must we put up with this?
Sadly the situation isn't much better in the OSS desktop world. Aside from my personal opinion that those OSs are designed in such a way as to be fundamentally unsuitable as a desktop, they suffer a lot of the same bloat. Example: https://www.youtube.com/watch?v=7kvT40umKL8
The capacitors will leak regardless of whether you turn it on or not. In fact, they have already leaked, and the fluid is corroding pads and pins and eating through traces. Best to get it recapped before more damage is done.
Is it possible that, since they predate the "capacitor plague" [1], they are less vulnerable? I have electronics in the house dating back to the 80s that are doing fine.
The capacitor plague is a whole separate event. As bwldrbst explained, this is a specific problem with the SMT capacitors that Apple used in everything during that time period. The through-hole capacitors used for example in the Mac SE and earlier haven't had the problems with leaking. (Speaking only of the capacitors on the logic board... the larger capacitors on the analog board and power supplies are leaking too.)
A lot of early surface mount capacitors seem prone to leaking corrosive crap onto the boards they're soldered to. This seems to affect machines from the late 80s to the mid 90s. Older through-hole caps can leak too, but they are more robust or tend to do so through their tops.
Machines made by cheap-ass bastards like Commodore suffer from this a lot.
I think OP's point is that the risk of catastrophic failure and damage to other parts of the machine is bigger, if it is powered up when the caps fail.
Tantalum capacitors are not failure proof. Ask around about tantalum failures, usually someone has a memorable story about an old tantalum capacitor that shorted.
If you buy quality electrolytics and don't store the machine in a humid shed/attic they are likely to outlast you.
I had an SE/30 in college that I got for free somewhere (this was in the early 2000s). It worked fine, but I didn't have a network adapter that would get it on the network, so it was of limited use to me. I think it was still sitting in my parents' garage last time I looked.
Apparently (according to someone I went to college with), Apple suffixed the names of their computers with 68030 processors with an 'x' (e.g. IIfx).
They couldn't call it the SEx, so they broke convention and called it the SE/30.
At one point early in the Mac's history, Apple marketing tried to tell the world that SCSI was pronounced "Sexy", rather than "Scuzzy". I guess they'd finally been told the name of the machine's hard drive port and someone got unhappy.
This attempted name change went over about as well as you would expect.
[I think they could have capitalized on the SE/x and done a really good marketing campaign if they'd not been terrified. On the other hand, opinions like this are one of the reasons I'm an engineer and not a marketing type... :-) ]
I've collected a few old Macs, including two SE/30s, a load of LCs, Quadras, and Performas, and even some older Apple IIc, IIgs, and IIe models, and newer coloured iMacs. I also have a box of iBook G3 parts, including AirPort cards. Right now they're in the attic of my parents' house, and I'm starting to admit that I won't get a chance to use them often. I'm sure there's value in it all, but I'd just prefer that they go to someone who truly values the machines.
If you live near Geneva or want to pay shipping, then please get in touch.
I'm currently trying to rig up a Macintosh Plus I bought with a scsi2sd hard drive and internet connection. It's fun! But I've also discovered just how long a history Apple has of choosing "technically better, but much less well adopted" tech in their products. For instance:
- That RS-232 port in the back? Haha, nope! It's RS-422, which looks the same except it has a different pin layout. Adapters exist, of course, but 232 is much more common.
- 3.5" floppies - neat, I remember this form factor from my childhood fondly. Wait.. no, these are 400k or 800k special format floppies, which aren't interchangeable :(
I'm pretty close to getting a working dev environment on it though, and then the sky is the limit :)
Rather than assuming that choices were made to confirm your biases, you might want to take a look at what the engineering realities were when the choices were made.
For example, you write as though there was a standard for low-level formatting of 3.5in floppy disks when Apple made the switch from Twiggy to Sony drives for Macintosh, and as though Apple intentionally chose not to use it.
Furthermore, Apple’s choice of GCR encoding probably had to do with not changing the hardware design too much as part of the switch; using MFM would’ve required a different floppy controller and therefore also different signaling, etc. Engineering trade-offs, not some sort of “Let’s be different!” aspiration.
When Apple adopted high density floppies, there was a standard, so Apple used it, and the low-level format used (MFM) was in fact compatible across vendors. That’s why you can still read and write such disks in a USB floppy drive today.
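As a rough illustration of what the GCR scheme bought them: the 800K drives varied rotation speed so outer tracks carried more sectors than inner ones. A small Python sketch, using the commonly documented zone layout for the double-sided 800K format (treat the exact numbers as an illustration rather than a spec):

    # Approximate capacity of an 800K double-sided Mac GCR floppy:
    # 80 tracks per side, in five 16-track zones of 12, 11, 10, 9, 8 sectors.
    sectors_per_track = [12, 11, 10, 9, 8]
    tracks_per_zone = 16
    sides = 2
    bytes_per_sector = 512

    gcr_bytes = sum(sectors_per_track) * tracks_per_zone * sides * bytes_per_sector
    print(gcr_bytes)          # 819200 bytes = 800 KB

    # For comparison, a constant-speed 720K MFM disk:
    print(80 * 2 * 9 * 512)   # 737280 bytes = 720 KB

Same media, roughly 10% more capacity, at the cost of a drive and controller nobody else used.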
> Rather than assuming that choices were made to confirm your biases, you might want to take a look at what the engineering realities were when the choices were made.
Whoa, calm down there partner.
I wasn't trying to claim it was necessarily an intentional choice to buck the common trend at the time (although my word choice does sort of suggest that), but rather that, with the benefit of hindsight, it does fit the pattern.
I'm not super familiar with the tech climate at the time, but it seemed the IBM PC was pretty clearly going to be a big deal, and a lot of computer makers would clone aspects of the PC just for that fact. So, Apple at least chose not to do that.
Again, not trying to make any moralistic judgements here. I just thought it was interesting.
> but it seemed the IBM PC was pretty clearly going to be a big deal, and a lot of computer makers would clone aspects of the PC just for that fact
At the time, I don't think it was clear that the IBM PC was going to be a big deal. The PC was not IBM's first foray into personal computing, and its previous offerings in this area (such as the 5120 and the Datamaster) were not particularly successful. The PC was much more successful because IBM learned from its mistakes – it used mainstream/commodity parts in preference to IBM-proprietary ones, which saved it money, and then used that to enable a lower (and more competitive) price point. But were people expecting that from IBM? Probably not; it was a deviation from IBM's previous behaviour, and it came as a surprise.
Now, as 1982 and 1983 rolled on, it became increasingly clear that the IBM PC was a success. But by then, Macintosh development was already well advanced, and they'd already locked in to incompatible designs. Trying to go back and redesign for increased IBM PC compatibility would have just delayed them even further, and a further delayed Macintosh might not have been as successful. Plus, increased IBM PC compatibility might have meant things like 5.25 inch disks, which would have been less user-friendly. (PCs didn't get 3.5 inch disks until later.)
At the time, at least the Amiga and Macintosh both still looked like possible viable contenders, and part of the reason for making different choices was to justify buying a different platform.
If you first sacrificed PC compatibility, then you had very little reason to stick to e.g. PC format floppies if you could do better (the Amiga also had its own format - 880KB; though tools like "crossdos" let you mount and read/write PC formatted floppies). That accounts for a lot of the quirks of these platforms.
Another aspect was that the market was much more fragmented. The Mac "owned" the desktop publishing segment at one time due simply to QuarkXPress. The Amiga and Atari ST owned the 16-bit part of the hobbyist/game market in Europe, and the Amiga owned the video/graphics effects market for some time thanks to the Video Toaster etc. Interoperability was not as much of a concern yet. People were used to a highly fragmented market with no clear single winner, where you picked different platforms for different tools.
It was only a few years into the 90s that this really changed and the PC emerged as a clear enough winner that people started investing a lot in interop.
> but it seemed the IBM PC was pretty clearly going to be a big deal,
Maybe. But then we still had machines like the Amstrad PCW.
> Early models used 3-inch floppy disks, while those sold from 1991 onwards used 3½-inch floppies, which became the industry standard around the time the PCW series was launched. [...]
> All models except the last included the Locoscript word processing program, the CP/M Plus operating system, Mallard BASIC and the LOGO programming language at no extra cost.
Serial did imply RS-232 because it was an established industry standard, and by the 80s more than twenty years of hardware supported it. It's still in use today on technical equipment, although most of the world has moved on to its descendant - USB.
It's true that RS-422 was technically better, but it was only ever used in the Apple ecosystem.
Existing RS-232 hardware needed an adaptor or a complete redesign. This severely limited the appeal of the Mac in industrial automation, scientific control, and electronic development, and often made basic printing more complicated than it needed to be.
Realistically, most of those were never going to be prime Mac markets. Even so, sales and market niches were lost to the PC and to competitors like HP.
In the 80s a huge variety of systems used entirely different options. RS-422 looks weird today because the other weird options have died out or become even more niche.
E.g. Commodore used IEEE-488 variants on its 8-bit lines, and plenty of other interfaces were in use elsewhere. Peripherals were still commonly expected to be specific to a particular computer manufacturer's products.
In retrospect it's clear it wasn't ideal, but at the time RS-232 was just one more alternative that didn't really stand out as a winner yet.
It's my understanding that the reason to choose RS-422 (or 485) over 232 is long cable runs (say, over 10m). Not sure why this would matter much for computer peripherals.
It would make more sense if AppleTalk used it, but it used a common ground, which eliminates the benefits of the balanced signals.
It was a unified high speed interface that could do both. Farallon came up with cheap network adapters that allowed Macs to talk to each other (and a LaserWriter!) over local phone wiring.
That was huge -- plug and play networking for mortals.
What are you missing for a dev environment? Both MPW and Think C are available to download.
Slightly newer Macs are easier to use, particularly if they have Ethernet; you can run Netatalk on a UNIX system to provide a file server. You can use any 3.5" floppy in an FDHD drive.
Eh, in terms of floppies, it seems like everybody used different on-disk formats back in those days. Commodore had one, IBM had one, Atari, etc... All incompatible to some degree.
how long a history Apple has of choosing "technically better, but much less well adopted tech"
That's how I'd put it. When I found out that the replacement for their 3.5" 'SuperDrive' cost $400 (while functionally equivalent PC parts cost 1/10 of that), I started thinking of leaving Mac hardware behind. When they dropped serial ports I dropped them.
These old computers are very easy to work on and there are plenty out there in broken or semi-working state in need of rescue. Good work saving another classic from the scrapheap!
I’ve been collecting compact Macs over the last year or so. I’ve got 3 SE/30s and one SE.
My most recent SE/30 exhibited this failure, but came back to life after I repaired all the axial and SMD capacitors.
I currently have it working with 80MB of RAM, and a SCSI2SD.
It’s got the following OS installed:
* System 7.5.5
* A/UX 3.1
* NetBSD 6
* Debian Linux
I’m currently working with one of my electronics guru colleagues on designing a new PDS slot 100Mbit Ethernet card for the SE/30, using a modern Ethernet interface chip, since working Ethernet cards are getting hard to come by.
Hey! This "simasimac" thing appears to be exactly the problem I have with my SE/30.
I was planning to get A/UX on the thing a few years ago and see what I could still do with it when the failure appeared, and I kind of shelved the whole thing. Happy to have a log I can draw tips from.
I'm 43. I remember the launch of the Mac. And the Amiga. And the Commodore 64. I feel very old now.
Though not as old as when my 9 year old found out that I didn't have Youtube as a child and followed it up by asking if we had electricity back then...