Revolutions That Weren’t

Bubble Memory

Bubble Memory was going to change the face of personal computing. Well, this was the late 70s, and we weren’t exactly at personal computing yet. If you were computing at all, you were likely an odd duck like me; a computer hobbyist, emerging from the basement to announce victories involving something in assembly language for a home-brew machine cobbled together from 7400-series TTL and a chip from a company named Zardoz. Bubble memory was going to change all of that stuff and revolutionize the heck out of those still-puzzling personal computers, once we figured out what personal computers were good for besides playing Hunt the Wumpus in BASIC.

The late 70s struggle for storage was as weird as the pants we were wearing then. Bubble memory was, well, bubbles, circulating around racetracks etched on a chip. Think: mercury delay lines, but in silicon. Think (though it was in the future) of that wacky Sir Clive Sinclair and the recirculating train wreck that was Floppy Tape. In those days, 256K of non-volatile memory that you could hold in your hand was a big, big deal. So was getting a date, or getting a successful program save on your cassette tape recorder.

When largish EEPROMs appeared, alongside MFM hard drives that didn’t suck, Bubble memory largely vanished. So did our funny pants.


Prolog – The Architecture of Fear

Remember that time in the 1980s when the Japanese and their super-computing AI machines rose up and just steam-rollered the American software industry? Politicians were up in arms about the “Inference Engine Gap,” and everyone was buying Clocksin and Mellish’s book Programming In Prolog, and Shapiro’s Art of Prolog, and Borland got into the act with a cheap and schlocky thing called Turbo Prolog, and even Microsoft dusted off MuLisp and MuStar and made them run on then-modern hardware (that meant Intel 286 chips, but hey, it was the eighties).

Remember all those PhDs in Artificial Intelligence that were minted on the wave of fear just before the fast, smart machines arrived from the Orient and turned Silly Valley back into orchards? It got so bad that start-ups began putting “No Procedural Programming Experience Desired” in their job ads. If you used FOR loops or could spell Pascal you were burger-flipper material. If you knew what CONS did (or could even hum it a little) and could describe a “cut” you were in golden handcuffs and chained to a keyboard before the interview was over.

Remember the boom times of Artificial Intelligence?

Me, neither. Though there was a lot of talk about Japan’s Fifth Generation of computing, somehow it fizzled.

I still have that copy of Clocksin and Mellish, though I never used Prolog for anything interesting. And I ran across that copy of Turbo Prolog a little while ago when I was cleaning out crappy old computer books (101 Computer Games in BASIC, natch); TP was still just as miserable, only it wouldn’t run on any hardware I had. Possibly an improvement.

Lest you think I’m a Prolog hater, the book on the Warren Abstract Machine (or WAM) is a fantastic tutorial treatment of how to design a VM to support a logic language. It’s available online.


Ex Em Hell

It’s everybody’s favorite punching bag. Delivered direct from the heavenly spaces one IP address short of Digital Nirvana, XML was going to be the bread and jam and toast of the Millennial Computing Revolution. XML was going to cure insomnia and halitosis, raise the dead, fill our cavities and walk the dog. It was going to be fucking great and all you had to do was learn how to type < and > and know where to put the slashes.

Any start-up could get a truckload of cash just by mentioning XML in their technology brief. Soon the valley was full of people who could type < and > and knew more or less where to put slashes. If you weren’t availing yourself of this digital nectar you were on your way to Dog Town and a career swabbing out spool directories with sed(1).

What we actually got: Any number of crappy serialization schemes and over-designed and under-implemented replacements for INI files. Undebuggable configuration files, poorly written attempts at replacements for already perfectly awful tools (yes, ANT and MSBuild, I’m thinking of you), and a lot of other smelly garbage littered with angle brackets. After weeks of wading through torturous and undocumented XML schemas we were sore tempted to put the slashes in our own wrists.

I worked with an XML luminary who had been involved in the standardization process. I think he claimed direct lineage all the way from the original Ampersands of Aulde Anglebargle back in Ye Old Country, where they quarried data structures with expendable mules and touchy explosives.  He barely showed up for work at the start-up he was supposed to be working at — he was more like seasoning at this company than an actual employee — and when I had questions for him, he was kind of an asshole. I was expendable, the company exploded. There you go. Watch out for falling mules.

We are still living this nightmare, with no end in sight.


“What’s that mule doing up there?”

Author: landon

My mom thinks I'm in high tech.

20 thoughts on “Revolutions That Weren’t”

  1. In between Prolog & XML, don’t forget about my favorite whipping boy – UML. Talk about a useless, overblown, steaming turd!!!

    If I see one more “use case” diagram with a fucking stick figure and an ATM, someone’s getting hurt.

  2. The only thing I thought was helpful about XML was that it was the “Stone Soup” that people made things out of. They could build a serialization, a remote procedure protocol, a syndication format, a data transform language that could be manipulated with the same data transform language; whatever the idea. Using XML didn’t actually get them any closer to realizing their goal, but because they had XML they were thinking in terms of these interchange goals.

    OK, maybe two things were helpful about XML. Encouraging the use of a data format that requires a character set definition forces the person I’m getting the XML from to think about the character set. Or at least gives me something to point to when they don’t. (You can ask them, “Do you mean ASCII, UTF-8, iso-8859-1 or Windows-1252? If you mean ASCII, then that shouldn’t be there. If you mean 8859, then you can’t use these characters, but if you mean Windows-1252 then those characters probably don’t mean what you think they do.” Sometimes it’s all four in the same file, depending on what data sources things came from.) When they’d give me ambiguous data in a CSV file, I couldn’t point to a precise reason why it was wrong. Now that the file they give me says “charset=”, I can send it back and tell them to fix it.

    It’s also interesting that of the three examples you gave, there were three different reasons they faded. Bubble memory faded because other things improved at a faster pace, and so pushed it out. Prolog didn’t catch on because the problem it was supposed to solve was never as big as it was hyped. Of course, as you implied, the final chapter on XML has not yet been told, but it will eventually look as funny as the ’70s pants.
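The character-set ambiguity described in the comment above is easy to demonstrate: the same byte sequence decodes to different characters (or fails outright) depending on which encoding you assume. A minimal Python sketch (the byte values are illustrative, using 0x92, Windows-1252’s curly apostrophe):

```python
# One byte, four interpretations: 0x92 is cp1252's right single
# quotation mark, an obscure control character in iso-8859-1,
# an invalid start byte in UTF-8, and out of range for ASCII.
data = b"it\x92s"

print(data.decode("cp1252"))      # what the sender probably meant: a curly apostrophe
print(data.decode("iso-8859-1"))  # decodes without error, but to a control character

try:
    data.decode("utf-8")
except UnicodeDecodeError as err:
    print("not UTF-8:", err.reason)

try:
    data.decode("ascii")
except UnicodeDecodeError as err:
    print("not ASCII:", err.reason)
```

So a declared `charset=` at least gives the recipient something concrete to check the bytes against, which is the commenter’s point.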

  3. @Nestroy: Except that Erlang seems to be useful for something, straight out of the box. (And its bit-extraction stuff is positively inspired).

    Armstrong’s book _Programming Erlang_ reminded me a lot of K&R’s _The C Programming Language_ — one of the best “here’s a language” introductions I’ve read in quite a while.

  4. I’ve always had an aptitude for picking up programming languages, with the exception of two – COBOL and Prolog. There’s something in my brain that refuses to make sense of the weirdness.

    I also remember the XML craze starting and trying to work out what exactly it was. It’s a system designed to slow modern computers down, right? Force my multi-gigahertz machine to sit processing human-readable text so it can extract the base64-encoded binary data it really wants. While validating its schema against a DOM nobody bothers to write.

    What about UML? That was a growing craze while I was at university trying to not learn COBOL.

  5. Oh XML. Those who don’t understand it are doomed to misuse it. (Yes, I’m still bitter at Xft/fontconfig turning a pretty reasonable config file in v1 into xml mush in v2. But this is only because they insisted on blurring my fonts to hell–I didn’t want antialiasing on my CRT because the tube largely did it for me. I wouldn’t have even noticed if I hadn’t needed to fix that.)

    Anyway, my two favorite pieces on XML are mine (natch) and Slava Akhmechet’s “The Nature of Lisp”. In the end, ‘revolution’ is pretty oversold, along with the Semantic Web that XML was going to enable.

  6. I feel compelled to point out that once upon a time you were convinced XML would save us all. Granted, that was in the days of QuickServer value objects, and XML is demonstrably better. But I was the one telling you XML was not going to fix things, that in fact it would make it worse. And here we are. Nostra-flippin-damus.

  7. Konami actually came up with an interesting use for Bubble Memory! The only problem was, the limitations of Bubble Memory fit very poorly into arcades, but they managed to make a few games on it, and two of them were good enough to reprint on standard ROM! (Gradius, Twinbee)

    That being said, the machines took 25 minutes to start. It wasn’t a good thing.

  8. I worked with bubble memory on an 8088-based STD-bus system back in the 80s. The boss decided to try it because it would be more robust than floppies in an embedded environment. Speaking of floppies, I probably still have one of the old 3″ (not 3.5″) floppies hanging around in a box somewhere.

    I’ve still got my Clocksin & Mellish somewhere, but I never got involved with XML.

    1. IT projects that fail live on forever, as someone who oversold it in the beginning tries to salvage something usable to save themself.

      Successful projects get ignored, forgotten, then eliminated.

  9. Not just Prolog but AI in general was the Next Big Thing. It was the rage in universities. We all know how that turned out. No one could make sense of it.

  10. Look around, XML is everywhere now – web pages, window layouts, IDE project files, data interchange files between separate systems. XML worked out. The declared promises worked out. The Earth became a better place. How dare you call XML an over-hyped buzz-thing.

    XML is here and it is for long.

    I wonder how an experienced guy like you can make such arrogant statements about a nice technology that changed the world for the better.

  11. @Anon: I hate XML for what it has been mis-used for. It’s a markup system that got used as a data transport system. It’s /terrible/ at that.

    We should be using JSON or BSON for most of what people are using XML for. XML /sucks/ at representing rich data and things that aren’t trees.

    I might be arrogant, but I’m also right. XML might be here to stay, but it’s still terrible. We can do better . . . and we have.
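The fit argument in the comment above is easy to see with nothing but stdlib tools: JSON round-trips basic types directly, while schema-less XML hands everything back as text and leaves the typing to you. A small Python sketch (the record shape is made up for illustration):

```python
import json
import xml.etree.ElementTree as ET

record = {"name": "bubble", "year": 1979, "shipped": False}

# JSON: numbers and booleans survive the round trip unchanged.
assert json.loads(json.dumps(record)) == record

# XML: build an equivalent document by hand...
root = ET.Element("record")
for key, value in record.items():
    ET.SubElement(root, key).text = str(value)
doc = ET.tostring(root, encoding="unicode")

# ...and everything comes back as strings; nothing in the document
# says that 1979 is a number or that False is a boolean.
parsed = {child.tag: child.text for child in ET.fromstring(doc)}
print(parsed)  # {'name': 'bubble', 'year': '1979', 'shipped': 'False'}
```

None of which makes XML useless as a markup language; it just illustrates why pressing it into service as a general data-transport format meant bolting type information back on via schemas.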

  12. Ancient thread, arise!

    Somewhere around 1993, I was studying knowledge-based systems for 5th year Higher Computing, in a secondary school in Scotland. While all around the country kids my age were writing COMAL, my teacher, being the forward-thinking chap he was, made it possible for us to spend the entire year writing PROLOG. Wasn’t that nice?

    This was the future! We were being prepared for the wave that was about to hit the computing industry! It never appeared. I went back to writing 68k and later x86 assembler and C to make pretty pictures on the screen.

    Still…it’s nice to learn something new and different once in a while.
