30 years of C

30 years ago I learned C by reading the only book available on it, The C Programming Language by Kernighan and Ritchie. I didn’t have access to a Unix system (I’d accepted my first job, but it didn’t start for about a month) so I just read the book cover-to-cover and wrote out the exercises on paper. When I started the job I had a quarter-inch stack of paper to type in and try out.

Years later I got used to hearing people complain about how hard K&R was to learn from and I found it difficult to relate. C seemed to be designed for the way my brain worked; pointers were a natural expression of what the machine was doing, structs and unions seemed obvious, the control structures were simple, and only declarations were hard to understand.

My next readings after K&R were Kernighan and Plauger’s Software Tools and an illicit photocopy of Lions’ Commentary on Unix 6th Edition. In other words, I had a heavy dose of the Unix philosophy of development early in my career, and that pretty much ruined me for IBM shops and serious AI hacking.

My first “real” programming in college was in Pascal, which seemed horribly crippled and awkward to me (the typing scheme — especially around strings — was broken, the I/O stunk, and amongst a hundred other small irritations, you had to invent extra condition variables to exit loops). I read Kernighan’s paper Why Pascal is Not My Favorite Programming Language and heartily agreed with it. When I worked for Apple a few years later I was dismayed that it was a Pascal shop, until I found that Apple had extended Pascal with many of the things that C had, just so Pascal could be a real systems programming language.
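If you never suffered through standard Pascal, the loop complaint may sound abstract. Here's a rough sketch of the difference, transliterated into C (the function and variable names are mine, invented for illustration):

    /* The dance standard Pascal forced on you: no break, so you
       invent an extra condition variable just to leave the loop. */
    int find_with_flag(const int *a, int n, int key)
    {
        int i = 0;
        int found = 0;              /* the extra condition variable */
        while (i < n && !found) {
            if (a[i] == key)
                found = 1;
            else
                i++;
        }
        return found ? i : -1;
    }

    /* The same search in C: just leave when you're done. */
    int find(const int *a, int n, int key)
    {
        int i;
        for (i = 0; i < n; i++)
            if (a[i] == key)
                return i;           /* or break, if work remains after the loop */
        return -1;
    }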

----

My first experience with C++ was at Apple, using AT&T’s CFront, which was a preprocessor that converted C++ source code into vanilla C. My first C++ program wasn’t a “hello, world”; it was a simple use of multiple inheritance. It was too simple to fail, yet it kept blowing up. I took it to the local C++ guru (the guy who’d ported CFront to the Mac), who spent some time going over it, digging into the CFront output (which was a hash of generated code that looked a lot like Perl).

“CFront bug,” he finally declared. “It’s not moving the base pointer around correctly.” This was a harbinger: my very first program had found a bug in the compiler. Over my whole time at Apple I learned to mistrust whole swaths of C++ features, and I think this made me a better C++ programmer.
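That original program is long gone, but a minimal sketch of the kind of adjustment CFront was botching looks something like this (the class names are invented for illustration):

    #include <stdio.h>

    /* Hypothetical classes, just to get two base subobjects into D. */
    struct A { int a; };
    struct B { int b; };
    struct D : public A, public B { int d; };

    int main(void)
    {
        D d;
        B *pb = &d;     /* upcast to the *second* base: the compiler must
                           add the offset of the B subobject within D */
        printf("&d = %p, pb = %p\n", (void *)&d, (void *)pb);
        return 0;
    }

On a typical layout the two pointers differ by sizeof(A); a compiler that forgets that adjustment hands you a B* that actually points at the A subobject, and everything done through it is garbage.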

Over the years, I’ve found that the various teams I’ve worked with have converged on pretty much the same safe subset of C++ that I settled on, and a while ago I read the Google C++ coding guidelines and they were bang-on as well: no exceptions, “interface” inheritance only, no RTTI, no operator overloading, and so on. Templates changed the landscape here; I haven’t done any serious development with C++ templates, other than to make use of someone else’s class libraries, but in my opinion templates are pretty much a disaster unless you’re an expert. Template meta-programming is a great example of people being clever without being responsible.
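For concreteness, “interface” inheritance means something like the sketch below: abstract base classes that carry no data and no behavior, only pure virtual functions, with concrete classes implementing them. (The names are invented, and it’s written in the pre-C++11 style of the period.)

    /* An interface: no data members, just a contract. */
    class Renderer {
    public:
        virtual ~Renderer() {}
        virtual void drawLine(int x0, int y0, int x1, int y1) = 0;
        virtual void flush() = 0;
    };

    /* A concrete implementation; clients hold only Renderer
       pointers and never see the derived class directly. */
    class NullRenderer : public Renderer {
    public:
        virtual void drawLine(int, int, int, int) {}    /* deliberately a no-op */
        virtual void flush() {}
    };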

Most of the systems programming I’m doing now is in C, and I think that the simplicity of the language, and the inability to fool yourself into thinking that things are free, results in a better product. (You probably hear echoes of 1970s assembly-language programmers here; I’m aware of that.)

----

I spent about two years working in Java. When I started using C# at one point, I literally thought, “Oh thank God, the nightmare is over.” A lot of this was probably tools-based (I had a debugger that worked, and could handily work with projects even of substantial size, while the Java environments I’d had to use were buggy and usually collapsed around 20-30K lines of code). Some of the differences were in the support libraries; native interop was incredibly easy, for instance, and the I/O system wasn’t thread-centric and blocking. It wasn’t so much about the core language as it was about the support, and I can’t help but think that Sun continued to blow its opportunities here, for years.

----

I’ll leave you with three rules I’ve learned, the hard way. There are lots of things I could have included; these are the desert island ones:

1. Either leave the existing brace style the hell alone and live with it, or completely rewrite the code. Going the middle road leaves two unhappy parties; leaving it alone or replacing it leaves just one. And while you should err on the side of just living with what’s already there, you shouldn’t be shy about cleaning up a train wreck.

2. Good programs do not contain spelling errors or grammatical mistakes. I think this is probably a result of fractal attention to detail: in great programs things are correct at all levels, down to the periods at the ends of sentences in comments.

“Aw, c’mon! You’re kidding!” You might think that nit-picking like this is beneath you, whereupon I will start pointing out errors in your code. It was embarrassing the first couple times this happened to me.

3. Crack open your current project. Now, delete as much stuff as you can and still have it work. Done? Okay, now toss out more, because I know that your first pass was too timid. Pretend that you’re paying cash for every line of code. If your project incorporates code by other people, wade into their stuff, too. You’ll be amazed how much can come out.

Don’t leave unused code hanging around because it might be useful someday. If it’s not being used right now, remove it, because even just sitting there it’s costing you. It costs to compile (you have to fix build breaks in stuff you’re not even using), it costs to ignore it in searches, and the chances are pretty good that if you do go to use it someday, you’ll have to debug it, which is really expensive.

It’s tremendously freeing to zap the fat in a system, and after a while it’s addictive. Read all code with an eye towards “What is superfluous?” and you’ll be amazed at how much unused, half-written, and buggy crap there is loose in the world, and how much better off we are without it.