Foreword from GNU Autotools book

May 8, 2023

In 2001, New Riders published a book called “GNU Autoconf, Automake and Libtool” by Gary Vaughan, Ben Elliston, Tom Tromey, and Ian Lance Taylor. They asked me, the originator of Autoconf and Automake, to write the Foreword, which I felt honored to do. I had handed off development of that software by then, so I took the opportunity to look back at how far we had come, hoping to help guide the future.

In the more than two decades since then, that software has settled into maintenance mode and other tools such as CMake have become increasingly popular, so many younger developers probably haven’t seen that book. I still like what I wrote as a record of that era, so here it is. The editors used it verbatim, aside from omitting my suggested title for it, Magic Happens Here.

Foreword

Do you remember the 1980s? Veteran users of free software on Unix could testify that though there were a lot of programs distributed as source code back then (over Usenet), there was not a lot of consistency in how to compile and install it. The more complicated a package was, the more likely it was to have its own unique build procedure that had to be learned first. And there were no widely used approaches to portability problems. Each software author handled them in a different way, if they did at all.

Fast forward to the present. A de facto standard is in widespread use for solving those problems, and it’s not just free software packages that are using it; some proprietary programs from the largest computer companies are built using this software. It even does Windows.

As it evolved in the 1990s it demonstrated the power of some good ideas: sharing expertise, automating repetitive work, and having consistency where it is helpful without sacrificing flexibility where it is helpful.

What is “it”? GNU Autotools, a group of utilities developed in the 1990s for the GNU Project. The authors of this book and I were some of its principal developers, but it turned out to help solve many other people’s problems as well, and many other people contributed to it. It is one of the many projects that developed by cooperation while making what is now often called GNU/Linux. The community made GNU Autotools widespread, as people adopted it for their own programs and extended it where they found that was needed. The creation of Libtool is that type of contribution.

Autoconf, Automake, and Libtool were developed separately, to make tackling the problem of software configuration more manageable by partitioning it. But they were designed to be used as a system, and they make more sense when you have documentation for the whole system. This book stands a level above the software packages, giving the expertise of its authors in using this whole system to its fullest. It was written by people who have lived closest to the problems and their solutions in software.

Magic happens under the hood, where experts have tinkered until the GNU Autotools engine can run on everything from jet fuel to whale oil. But there is a different kind of magic, in the cooperation and sharing that built a widely used system over the Internet, for anyone to use and improve. Now, as the authors share their knowledge and experience, you are part of the community, too. Perhaps its spirit will inspire you to make your own contributions.

David MacKenzie
Germantown, Maryland
June 2000

Interview from 2002

May 5, 2023

This is a repost of an interview with me for a web site called “GNU Friends”, conducted by the president of a Swedish foundation called FFKP, whose name in English means “The Society for Free Culture and Software”. The original article is now available only on the Wayback Machine.

Jonas Öberg, the FFKP president, did a series of interviews with prominent Free Software figures in 2002.

Here’s an interview with David MacKenzie, who’s been involved with many GNU projects over the years.

David, you’ve been involved in quite a few GNU projects over the years in one way or another (textutils, fileutils, shellutils, termutils, cpio, sed, patch, autoconf, and probably more that I have forgotten). A lot of this has to do with the fact that you were employed by the FSF to do just that. How was it that you became an employee with the FSF and how long were you with them?

When I was at St. Olaf College, I became good friends with Mike Haertel, a classmate who is a brilliant programmer. We shared an apartment in the summer of 1987 while working as programmers for the college on their VAX, PDP-11s, Sun 3s, and PCs. When Mike discovered Emacs and GCC, he got in touch with Richard Stallman, who hired him to work for the FSF the next summer. Mike wrote GNU grep, diff, sort, and some other programs, all big improvements on their Unix counterparts. Mike told me about the GNU Project and I got an account on their computers.

Meanwhile, I moved back to Maryland, where I’d grown up. I started working for the Environmental Defense Fund in Washington, DC as an administrative assistant and systems administrator for their Charles River Data Systems computer, which had a bunch of Wyse 50 terminals connected to it and was running their bastardized version of Unix System V called Unos. Later we replaced it with a 486 PC running SCO Xenix, then SCO Unix. After using 4.3BSD at St. Olaf, I found that there were all sorts of utilities and features missing in these versions of Unix, so I started writing better replacements to make my own job easier. (At that time, the BSD source code was covered by AT&T’s Unix license; it wasn’t freely available. And the 4.3BSD utilities had their own set of problems and limitations I wanted to improve on.) EDF, not being in the software business, allowed me to donate my programs to the GNU Project. I found rough versions of some other utilities in the FSF’s source tree and also fixed them up or finished them. Eventually, I had several good quality collections of related utility programs (fileutils, textutils, shellutils), so I started releasing them on the GNU Project’s FTP server.

Since I’d done a lot of good work as a volunteer, Richard Stallman emailed offering me a summer job working for the FSF, as they had the budget to hire several more programmers. When I went back to school, at the University of Maryland, I continued volunteering for the FSF. At one point I think I was maintaining most of their utilities. Later I got a job with Cygnus, working on the GNU binutils and writing Autoconf version 2 to meet their cross-compiling needs.

Were you involved in Free Software before you joined the FSF, and if so, what was your motivation for devoting time to writing Free Software?

St. Olaf was where I discovered Usenet and the sources newsgroups, and learned C. I might have posted some dinky little programs to Usenet before I started contributing to GNU, but I don’t remember. I remember releasing some MS-DOS utilities I’d written onto a BBS network in the mid-1980s. One package, called Unixpg, was actually a set of simple versions of some Unix utilities. I think I put those into the public domain. Richard Stallman was the one who raised the consciousness of many of us about what we were doing and its potential, and I started contributing to the GNU Project as soon as I had something they could use.

What do you think others’ motivation was at that time to become involved in writing Free Software for the GNU Project and how do you think this has changed over the years?

I don’t think the motivation has changed significantly. If you’ve read Steven Levy’s book Hackers, you know the reasons. Hackers want to make the computer cooler and better, and they can’t do that as well, if at all, if source code isn’t freely available. And we’re curious and love to explore and learn how things work. In my free time these days, I enjoy reading howstuffworks.com.

If we look at GNU autoconf, you worked with that software from the very first version up until 2.12, released around 1996. How long did it take you to complete autoconf 1.0 (released in July 1992) and what do you think of autoconf development today?

Autoconf began in 1991 as a way to help me compile the GNU fileutils on several different Unixes that needed different compilation options. I was tired of editing Makefiles several times a day. Version 1 mostly grew by working an hour here, an hour there as part of developing and porting the utilities packages. In version 2 it became a real project (that is, time sink!) of its own. (Kind of like Perl was originally a tool to rewrite rn in?) I worked on that almost full time for over a month to bring it to the point where it could replace the old Cygnus configure, and I did a real design and manual. Eventually I got burned out on keeping up with other OSes’ bugs and limitations, so I passed the work along to others who were still interested. One of the last things I did was prototype Automake and give some suggestions to the people who implemented it for real. I’m glad people are still keeping that software useful and up to date, and I’m glad I don’t have to do it.
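
For readers who never used it: Autoconf’s input is a short file of m4 macros, configure.ac (called configure.in in those days), which the autoconf program expands into a portable Bourne shell configure script. Here is a minimal sketch in today’s syntax, with a made-up package name and checks chosen only for illustration:

    # configure.ac -- minimal illustrative sketch, not from any real package
    AC_INIT([hello], [1.0])
    AC_CONFIG_HEADERS([config.h])   # collect the HAVE_* results in config.h
    AC_PROG_CC                      # find a working C compiler
    AC_CHECK_HEADERS([unistd.h])    # define HAVE_UNISTD_H if the header exists
    AC_CHECK_FUNCS([strerror])      # define HAVE_STRERROR if the function exists
    AC_CONFIG_FILES([Makefile])     # generate Makefile from Makefile.in
    AC_OUTPUT

Running autoconf on that file produces a configure script thousands of lines long; running ./configure then performs the probes on the build machine and writes out Makefile and config.h, so the package’s C code can test HAVE_UNISTD_H instead of guessing about the host system.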

Over the years, some people have suggested that autoconf should be scrapped and a new system should be created. What do you think of that? Can you imagine any other configuration tool being as popular as autoconf, and do you have any ideas as to what it might look like?

Ten years ago, many Unix users only had 9600 bps dialup connections to Usenet and UUCP. There were no free PC operating systems, POSIX-compliant systems, or package management systems. Developers had to write to a bare-bones, least-common-denominator system installation. If you required all these external packages in order to compile and install your little package, it took too long to get them, and people got frustrated and stayed away.

Nowadays, everyone’s on the Internet and many have broadband, everyone has Perl 5 and many have Python, there are automatic dependency-following update systems, and everyone has POSIX and Standard C and C++. Recent Autoconf releases have stubbed out some of the tests for old limitations that you might as well assume don’t apply anymore, to make it run faster.

I and others have learned a lot since then about configuration management and software design. I found Bertrand Meyer’s book Object Oriented Software Construction to be the most eye-opening thing I’ve read in the last 5 years. It’s a pity that SmallEiffel doesn’t have a rich library built up the way CPAN has; if it did, I’d probably be using it.

In Autoconf 2 I improved the orthogonality and extensibility quite a bit over version 1. The demands put on it, as well as the better operating environment that we can assume now, make it past time for a version 3, but I’m not very interested in doing it myself. I followed the Software Carpentry contest as best I could, and saw that nothing much came of it. I agree that Python would be a good language to write a new version in, though Perl is more universally installed. I don’t mean replacing m4 with one of those languages, by the way; I mean the configure scripts themselves could now be written in a decent object-oriented scripting language instead of 7th edition Unix Bourne shell. I didn’t have the luxury of requiring even Perl 4 ten years ago, though I would have liked to. With the dependencies in a system like Gnome, people now accept the need to have a framework of support libraries and tools installed.

There are also ways the site preferences (default options) handling could be improved; people turn out not to want to maintain config.cache files directly, which was the mechanism I put in. Something like a “config.opts” file containing additional command line arguments for configure to process would be a good thing. Some Autoconf-based packages have independently implemented mechanisms like that already.
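
To make the idea concrete, here is a hypothetical sketch of such a wrapper; the file name config.opts and the script itself are only an illustration of the mechanism described above, not an existing Autoconf feature:

    #!/bin/sh
    # Hypothetical sketch of the "config.opts" idea: collect extra
    # arguments from a config.opts file, one option per line, and pass
    # them to configure ahead of anything given on the command line.
    opts=
    if test -f config.opts; then
      opts=`cat config.opts`    # relies on shell word splitting, so it
    fi                          # assumes no spaces inside a single option
    exec ./configure $opts "$@"

With a config.opts containing, say, --prefix=/opt/gnu and --with-gnu-ld, running the wrapper plain would behave as if those options were typed on every configure run.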

Adding more interfaces is also a good direction to go in, like the Linux kernel’s “make xconfig” and “make menuconfig”. I may have been too much of a systems programmer to see the value in that ten years ago. Plus, it would add a lot to the code size, which would have been unacceptable then.

As to the issue of replacing or integrating with Make or Automake, it would be nice to have a cleaner system for that too, but I suppose it would be best to support both the current tools and better ones. That’s usually the GNU Project’s approach: be backward compatible, but offer improvements.

Speaking of Free Software; is there anything in particular about the work you’ve done that you’re proud of, or something you’d rather forget ever having done?

It’s fun to realize that I’ve helped millions of people on every continent have better computer systems, and probably inspired some to make contributions to free software themselves. I don’t regret anything about it.

You first got involved in computing at St. Olaf College in Minnesota, is that right? What was it about computers that you found so interesting?

I got involved with Unix there, but I started out in junior high school with 8 bit computers in 1980, programming in BASIC and assembly. At first I wrote a lot of video games, and then I started disassembling and extending the operating systems. Programming is like solving a giant ever-changing puzzle. When you solve one part, you see that another part has grown and become more interesting because of your increased knowledge and awareness.

You now work for Zero Millimeter as a consultant in Washington D.C. Could you tell us something about your work there?

Zero Millimeter is a group of engineers who met at the University of Maryland while working there as student programmers, all went to work for UUNET, and then left in early 2001 to get away from the huge corporation. We do network design and server optimization work, and we can do a lot more if desired. We also contribute bug reports and patches to Mozilla, Apache, Grip, and other free software that we use. Mostly we work on BSD and Linux systems, though we touch Windows when we have to. Or to play Civilization III :-).

When you’re not at work, one of the things you do is play the guitar and do sound engineering. Among other things, you’re part of the five-member garage band Ellipsis, though you never got around to releasing an album. You have, however, released some solo albums, is that right?

If you stretch the term, you could call them that. I made some indie tapes and CDs for friends, including a couple of Ellipsis jams.

What kind of music do you prefer, both to listen to and play yourself? And if someone wants to know what your own music sounds like, where would they turn to get their hands on your recordings?

There’s a lot of rock in my collection, especially progressive, but I also love Celtic, Appalachian, and Scandinavian folk music, ambient and techno, classical and Baroque, big band, small jazz combos, world music, hip-hop that doesn’t have obnoxious lyrics… what did I leave out? Two of my favorite artists are Phil Keaggy (guitar-based instrumentals) and Weird Al Yankovic. I love music that sets a mood and grooves. Right now my own music isn’t available anywhere, though later this year I might try to get some MP3s online. I just got my home recording studio put back together again months after moving.

Fast, or Floppy?

October 16, 2019

In 1999, I had a DEC Celebris mini-tower for my desktop PC at UUNET. It had a Pentium Pro 200 MHz CPU, which was top of the line when the PC was purchased in 1996. But after several years, it had been far surpassed by newer PCs. PC hardware was advancing rapidly then, and the useful life of a PC was only 3-4 years before it was obsolete.

[Photo: catapult, the DEC PC under Dave’s desk]

I was managing a group of web hosting developers, but I didn’t have the budget for buying new desktop PCs. My biggest problem with my PC was that it was too slow to play MP3 files while doing anything else, and those were all the rage then. One of my co-workers had discovered Napster.

Then I learned about a CPU upgrade that adapted a Pentium II-based Celeron CPU to a Pentium Pro socket: a PowerLeap PL-PRO/II at 667 MHz, more than doubling the CPU power. I ordered a couple. A week later, they arrived at the office.

The Celebris parts weren’t fully interchangeable with ATX PCs. The CPU was on a horizontal daughterboard and the screws and plugs and I/O plates were all a little different from the standard. But the upgrade CPU fit. Except… it had a big heat sink and cooling fan that needed to be right where the floppy drive was on the Celebris. On a standard ATX PC the CPU wouldn’t have had a clearance problem.

This was before USB and CD burners became widespread, so we actually used the floppy drives. I had a choice to make: fast, or floppy? I removed the floppy drive. I didn’t have a blank cover for the hole left by the floppy drive, so it became a large ugly air intake. That’s a price I was willing to pay. My PC was useful again! And I didn’t have to reinstall or debug any software! If I ever needed the floppy drive to reinstall an OS, I could open the case door and hang it off the ribbon cable temporarily.

I installed the other upgrade CPU in a rack-mounted Celebris that needed more speed. In the next fiscal year, we bought some new Pentium III PCs, and we got spare Sun workstations running Solaris to augment them, but my upgraded DEC Celebris got me through a tight budget with only a little hassle. I’ve always hated to waste things that could still be useful. More than a decade later, I threw out my floppy disks and removed the last floppy drive from one of my PCs.