The topic for this episode is Niklaus Wirth’s A Plea for Lean Software.
The episode is sponsored by…your generous support. Head over to https://www.patreon.com/chironcodex/redeem/A31E3 to get a free month of Insider access to my Patreon, with my gratitude!
Links:
- Project Oberon
- Sophie Wilson: The Future of Microprocessors
- Type-checking for JavaScript: TypeScript and Flow
- Principia Softwarica
- Operating Systems: Design and Implementation
- The Various Meanings of Quality
- Follow me on BlueSky!
Episode Transcript
(Music plays)
Welcome to the Structure and Interpretation of Computer Programmers podcast, episode 56. This episode is about “A Plea for Lean Software,” an article that Niklaus Wirth wrote in the IEEE’s Computer magazine in 1995.
This episode is brought to you by you, the community who support my work through the Chiron Codex Patreon and your gifts to the Chiron Codex Ko-fi account.
Let’s talk a little bit about Niklaus Wirth’s career. He’s perhaps most famous for creating the Pascal programming language back in 1970. Pascal was designed to support structured approaches to programming, with procedures that operate on record types, and with pointers that enable dynamic data structures such as linked lists. Pascal’s type system includes features that have recently come back into vogue, including strong typing, where a pointer to one type is incompatible with a pointer to any other type.
I mostly encountered Pascal in my Amiga days, as a language for learning about computers and for writing simple applications. It was used a lot in teaching contexts, including my university computing labs, though by then the labs ran NEXTSTEP and I had long since learned how to write Pascal software. Pascal’s record type made it easy to create random-access binary files in the days when you didn’t have a separate database management system like PostgreSQL, or even an integrated one like SQLite, and Berkeley DB wasn’t yet widely used.
Pascal was adopted by Apple and augmented with object-oriented features, first as Clascal and then Object Pascal, and became the basis for the MacApp framework for building classic Mac OS software. Object Pascal itself was integrated into Turbo Pascal and Delphi, and was at the core of the rapid application development movement in the 1990s.
Wirth hadn’t stayed still, though: he added modules to Pascal to create Modula, then coroutines and other features to create Modula-2. In the late 1980s, he co-created Oberon, both a programming language that supports data type extension, in which new types declare how, or whether, they share the implementations behind their shared interfaces, and an operating system written almost entirely in the Oberon programming language. That self-describing design is a nod to the power of systems like Smalltalk, which Wirth had encountered during two sabbaticals at Xerox PARC. Before his 80th birthday, Wirth updated Oberon in 2013 to run on a CPU instruction set architecture that he had designed himself.
This brings us back to the article, because Wirth came up with three principles for software design based on his experiences with Jürg Gutknecht building Oberon. “A Plea for Lean Software” introduces these principles and derives nine lessons from them. Before I introduce those, the summary of the article that the magazine sub-editor put in a sidebar is a good description of what Wirth is looking for in so-called lean software:
“Software’s girth has surpassed its functionality, largely because hardware advances make this possible. The way to streamline software lies in disciplined methodologies and a return to the essentials.”
Wirth was frustrated that a text editor written in the 1970s used about 8 kilobytes of storage, while a text editor written in the 1990s used 800 kilobytes yet offered only the same capabilities. He described a law of software that we now call Wirth’s Law, but that he attributes in the article to Martin Reiser:
“Software is getting slower more rapidly than hardware becomes faster.”
Now I have two asides here. The first is that Sophie Wilson gave a great talk about the rate at which hardware has actually been getting faster, which has been decelerating for a long while now; there’s a link in the show notes. My second aside is that this situation, where Reiser’s Law has become Wirth’s Law, is one of the reasons I enjoy going back to these important texts on this podcast and on my blog. Working out who said the thing, and what they actually said, usually leaves us with a clearer idea of the thought they were trying to convey than the telephone game in which people redefine ideas in software to support their current way of working or to denigrate somebody else. In this case, Wirth is quoting a colleague he worked with directly, so it isn’t too bad, but eventually ideas seem to get homeopathically diluted into nothingness through retelling.
Okay, so no more beating about the bush. We promised three principles of software creation, and here they are:
- First, concentrate on the essentials. Oberon has a text user interface because its creators considered graphics and icons superfluous, contributing neither power nor flexibility. The deeper message is to identify the things that people need to do and commit only to delivering those things. If a small percentage of your users want to do some other things, make the system flexible and extensible so they can get that, but don’t make everybody manage those features mentally or physically.
- Second, use a type-safe object-oriented language. The benefits of type safety were a smaller team, fewer problems introduced during development and rework, and as a result faster progress overall.
- Third, flexible extensibility. Design a system so that new features can be added by creating modules that combine operations supplied in existing modules, or by adding new data types that are compatible with existing operations that work on existing data types.
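The third principle can be sketched in code. This is a minimal TypeScript illustration, not Oberon, and the `Shape`, `Circle`, and `Square` names are invented for the example: an existing module exports a small set of primitives and an operation over them, and a new module extends the system by adding a compatible data type without touching the existing code.

```typescript
// A hypothetical "core" module exporting a small interface primitive.
interface Shape {
  area(): number;
}

// An existing operation that works on any Shape.
function totalArea(shapes: Shape[]): number {
  return shapes.reduce((sum, s) => sum + s.area(), 0);
}

// A separate "extension" module adds new data types that are
// compatible with the existing operation, with no changes above.
class Circle implements Shape {
  constructor(private radius: number) {}
  area(): number {
    return Math.PI * this.radius * this.radius;
  }
}

class Square implements Shape {
  constructor(private side: number) {}
  area(): number {
    return this.side * this.side;
  }
}

console.log(totalArea([new Circle(1), new Square(2)])); // ≈ 7.14
```

The extension module depends only on the published interface, which is the property the third principle asks for.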
The article introduces nine lessons that Wirth and Gutknecht learned from their work on Oberon, which they contrasted with the way mainstream software development carried on, so let’s take a look at those lessons next.
(Music plays)
This is the advert break. It starts now.
This episode is brought to you by me, Graham Lee, but really by you. Chiron Codex is a community of people who are learning how to become better software engineers by adopting AI augmentation in a thoughtful way. We aren’t outsourcing our understanding to coding assistants like Claude or Codex, but becoming software engineering centaurs by using AI tools to improve our knowledge and the quality of our work.
Join the community over on Patreon to find out about interaction patterns that improve your work with AI coding tools, running LLMs for software development locally, discussions of recent research in the field, and more. If you’re a software engineer who’s interested in the promise of AI tools but skeptical about handing your skills over to the computer, this is the community for you. Go to patreon.com/chironcodex—that’s C-H-I-R-O-N-C-O-D-E-X—now for more information and to join.
Use the gift link in the show notes to get your first month of insider access completely free. Alternatively, you can show your appreciation by donating at Ko-fi, that’s ko-fi.com/chironcodex, K-O-hyphen-F-I dot com. Direct support by my audience is the only revenue I get for my work as a software engineer and communicator, so your support really means a lot to me and makes it possible for me to produce this podcast. Thank you so much.
That was the advert break. It’s over now.
(Music plays)
Wirth’s first lesson is:
“The exclusive use of a strongly typed language was the most influential factor in designing this complex system in such a short time.”
This is one lesson that I know hasn’t been universally learned. I’ve been on both sides of the argument, too. As somebody who really enjoys working with Smalltalk and Objective-C, I don’t feel less productive working with dynamic types. I also point out that Smalltalk has a single type, “object,” therefore all of its expressions type-check very nicely. Because I enjoy it, I might even be more productive in the sense that I’m willing to work more and procrastinate less because the work’s enjoyable.
However, I’ve also worked on JavaScript software where I’ve seen the problems caused when objects with incompatible shapes are only discovered at runtime. When I advocated incrementally adopting a type-checking mechanism to catch exactly those errors (I suggested either TypeScript or Flow), I experienced a revolt from the developers, with one of them saying that if they ever had to understand what covariance and contravariance are again, they would leave the company. I might suggest that they do need to understand those things, even if they choose to use tools that don’t surface them. I can believe that strongly typed languages give the benefits Wirth claims, while also believing that plenty of people remain unconvinced, including, hypocritically, myself at times. Given that Python and JavaScript are still two of the more popular programming languages in the world (and if you’re being uncharitable you might want to include C), there’s a lot of convincing to be done and a lot of inertia and potentially legacy code to account for.
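The incompatible-shapes problem is easy to show. This is a hypothetical sketch, with invented names, of the kind of bug TypeScript catches at compile time rather than at runtime:

```typescript
// Untyped JavaScript would happily pass an object of the wrong shape,
// and the failure would surface at runtime, possibly far from the call
// site. With a type annotation, the compiler rejects the bad call.
interface User {
  name: string;
  email: string;
}

function greeting(user: User): string {
  return `Hello, ${user.name} <${user.email}>`;
}

const ok = greeting({ name: "Ada", email: "ada@example.com" });

// The next call is a compile-time error under tsc, not a runtime surprise:
// greeting({ fullName: "Ada", email: "ada@example.com" });
//   error: Object literal may only specify known properties

console.log(ok);
```

Because TypeScript can be adopted file by file, it suits exactly the incremental approach described above.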
Lesson two:
“The most difficult design task is to find the most appropriate decomposition of the whole into a module hierarchy, minimizing function and code duplications.”
I think that this is an evergreen statement about software design. Refactoring, which—going back to the whole type safety debate—comes from the Smalltalk world, gives us a way to deal with this a piece at a time by applying incremental design repair. I don’t think that’s what Wirth had in mind, and indeed he has a section in the article explaining that developers never have enough time to do the efficient designs because we’re always pressured to add more features, further bloating already inefficient software. So maybe while I think of refactoring as a way to tidy as we go, he might have thought of it as a way to kick the can down the road.
That’s an important piece of context about this article in itself. Oberon wasn’t developed by a stealth startup running along on a small bit of seed funding. It was the work of a tenured professor and an assistant professor at ETH Zurich. I don’t know what the funding landscape was like for Swiss research institutes in the mid-to-late 1980s, but I do expect it was different from that of corporate software development, based on my experience in early-2020s academia.
Lesson three:
“Oberon’s type extension construct was essential for designing an extensible system wherein new modules added functionality and new object classes integrated compatibly with the existing classes or data types.”
In other words, having come up with a minimally duplicative set of expressive primitives, the way to keep the rest of the system efficient is to be able to design the correct compositions of those primitives into richer applications. The details of the paragraph on this lesson say “without access to the source code,” so this lesson is really about correctly designing interfaces so that people can see the expected way to use them and then use them in the expected way to achieve their own goals.
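Designing an interface for extension without source access can be sketched like this. A minimal TypeScript example, with invented names, of a system module that publishes only a contract and a registration point, so extenders never need its internals:

```typescript
// The "system" module publishes only an interface and a registration
// function; extension modules depend on this contract, not the source.
interface Command {
  name: string;
  run(args: string[]): string;
}

const registry = new Map<string, Command>();

function register(cmd: Command): void {
  registry.set(cmd.name, cmd);
}

function dispatch(name: string, args: string[]): string {
  const cmd = registry.get(name);
  if (!cmd) throw new Error(`unknown command: ${name}`);
  return cmd.run(args);
}

// An extension module, built separately, plugs in a new command
// that integrates compatibly with the existing dispatch machinery.
register({
  name: "echo",
  run: (args) => args.join(" "),
});

console.log(dispatch("echo", ["lean", "software"])); // prints "lean software"
```

The expected way to use the system is visible in the interface itself, which is the point of the lesson.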
Lesson four:
“In an extensible system, the key issue is to identify those primitives that offer the most flexibility for extensions while avoiding a proliferation of primitives.”
It’s entirely possible that I’ve misunderstood lesson three, because I thought that covered the sentence I just read out. Maybe I just inferred the existence of correctly designed primitives—the stuff of lesson four—in the need to have a good mechanism for safely composing them—the stuff of lesson three.
Lesson five:
“The belief that complex systems require armies of designers and programmers is wrong. A system that is not understood in its entirety, or at least to a significant degree of detail, by a single individual should probably not be built.”
This strikes me as one of those arrogant-sounding European computer science academic quotes to which Alan Kay would respond by citing his probably apocryphal paper entitled “On the Fact That Most Software is Written on One Side of the Atlantic.” First, it’s clear that one feature of the project Wirth and Gutknecht created that makes it easier for one or two people to understand is that it’s intentionally restricted in capabilities. It’s a system that you can build applications in, like Smalltalk, not a system you can apply to arbitrary tasks. Okay, we can still accept the point of this article that much software complexity is incidental and down to the way that software teams are run, rather than being an essential part of the software, and we can still maybe aspire to this goal. But we can also move the goalposts to fit or not fit this lesson at will, which makes the lesson difficult to apply. Does one person understand the Minix operating system or Plan 9? Definitely. In fact, someone recently published a complete guide to Plan 9, and Andy Tanenbaum has always done the same for Minix. How about a GNU/Linux operating system of your choosing? Probably not. How about the bits of the GNU/Linux operating system that implement the same behavior as Minix? Possibly, but is that a useful boundary?
If I’m going to come up with so many questions, I think I need to provide some kind of answer. Here I’ll appeal to Alan Kay’s idea of recursive modularity, and to Conway’s Law connecting software architecture with organization structure. Together they let me say that the only reasonable request Wirth can be making is this: the scope and design of a software system undertaken by any particular team should be comprehensible by a small number of people, and, taking the other lessons into account, it should have an extensible design with well-considered primitives and a flexible interface that allows a small number of other people to incorporate it into their own designs. Otherwise, this question degenerates into working out how many angels can dance on the head of a pin, or perhaps, closer to our topic, how many pins you can harmlessly stab an angel with.
Lesson six:
“Communication problems grow as the size of the design team grows.”
This is the topic of The Mythical Man-Month and the insight directly behind Brooks’s Law. The number of communication routes grows quadratically with the number of communicators, so adding people to a project increases the amount of communication on the project and slows down progress. Given that Wirth was writing in 1995, 20 years after Brooks published The Mythical Man-Month, and didn’t cite that book in this article, I think it both charitable and likely to assume that Wirth hadn’t read the earlier work and had independently observed this communication issue.
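The arithmetic behind Brooks’s observation is small enough to show directly: among n people there are n(n − 1)/2 possible pairwise communication channels, so the channels grow much faster than the headcount.

```typescript
// Pairwise communication channels among n people: n * (n - 1) / 2.
// Doubling the team roughly quadruples the channels.
function channels(n: number): number {
  return (n * (n - 1)) / 2;
}

for (const n of [2, 5, 10, 20]) {
  console.log(`${n} people -> ${channels(n)} channels`);
}
// 2 people -> 1 channel, 5 -> 10, 10 -> 45, 20 -> 190
```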
Lesson seven:
“Reducing complexity and size must be the goal in every step, in system specification, design, and detailed programming.”
I suspect that this is where Wirth found the greatest deviation between expectation and reality, because most people’s goal is to remain employed, and they translate that into whatever locally relevant outcome their business, university, research facility, or charity’s management needs. I can see three ways to make reduced size and complexity a more important goal than delivering profitable features, research outputs, or other organizational value.
The first is to remove the organization and make pursuit of minimalist software a hobby. Undoubtedly, many people in the free software world are doing this, and undoubtedly for the most part, that software isn’t going mainstream. When I think of the software people use to record and publish podcasts, for example, I don’t necessarily instantly think, “Oh yes, Audacity and WordPress, famously lean software.”
The second is to change the programmer-sponsor relationship from employee-employer to agent-client, become a licensed profession, make lean software a moral imperative, and threaten people with loss of license—i.e., with an inability to do the work—if they write software in another way. That’s such a revolutionary approach with so many downsides that I honestly can’t see it working.
The third approach is to rebalance the short-term and long-term needs of an organization and to create simple small programs because it’s in the best interests of the sponsors to have simple small programs. Small, to remain faster and more efficient than competing software; simple, to make future flexibility easier. This is the kind of principle upon which we can probably all agree, and yet in practice will rarely see enacted. For example, going back to that 8-kilobyte text editor that became an 800-kilobyte text editor: the built-in text editor on my computer now uses 48.9 megabytes with a single small file open, and honestly that’s more a result of creative accounting in the resource monitoring tool than an accurate reflection of the situation.
Lesson eight:
“Organizing a team into managers, designers, programmers, analysts, and users is detrimental. All should participate with differing degrees of emphasis in all aspects of development.”
This is a lesson that gets heard and ignored over and over again. One of the principles of agile software development is that the best architectures, requirements, and designs emerge from self-organizing teams, again observing that imposing an external organization on people is detrimental. Mob programming, which has been described as all the brilliant people working on the same thing at the same time in the same space and at the same computer, doesn’t even give managers the opportunity to corral people into specific roles.
But we need to understand a tension here. Lesson six tells us to minimize communication lines, and the goal of organizing people into roles is to design interfaces and activities that constrain data flow, taming the communication problem. If you saw a software system designed along mob-programming lines (all the brilliant functions working on the same thing at the same time in the same space and on the same computer), you’d think the designer had taken leave of their senses, and Wirth would certainly not believe his earlier lessons had been applied. However, if we accept that teams of humans are different from software, so of course you structure them differently, then we need to give up on Conway’s Law, the observation that a software architecture necessarily reflects the org chart. I haven’t yet participated in mob programming; indeed, it’s been six or more years since I’ve been in the same room as any of my colleagues, let alone all of them. People I’ve spoken to who have tried it are generally very much in favor of the practice, but I don’t have the experience needed to give a judgment. It does address lesson eight by removing the pigeonholes of different roles on the team.
Lesson nine:
“Programs should be written and polished until they acquire publication quality.”
Wirth goes on to say that this is infinitely more demanding than writing a program that runs, and that it “contradicts certain vested interests in the commercial world.” That makes it sound like he’s promoting a populist conspiracy theory in which hidden influences in commercial software stop people from writing publication-quality software, when the reality is that the commercial world doesn’t have infinite resources to throw at an infinitely demanding problem that doesn’t make it any money.
Unfortunately, this is the missing keystone that causes Wirth’s whole argument to collapse. He spends a page in the section on causes for fat software explaining why commercial software is bloated. Each customer uses a different subset of the features, so more features lead to more customers. Complexity gets mistaken for power, and time pressure pushes people to be first to market rather than to do their best engineering. He then gives an example of creating software without those pressures—the Oberon system which has no customers, a focus on simplicity, and no market to reach—and asks why everybody else can’t just do it like that. The missing piece is the part where either his suggestions are shown to remove the pressures that cause fat software, or at least to mitigate their symptoms. Either lean software makes it easier to add features, demonstrate power, or help people get to market quickly, or it completely upends market dynamics so that those factors aren’t important anymore. If neither of those is true, then this whole piece is nothing more than a call to adopt a principle because some Swiss academic would prefer it if you could all please do that, thank you very much.
Software is a very broad and inclusive discipline with low entry requirements. Indeed, I and many people in my generation had no relevant qualifications other than an interest in making computers do things and a stubborn determination to type in programs that were shared in magazines. As such, trying to instill a universal discipline is doomed to failure, especially one that works against getting software out to market. I’ve made the case before in the De Programmatica Ipsum issue on quality that software is a lemon market. Nobody knows what quality your software is before they try it, so nobody has any way to evaluate software by quality, so nobody believes software has any quality, which pushes prices down, which pushes affordable costs down, which pushes quality down, which pushes belief about quality down, and so on. That’s why the pressures Wirth discussed in 1995 still apply today.
This situation hasn’t stopped people from hoping that everybody would just adopt one simple trick that merely requires continuous discipline and perhaps a degree in mathematics: from Dijkstra, to the Cambrian explosion of monad tutorials in the 2010s, to the Rust Evangelism Strike Force. Very infrequently, a particular practice becomes a meme and gets sort of somewhat maybe adopted broadly, in a way that its originators wish it hadn’t been. Find me an agilist who’s happy with the way software companies do “agile”—and yes, I used scare quotes there—or the way everybody does TDD—scare quotes again—or who doesn’t mind that Smalltalk isn’t more widely adopted. These people go on the circuit giving conference talks saying, “No, what we meant was,” and they give them at the kinds of conference that will book them, which are the kinds whose audience already believes the thesis of the talk, so not the people who allegedly need to hear the advice. They post snarky articles on LinkedIn about how nobody else in the software industry truly gets it and how you should hire them if you want to be one of the few teams who do software properly. I know this because I was that person, particularly during my burnout phase a little over a decade ago; in some ways, I still am.
Thank you for listening. Please find one other person who would enjoy this podcast, tell them how much you liked it, and share a link to the podcast with them. If you’re able, support the podcast and my other work sharing software engineering information and insight on Patreon or Ko-fi. You can follow me on BlueSky at iamleeg.bluesky.social—that’s I-A-M-L-E-E-G dot bluesky dot social—or email me: grahamlee@acm.org. Thank you again, and I will talk to you soon.
(Music plays)


