If programmers were just more disciplined, more professional, they’d write better software. All they need is a code of conduct telling them how to work like those of us who’ve worked it out.
The above statement is true, which is a good thing for those of us interested in improving the state of software and in helping our fellow professionals to improve their craft. However, it’s also very difficult and inefficient to apply, in addition to being entirely unnecessary. In the common parlance of our industry, “discipline doesn’t scale”.
Consider the trajectory of object lifecycle management in the Objective-C programming language, particularly the NeXT dialect. Between 1989 and 1995, the dominant way to deal with the lifecycle of objects was to use the +new and -free methods, which work much like malloc/free in C or new/delete in C++. Of course it’s possible to design a complex object graph using this ownership model; it just needs discipline, that’s all. Learn the heuristics that the experts use, and the techniques to ensure correctness, and get it correct.
But you know what’s better? Not having to get that right. So around 1994, new tools arrived that did it an easier way: reference counting. With the NeXTSTEP Mach Kit’s NXReference protocol and OpenStep’s NSObject, developers no longer need to know when everybody in an app is done with an object in order to destroy it. They indicate when a reference is taken and when it’s relinquished, and the object itself can tell when it’s no longer used and free itself. Learn the heuristics and techniques around autoreleasing and unretained references, and get it correct.
But you know what’s better? Not having to get that right. So a couple of other tools were introduced, so close together that they were probably developed in parallel[*]: Objective-C 2.0 garbage collection (2006) and Automatic Reference Counting (2008). ARC “won” in popular adoption, so let’s focus there: developers no longer need to know exactly when to retain, release, or autorelease objects. Instead of describing the edges of the relationships, they describe the meanings of the relationships and the compiler will automatically take care of ownership tracking. Learn the heuristics and techniques around weak references and the “weak self” dance, and get it correct.
[*] I’m ignoring here the significantly earlier integration of the Boehm conservative GC with Objective-C, because so did everybody else. That in itself is an important part of the technology adoption story.
But you know what’s better? You get the idea. You see similar things happen in other contexts: for example C++’s move from new/delete to smart pointers follows a similar trajectory over a similar time. The reliance on an entire programming community getting some difficult rules right, when faced with the alternative of using different technology on the same computer that follows the rules for you, is a tough sell.
It seems so simple: computers exist to automate repetitive information-processing tasks. Requiring programmers who have access to computers to recall and follow repetitive information processes is wasteful, when the computer can do that. So give those tasks to the computers.
And yet, for some people the problem with software isn’t a lack of automation but a lack of discipline. Software would be better if only people knew the rules, honoured them, and slowed themselves down so that instead of cutting corners they just chose to ignore important business milestones. Back in my day, everybody knew “no Markdown around town” and “don’t code in an IDE after Labour Day”, but now the kids do whatever they want. The motivations seem different, and I’d like to sort them out.
Let’s start with hazing. A lot of the software industry suffers from “I had to go through this, you should too”. Look at software engineering interviews, for example. I’m not sure whether anybody actually believes “I had to deal with carefully ensuring NUL-termination to avoid buffer overrun errors so you should too”, but I do occasionally still hear people telling less-experienced developers that they should learn C to learn more about how their computer works. Your computer is not a fast PDP-11; all you will learn is how the C virtual machine works.
Just as Real Men Don’t Eat Quiche, so real programmers don’t use Pascal. Real Programmers use FORTRAN. This motivation for sorting the disciplined from the rabble is based on the idea that if it isn’t at least as hard as it was when I did it, it isn’t hard enough. And that means that the goalposts are movable, based on the orator’s experience.
But along with the length of experience goes its breadth. You see, the person who learned reference counting in 1995 and thinks that you can only really understand programming if you manually type out your own reference-changing events presumably didn’t go on to use garbage collection in Java in 1996. The person who thinks you can only really write correct software if every case is accompanied by a unit test presumably didn’t learn Eiffel. The person who thinks that you can only really design systems if you use the Haskell type system may not have tried OCaml. And so on.
The conclusion is that for this variety of disciplinarian, the appropriate character and quantity of discipline is whatever they had to deal with at some specific point in their career. Probably a high point: after they’d got over the tricky bits and got productive, and after you kids came along and ruined everything.
Sometimes the reason for suggesting the disciplined approach is entomological in nature, as in the case of the eusocial insect the “performant”, which, while not a real word, exists in greater quantities in older software than in newer software, apparently. The performant is capable of making software faster, less memory-hungry, more concurrent, or less dependent on I/O: the specific characteristics of the performant depend heavily on context.
The performant is often not talked about in the same sentences as its usual companion species, the irrelevant. Yes, there may be opportunities to shave a few percent off the runtime of that algorithm by switching from the automatic tool to the manual, disciplined approach, but does that matter (yet, or at all)? There are software-construction domains where specific performance characteristics are desirable, indeed that’s true across a lot of software. But it’s typical to focus performance-enhancing techniques on the bits where they enhance performance that needs enhancing, not to adopt them across the whole system on the basis that it was better when everyone worked this way. You might save a few hundred cycles writing native software instead of using a VM for that UI method, but if it’s going to run after a network request completes over EDGE then trigger a 1/3s animation, nobody will notice the improvement.
Anyway, whatever the source, the problem with calls for discipline is that there’s no strong motivation to become more disciplined. I can use these tools, and my customer is this much satisfied, and my employer pays me this much. Or I can learn from you how I’m supposed to be doing it, which will slow me down, for…your satisfaction? So you know I’m doing it the way it’s supposed to be done? Or so that I can tell everyone else that they’re doing it wrong, too? Sounds like a great deal.
Therefore discipline doesn’t scale. Whenever you ask people to slow down and think harder about what they’re doing, some fraction of them will. Some will wonder whether there’s some other way to get what you’re peddling, and may find it. Some more will not pay any attention. The dangerous ones are the ones who thought they were paying attention and yet still end up not doing the disciplined thing you asked for: they either torpedo your whole idea or turn it into not doing the thing (see OOP, Agile, Functional Programming). And still more people, by far the vast majority, just weren’t listening at all, and you’ll never reach them.
Let’s flip this around. Let’s look at where we need to be disciplined, and ask if there are gaps in the tool support for software engineers. Some people want us to always write a failing test and make it pass before adding any code (or want us to write a passing test and revert our changes if it accidentally fails): does that mean our tools should not let us write code for which there’s no test? Does the same apply for acceptance tests? Some want us to refactor mercilessly; does that mean our design tools should always propose more parsimonious alternatives for passing the same tests? Some say we should get into the discipline of writing code that always reveals its intent: should the tools make a crack at interpreting the intention of the code-as-prose?
As a proponent of discipline, I see things a bit differently. You’re coming at it from the perspective of “In software, we automated X, and now X is no longer a problem, therefore, we should just keep automating as far as we can”. That’s one pattern.
I’m seeing from the perspective of: “There’s no way to automate everything, and professionals always need discipline, so let’s start with that, and then see what else can/should be automated”. My pattern is from looking at the history of every other industry in the world.
Discipline is the rule, not the exception. When cars crash, we test and license people, and issue citations for people who visibly break the rules, and increase driver education. We don’t ban manual transmissions (not enough automation!), or install breathalyzer interlocks in every vehicle. Self-driving cars may solve our remaining problems, someday, but that’s a long-term solution.
Similarly, it may be possible in software to automate our way out of many classes of bugs (I’m skeptical), but that’s a long-term solution. Discipline is something we can adopt today.
Two points you didn’t really seem to address:
First, there’s diminishing returns on technology as preventative measure. Manual refcounting is way better than malloc/free. Automatic refcounting is quite a bit better than manual refcounting. Tracing GC would be a little better than that. Every problem is like this. When you’re down at the “malloc/free” level, advocating for discipline seems silly. (Why not just fix the bad tech first? It’s not hard to improve on.) But once you’re up to about the ARC level, discipline is clearly worthwhile. You don’t need to wait for a perfect tracing GC to advocate for social solutions.
When there were only 10 cars in the whole state and no roads, licensing and education were probably not high on anyone’s list. But once you’ve got an Interstate Highway System, you need licensing, and education, and laws against drunk driving. Nobody in 1965 said “People are dying in car crashes — we’ve got to automate this by developing self-driving cars ASAP!” You look at the limiting factors. At some point, more technology is a huge cost, in time and money, and a social solution would be cheaper and more effective.
Second, you seem to be using “discipline” to mean only an internal personal strength of will, but that’s not really what the word means. Yes, we know that relying on willpower alone doesn’t scale. The goals are quality and accountability, and in almost every other field, we accomplish this through laws, regulations, certifications, trade groups, professional associations, and so on. Those do work. Check your dictionary: the definition of the word “discipline” includes rules and punishment!
Your surgeon is not performing a splenectomy while drunk, but that’s not because there’s a breathalyzer installed on the door of every operating room. We didn’t use technology to solve that problem. We used training, regulation, and (severe) penalties.
That’s what software development needs today. We can’t keep pretending that more automation is going to solve everything. You’re not going to find a new “GC” level advancement every year, because there aren’t dumb inefficiencies like “malloc/free” still sitting around. Once the low-hanging fruit of basic automation have run out, there’s no substitute for professional discipline.
So sad that this still needs to be explained. By allowing discipline to be replaced by technology, you 1) avoid the ever-present gaps in discipline, and 2) free up creative capacity.
“1) everything that’s already in the world when you’re born is just normal; 2) anything that gets invented between then and before you turn thirty is incredibly exciting… 3) anything that gets invented after you’re thirty is against the natural order of things.” – Douglas Adams
I agree that discipline doesn’t just mean willpower, though with the proviso that “a discipline” (e.g. plumbing or surgery) is one because you must maintain the discipline of its accepted best practices to be admitted into its membership.
I pick plumbing and surgery because they both have two distinctions with regard to software engineering: first, they are old. And they were both fatal until relatively recently, too. Plumbing is named after lead (Latin plumbum), the soluble toxin that plumbers until very recently (say, the 20th century) made drinking-water delivery systems out of. Surgery only recently (say, the 20th century) developed _and propagated_ the techniques that reliably ensure you’re more likely to die without surgery than with it. Both are millennia older than software engineering, which is merely young and fatal.
Second, both have gatekeeping. Discipline doesn’t scale, so you bar all those who aren’t willing to do things the way that’s become accepted. Malpractice isn’t about discovering that the outcome was bad, it’s about discovering that the outcome was not reached by following good practice.
You invoke the dictionary, so of course I invoke the etymological fallacy and point out that “a discipline” and “discipline” both come from the idea of learning, a “discipulus” being a student. We’ve got a lot of teaching in software engineering-adjacent fields like computer science and informatics, but not really in creating software. Whether or not it would be a good idea to have a protected profession, a guild system, or whatever barrier to entry, we don’t have that, and any attempt to define it would be met by uproar from the very practitioners who we are saying should follow the defined practices. “Software is too new,” they would cry, “to require that we all do it the same way. We’d end up with something lame like Java or OOP or MVC or UML, without the art and creative freedom we need to do our jobs well.” No, the way to become a software engineer is to contribute to the construction of software, and the way to become an experienced software engineer is to do more of that.
You can’t possibly make everybody think the same way, or to the same level, or even sufficiently, about all of the computer-related stuff that goes into software. But you can take that mental effort away from them, and leave them to think about other things like whether the customer really wants or needs this particular software.
Mate, thanks for this article. Discipline is hard to describe, and discipline encompasses many facets. I agree with your article. Developers can use discipline as an excuse to deliver slowly – even to the detriment of a business. Look at what you’re doing in relation to business needs. And be disciplined about delivering what is needed. Not what you want or what feels good.
I’m sympathetic to this and I think it has insight into human nature. I feel like my professional life (Mac consultant) is uniquely informed by my having coded in BASIC and 6502 assembly on an Apple II from roughly the eighth through the twentieth years of my life. That experience, and walking through the rest of the personal computing era, allows me to feel “closer” to the modern computer I’m working on, since everything that computers do today just feels like an outgrowth of those bits-on-and-off RAM-ROM-storage fundamentals. And so I feel like my instincts are sharper, more primal, if you will, because I have that basis.
But do I think that someone who is 25 can’t be a great consultant or developer, despite not having worked with computers at a less abstracted, more elemental level, at any point in their lives? I don’t, at all. The most important aspects of being a creative technical professional are resourcefulness, judiciousness, and, most critically, curiosity. With curiosity, you’ll dive deeply where you are stimulated, and where it is meaningful to your work. If that takes you to a more fundamental place than your normal level of abstraction, great. If it takes you to somewhere with greater abstraction than where you currently are, also great.
The “discipline” of which you speak is not having to understand how computers work in any particular way. To this day, I get software, but not hardware. I don’t understand the relationship of electrons to bits. I’ve tried, but it doesn’t stick. CPUs are just fancy calculators, to me, and calculators are just made of magic, and nothing else. I’m sure it’s heresy to some who came of age in the same era that I did, or before it, but that hasn’t kept me from being among the top in my field.
The discipline one must have is working with consistency, caring about what you do and its quality, tactically indulging curiosity, and continually renewing a relentless commitment to improving. Everyone will define the specifics of those on their own terms.
What I do know is that you do not need to know how to build a machine from chips, or how to build chips from whatever chips are made from, to succeed as a developer or other technical professional. Whatever one’s own baseline of technological abstraction, knowing when and where to dive beneath it or float above it is the key. The abstraction baseline itself is irrelevant.