Programmer Values

A question and answer exchange over at programmers.stackexchange.com reveals something interesting about how software is valued. The question asked whether there is any real-world data regarding the costs and benefits of test-driven development.[*] One of the answers contained, at the time of writing, the anthropologist’s money shot:

The first thing that needs to be stated is that TDD does not necessarily increase the quality of the software (from the user’s point of view). […] TDD is done primarily because it results in better code. More specifically, TDD results in code that is easier to change. [Emphasis original]

Programmers are contracted, via whatever means, by people who see quality in one way: presumably that quality is embodied in software they can use to do something they want to do. Maybe it’s safer to say that the person who provided this answer believes that their customers value quality in software in that way, rather than make an assumption on everybody’s behalf.

This answer demonstrates something the author believed, and evidently thought uncontentious enough to post without defence in a popularity-driven forum: that programmers value attributes of the code that are orthogonal to the values of the people who pay them. One could imagine programmers making changes to some software that have no effect, or even a negative effect, as far as their customers are concerned, because the changes have a positive effect in the minds of the programmers. This issue is also mentioned in one of the other answers to the question:

The problem with developers is they tend to implement even things that are not required to make the software as generic as possible.

The obvious conclusion is that the quality of software is normative. There is no objectively good or bad software, and you cannot discuss quality independent of the value system that you bring to the evaluation.

The less-obvious conclusion is that some form of reconciliation is still necessary: that management has not become redundant despite the discussions of self-organised teams in the Agile development community. Someone needs to mediate between the desire of the people who need the software to get something that satisfies their norms regarding quality software, and the desire of the people who make the software to produce something that satisfies their norms instead. Whether this is by aligning the two value systems, by ignoring one of them or by ensuring that the project enables attributes from both value systems to be satisfied is left as an exercise for the reader.

[*] There is at least one relevant study. No, you might not think it relevant to your work: that’s fine.

Know what counts

In Make it Count, Harry Roberts describes blacking out on stage at the end of a busy and sleepless week. Ironically, he was at the start of a talk in which he was to discuss being selective over side projects, choosing only those that you can actually “cash in” on and use to advance your career.

If you’re going to take on side projects and speaking and writing and open source and suchlike then please, make them fucking count. Do not run yourself into the ground working on ‘career moves’ if you’re not going to cash in on them. [emphasis original]

Obviously working until you collapse is not healthy. At that point, choosing which projects to accept is less important than just getting some damned sleep and putting your health back in order. In the 1950s, psychologist Abraham Maslow identified a “hierarchy of needs”; sleep sits at the base of that hierarchy, meaning that, along with eating and drinking, you should take care of it before worrying about self-actualisation or esteem in the eyes of your peers.

[Image: Maslow’s hierarchy of needs, from Wikipedia]

Here’s the little secret they don’t tell you in the hiring interview at Silicon Valley start-ups: you’re allowed to do things that aren’t career-centric. This includes, but is not limited to, sleeping, drinking enough water, eating non-pizza foodstuffs, having fun, seeing friends, taking breaks, and indulging in hobbies. It sometimes seems that programmers are locked in an arms race to see who can burn out first^W^W^Wdo more work than the others. That’s a short-term, economist-style view of work. I explained in APPropriate Behaviour that economists take things they don’t want to consider, or can’t work out a dollar value for, and call them “externalities” that lie outside the system.

Your health should not be an externality. Roberts attempted to internalise the “accounting” for all of his side projects by relating them in value to his career position. If you’re unhealthy, your career will suffer. So will the rest of your life. Don’t externalise your health. Worry not whether what you’re doing is good for your position in the developer community, but whether it’s good for you as a healthy individual. If you’ve got the basic things like food, shelter, sleep and safety, then validation in the eyes of yourself and your peers can follow.

The future will be just like the past, right?

I’ve been having a bit of a retro programming session:

[Image: Z88 adventure game]

The computer in the photo is a Cambridge Z88, and it won’t surprise you to know that I’ve owned it for years. However, it’s far from my first computer.

I was born less than a month before the broadcast of The Computer Programme, the television show that brought computers into many people’s living rooms for the first time. This was the dawn of the personal computer era. The Computer Programme was shown at the beginning of 1982: by the end of that year the Commodore VIC-20 had become the first computer platform ever to sell more than one million units.

My father being an early adopter (he’d already used a Commodore PET at work), we had a brand new Dragon 32 computer before I was a year old. There’s not much point doing the “hilarious” comparisons of its memory capacity and processor speed with today’s computers: the social systems into which micros were inserted and the applications to which they were put render most such comparisons meaningless.

In 1982, computers were seen by many people as the large cupboards at the back of James Bond film sets. They just didn’t exist for a majority of people in the UK, the US or anywhere else. The micros that supposedly revolutionised home life were, for the most part, useful to hobbyists finding out how computers worked. Spreadsheets like VisiCalc might already have been somewhat popular in the business world, but anyone willing to spend $2000 on an Apple ][ and VisiCalc probably wasn’t the sort of person about to diligently organise their home finances.

Without being able to sell their computers on world-changing applications, many manufacturers were concerned about cost and designed their computers down to a price. The Register’s vintage hardware section has retrospectives on many of the microcomputer platforms from the early 1980s, many of which tell this tale. (Those that don’t instead tell a tale of chasing time to market, and of running out of money.) The microprocessors were all originally controllers for disk drives and other peripherals in “real” computers, repurposed as the CPUs of the micro platforms. Sinclair famously made up part of the ZX Spectrum’s 48kB of RAM from partially faulty 64Kbit chips, to get a good price from the supplier.

So the manufacturers were able to make the hardware cheap enough that people would buy computers out of interest, but what would buyers then make of them? We can probably tell quite a lot by examining the media directed at home computer users. Start with The Computer Programme, which we have already met at the beginning of the post. What you have is Ian “Mac” McNaught-Davis, positioned at the beginning of episode 1 as a “high priest” of the mainframe computer, acting as the Doctor to Chris Serle’s bemused and slightly apprehensive assistant. Serle is the perfectly ordinary man on the perfectly ordinary street, expressing (on behalf of us, the perfectly ordinary public) amazement at how a computer can turn a perfectly ordinary television set and a perfectly ordinary domestic cassette recorder into something that’s able to print poorly-defined characters onto perfectly ordinary paper.

During his perfectly ordinary tenure of ten episodes, Serle is taught to program in BBC BASIC by McNaught-Davis. In the first episode he demonstrates a fear of touching anything, confirming the spelling of every word (“list? L-I-S-T?”) he’s asked to type. If the computer requires him to press Return, he won’t do it until instructed by McNaught-Davis (thus making January 11, 1982 the first ever outing of The Return of the Mac). By the end of the series, Serle is able to get on a bit more autonomously, suggesting to Mac what the programs mean (“If temperature is more than 25 degrees, I would assume…”).

Chris Serle suffered his way through nine weeks of BASIC tuition because there was no other choice for a freelance journalist to get any use out of a personal computer. Maybe as many as 8,000 hipster programmers would opt for a Jupiter Ace and the FORTH language, but for normal people it was BASIC or nothing. Even loading a game required typing the correct incantation into the BASIC prompt. Feedback was minimal because there wasn’t a lot of ROM in which to store the error messages: “Subscript at line 100” or even the Dragon’s “?BS ERROR” might be all you’re told about an error. If you didn’t have a handy McNaught-Davis around (perhaps the first user-friendly Mac in the computer field) you could easily lose ages working out what the computer thought was BS about your code.

Typing errors multiplied when using the common application distribution platform of the day: the printed magazine. Much software was distributed as “type-ins”, often split over two (monthly) issues of a magazine: the program presented in buggy form in one edition, with errata supplied in the next. When you typed in not one LOAD command but a few hundred lines of BASIC, only to find that your database program didn’t work as expected, you first had a tedious proof-reading task ahead of you to check that you’d typed it without error. If you had, and it still didn’t work, then out came the pencil and paper as you tried to work out what mistakes were in the listing.

Microcomputers represented seriously constrained hardware with limited application. The ability to get anything done was hampered by the primary interface being an error-prone, cryptic programming language. While the syntax of this language was hailed as simpler than many alternatives, it did nothing to smooth over or provide a soft landing for complex underlying concepts.

I’m willing to subject myself to those trials and terrors for the purpose of nostalgia. There are other people, though, who want to revert to this impression of computers as a way to get young people interested in programming. The TinyBASIC for Raspberry Pi announcement enthuses:

we’ve also had a really surprising number of emails from parents who haven’t done any programming since school, but who still have books on BASIC from when they were kids, remember enjoying computing lessons, and want to share some of what they used to do with their kids. It’s actually a great way to get kids started, especially if you have some enthusiasm of your own to share: enthusiasm’s contagious.

Undoubtedly there are some genuine, remembered benefits to programming on these platforms, which modern computer tuition could learn from. There was, as discussed above, no hurdle to jump to get into the programming environment. Try teaching any programming language on widely-available computing platforms today, and you’ve got to spend a while discussing what versions of what software are needed, differences between platforms, installation and so on. Almost anyone with a microcomputer could switch it on and start typing in BASIC code that would, if restricted to a limited subset of commands, work on whatever machine they’d bought.

The cost of a “tyre-kicking” setup was modest, particularly as you could use your own TV and cassette deck (assuming you had them). Unlike many modern platforms, there was no need to have two computers tethered to program on one and run on the other, and no developer tithe to pay to the platform vendors. Where they were error-free and well documented, the type-ins gave you actually working applications that you could tweak and investigate. Such starting points are better for some learners than a blank screen and a blinking prompt.

Complete applications though these type-ins may have been, they would not satisfy the expectations of modern computer-using learners. There’s an important difference: people today have already used computers. They’re no longer magical wonder-boxes that can make a TV screen flash blue and yellow if you get the numbers correct in a PAPER command. People know what to expect from a laptop, tablet or smartphone: being able to print an endless march of RUMBELOWS IS SHIT to the screen is no longer sufficient to retain interest.

It’s not just the users of computers, nor the uses of computers, that have moved on in the last three decades. Teaching has evolved, too. There should probably be a name for the fallacy that assumes that however I was taught things is however everybody else should be taught them. A modern curriculum for novice programmers should reflect not only the technological and social changes in computing in the last thirty years, but also the educational changes. It should borrow from the positives of microcomputer programming courses, but not at the expense of throwing out a generation of evolution.

There are certainly things we can learn from the way microcomputers inspired a generation of programmers. There’s a place for ultra-cheap computers like the Raspberry Pi in modern computing pedagogy. But it would be a mistake to assume that if I gave a child my copy of “Super-Charge Your Spectrum”, that child would learn as much and be as enthused about programming as my rose-tinted model of my younger self apparently was.

Are you an [X] programmer?

In my Twitter bio, I describe myself as:

a Lovelacologist for portable transactators

which is, in keeping with the way I’m dressed in the avatar pic, a steampunk way of saying that I’m a programmer of mobile computers. But is that strictly true, or fair? It’s what I’ve spent most of the last couple of years doing, but then I’ve also worked on:

  • web servers
  • SMPP servers
  • one particle accelerator
  • workstation apps
  • desktop apps
  • administration scripts
  • books

and there are bound to be more things that I haven’t remembered. I don’t think I’m alone in picking quite a narrow definition to expose as “me” (though maybe I should have thought a bit harder before titling this blog). Social scientists refer to this as “doing identity work”: the effort we go to in controlling the definition of who we are in our interactions with others. To confirm that I’m not alone in this narrow identity work, here’s a not-quite-random look at excerpts from a few other Twitter bios (anonymised for no real reason):

  • iOS, OS X, BMWs, photography, and food.
  • App developer by day – Apple fanboy by night
  • now a Clojure and Ruby programmer
  • iOS Developer

It’s interesting that while we choose these restricted “brands” for ourselves, we actually spend a lot of time solving the same problems. I’ve been working on another web app project lately, and it’s remarkably similar to building a mobile app. Even a lot of the constraints are similar:

  • keep the event loop fast
  • avoid loading lots of large data files
  • maintain separation of concerns between modules
  • try to minimise power consumption

and indeed the solutions turn out to be similar too. The command bus introduced in an earlier post, which some readers tell me has informed their own mobile apps, was actually built for this web app project. The problems, and the solutions, are interchangeable; a rough sketch of the bus’s shape follows.
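
The original post’s code isn’t reproduced here, so what follows is only a minimal sketch of the kind of command bus being described, with hypothetical names: the essence is a command protocol and a serial queue that keeps the work off the event loop.

#import <Foundation/Foundation.h>

// A command is just an object that knows how to do one unit of work.
@protocol Command <NSObject>
- (void)execute;
@end

// The bus accepts commands and runs them, in order, off the main thread,
// so the UI event loop (or the web server's request loop) stays fast.
@interface CommandBus : NSObject
- (void)dispatch:(id <Command>)command;
@end

@implementation CommandBus
{
    NSOperationQueue *_queue;
}

- (instancetype)init
{
    if (self = [super init]) {
        _queue = [[NSOperationQueue alloc] init];
        _queue.maxConcurrentOperationCount = 1; // commands run one at a time
    }
    return self;
}

- (void)dispatch:(id <Command>)command
{
    [_queue addOperationWithBlock:^{ [command execute]; }];
}

@end

Nothing in that sketch cares whether the commands come from a touch event or an HTTP request, which is rather the point.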

What we need is more of this interchangeability. Rather than waiting for a mobile person to say in a mobile way how to write mobile software, we can take advantage of what people have already said in a software way about how to write software. I have resolved to widen my horizons, and pay broader attention to what my colleagues are up to.

APPropriate Behaviour is complete!

APPropriate Behaviour, the book on things programmers do that aren’t programming, is now complete! The final chapter – a philosophy of software making – has been added, concluding the book.

Just because it’s complete, doesn’t mean it’s finished: as my understanding of what we do develops I’ll probably want to correct things, or add new anecdotes or ideas. Readers of the book automatically get free updates whenever I create them in the future, so I hope that this is a book that grows with us.

As ever, the introduction to the book has instructions on joining the book’s Glassboard to discuss the content or omissions from the content. I look forward to reading what you have to say about the book in the Glassboard.

While the recommended purchase price of APPropriate Behaviour is $20, the minimum price now that it’s complete is just $10. Looking at the prices paid by the 107 readers who bought it while it was still being written, $10 is below both the median price (most readers chose to pay more than $10) and the modal price (the most common price chosen was higher than $10).

A little about writing the book: I had created the outline of the book last summer, while thinking about the things I believed should’ve been mentioned in Code Complete but were missing. I finally decided toward the end of the year that it actually deserved to be written, and used National Novel Writing Month as an excuse to start on the draft. A sizeable portion of the draft typescript was created in that month; enough to upload to LeanPub and start getting feedback from early readers. I really appreciate the help and input that those early readers, along with other people I’ve discussed the material with, have given both in preparing APPropriate Behaviour and in understanding my career and our industry.

Over the next few months, I tidied up that first draft, added new chapters, and extended the existing material. The end result – the 11th release including that first draft – is 141 pages of reflection over the decade in which I’ve been paid to make software: not a long time, but still nearly 15% of the sector’s total lifespan. I invite you to grab a copy from LeanPub and share in my reflections on that decade, and consider what should happen in the next.

Happy Birthday, Objective-C!

OK, I have to admit that I actually missed the party. Brad Cox first described his “Object-Oriented pre-compiler”, OOPC, in the January 1983 issue of ACM SIGPLAN Notices.

This describes the Object Oriented Pre-Compiler, OOPC, a language and a run-time library for producing C programs that operate by the run-time conventions of Smalltalk 80 in a UNIX environment. These languages offer Object Oriented Programming in which data, and the programs which may access it, are designed, built and maintained as inseparable units called objects.

Notice that the abstract has to explain what OOP is: these were early days for objects, at least as far as the commercial software industry was concerned. Reading the OOPC paper, you can tell that this is the start of what became known as Objective-C. It has a special syntax for sending Smalltalk-style messages to objects identified by pointers to structures, though not the syntax you’ll be used to:

someObject = {|Object, "new"|};
{|myArray, "addObject:", someObject|};

The infix notation [myArray addObject:someObject]; came later, but by 1986 Cox had published the first edition of Object-Oriented Programming: An Evolutionary Approach and co-founded Productivity Products International (later Stepstone) to capitalise on the Objective-C language. I’ve talked about the version of ObjC described in this book in this post, and the business context of this in Software ICs and a component marketplace.
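
To get a sense of what “producing C programs” amounts to, note that a message send lowers to a plain C call into the runtime library, which performs the Smalltalk-style method lookup at run time. The line below uses the modern runtime’s function and the cast you would write by hand today; OOPC’s generated C would have called its own, analogous, dispatch routine:

#import <objc/message.h>

// Roughly what [myArray addObject:someObject] becomes: a C function call
// into the runtime, which looks the method up by selector at run time.
((void (*)(id, SEL, id))objc_msgSend)(myArray, @selector(addObject:), someObject);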

It’s this version of Objective-C, not OOPC, that NeXT licensed from PPI as the basis of the Nextstep API (as distinct from the NEXTSTEP operating system: UNIX is case sensitive, you know). They built the language into a fork of the GNU Compiler Collection, and due to the nature of copyleft this meant they had to make their adaptations available, so GCC on other platforms gained Objective-C too.

Along the way, NeXT added some features to the language: compiler-generated static instances of string classes, for example. They added protocols: I recorded an episode of NSBrief with Saul Mora discussing how protocols were originally used to support distributed objects, but became important design tools. This transformation was particularly accelerated by Java’s adoption of protocols as interfaces. At some (as far as I can tell, not well documented) point in its life, Stepstone sold the rights to ObjC to NeXT, then licensed it back so they could continue supporting their own compiler.
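
To illustrate the design-tool use (the names below are invented for the example, not drawn from any Apple framework): a protocol lets a class state what it needs from a collaborator without caring which class provides it.

#import <Foundation/Foundation.h>

// Any object that can answer these two messages can feed the list;
// the presenter never needs to know its data source's class.
@protocol ListDataSource <NSObject>
- (NSUInteger)numberOfRows;
- (NSString *)titleForRow:(NSUInteger)row;
@end

@interface ListPresenter : NSObject
- (instancetype)initWithDataSource:(id <ListDataSource>)dataSource;
@end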

There isn’t a great deal of change to Objective-C from 1994 for about a decade, despite or perhaps due to the change of stewardship in 1996/1997 as NeXT was purchased by Apple. Then, in about 2003, Apple introduced language-level support for exceptions and critical sections. In 2007, “Objective-C 2.0” was released, adding a collection enumeration syntax, properties, garbage collection and some changes to the runtime library. Blocks—a system for supporting closures that had been present in Smalltalk but missing from Objective-C—were added in a later release that briefly enjoyed the name “Objective-C 2.1”, though I don’t think that survived into the public release. To my knowledge 2.0 is the only version designation any Apple release of Objective-C has had.
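
Put together, and with illustrative names rather than anything from a real project, the 2.0-era syntax looks like this:

#import <Foundation/Foundation.h>

@interface Playlist : NSObject
@property (nonatomic, copy) NSArray *tracks;   // a declared property
@end

@implementation Playlist
@end

int main(void)
{
    @autoreleasepool {
        Playlist *playlist = [Playlist new];
        playlist.tracks = [NSArray arrayWithObjects:@"One", @"Two", nil];

        // collection (fast) enumeration
        for (NSString *track in playlist.tracks) {
            NSLog(@"%@", track);
        }

        // a block, stored in a variable and called like a function
        void (^announce)(NSString *) = ^(NSString *track) {
            NSLog(@"Now playing: %@", track);
        };
        announce([playlist.tracks objectAtIndex:0]);
    }
    return 0;
}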

Eventually, Apple observed that the AutoZone garbage collector was inappropriate for the kind of software they wanted Objective-C programmers to be making, and incorporated reference-counted memory management from their (initially NeXT’s) object libraries into the language to enable Automatic Reference Counting.
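
A minimal sketch, with made-up names, of what that change means at the point of use: the ownership rules stay those of the existing libraries, but the compiler now writes the calls.

#import <Foundation/Foundation.h>

// Illustrative names only. "strong" tells ARC to retain whatever is
// assigned to the property and release the value it replaces; those are
// exactly the calls the libraries' ownership conventions used to demand
// that you write by hand.
@interface Track : NSObject
@property (nonatomic, strong) NSString *title;
@end

@implementation Track
// No retain/release in hand-written setters and no -dealloc releasing
// _title: the compiler inserts the reference-counting calls itself.
@end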

And that’s where we are now! But what about Dr. Cox? Stepstone’s business was not the Objective-C language itself, but software components, including ICPak101, ICPak201 and the TaskMaster environment for building applications out of objects. It turned out that the way they wanted to sell object frameworks (viz. in a profitable way) was not the way people wanted to buy object frameworks (viz. not at all). Cox turned his attention to Digital Rights Management, and to warming up the marketplace to accept pay-per-use licensing of digital artefacts. He’s since worked on teaching object-oriented programming, enterprise architecture and other things; his blog is still active.

So, Objective-C, I belatedly raise my glass to you. You’re nearly as old as I am, and that’s never likely to change. But we’ve both grown over that time, and it’s been fun growing up with you.

Does the history of making software exist?

A bit of a repeated theme in the construction of APPropriate Behaviour has been that I’ve tried to position certain terms or concepts in their historical context, and found it difficult, or even impossible, to do so with sufficient rigour. There’s an extent to which I don’t want the book to become historiographical, so I’ve avoided going too deep into that angle; but I’ve discovered either that no-one else has gone into it, or that if they have, I can’t find their work.

What often happens is that I can find a history or even many histories, but that these aren’t reliable. I already wrote in the last post on this blog about the difficulties in interpreting references to the 1968 NATO conference; well today I read another two sources that have another two descriptions of the conference and how it kicked off the software crisis. Articles like that linked in the above post help to defuse some of the myths and partisan histories, but only in very specific domains such as the term “software crisis”.

Occasionally I discover a history that has been completely falsified, such as the great sequence of research papers that “prove” that some programmers are ten (or 25, or 1000) times more productive than others, or those that “prove” that bugs cost 100x more to fix in maintenance. Again, it’s possible to find specific deconstructions of these memes (mainly by reading Laurent Bossavit), but having discovered that the emperor is naked, we have no replacement garments with which to clothe him.

There are a very few subjects where I think the primary and secondary literature necessary to construct a history exists, but where I lack the expertise or, frankly, the patience to pursue it. For example, you could write a history of the phrase “software engineering”, and how it was introduced to suggest a professionalism beyond the craft discipline that went before it, only to become a symbol of lumbering lethargy among adherents of the craft discipline that came after it. Such a thing might take a trained historian armed with a good set of library cards a few years to complete (the book The Computer Boys Take Over covers part of this story, though it is written for the lay reader and not the software builder). But what of more technical ideas? Where is the history of “Object-Oriented”? Does that phrase mean the same thing in 2013 as in 1983? Does it even mean the same thing to different people in 2013?

Of course there is no such thing as an objective history. A history is an interpretation of a collection of sources, which are themselves interpretations drawn from biased or otherwise limited fonts of knowledge. The thing about a good history is that it lets you see what’s behind the curtain. The sources used will all be listed, so you can decide whether they lead you to the same conclusions as the author. It concerns me that we either don’t have, or I don’t have access to, resources allowing us to situate what we’re trying to do today in the narrative of everything that has gone before and will go hence. That we operate in a field full of hype and innuendo, and lack the tools to detect Humpty Dumptyism and other revisionist rhetoric.

With all that said, are the histories of the software industry out there? I don’t mean the collectors like the museums, who do an important job but not the one I’m interested in here. I mean the histories that help us understand our own work. Do degrees in computer science, to the extent they consider “real world” software making at all, teach the history of the discipline? Not the “assemblers were invented in 1949 and the first binary tree was coded in 19xy” history, but the rise and fall of various techniques, fads, disciplines, and so on? Or have I just volunteered for another crazy project?

I hope not: I haven’t got a good track record of remembering my library cards. Answers on a tweet, please.