An Imagined History of Object-Oriented Programming

Having looked at some hopefully modern views on Object-Oriented analysis and design, it’s time to look at what happened to Object-Oriented Programming. This is an opinionated, ideologically motivated history that in no way reflects reality: a real history of OOP would require time and skills that I lack, and would undoubtedly be almost as inaccurate. But in telling this version we get to learn more about the subject, and almost certainly more about the author too.
They always say that history is written by the victors, and it’s hard to argue that OOP was anything other than victorious. When people explain how they prefer to write functional programs because it helps them to “reason about” code, all the reasoning that was done about the code on which they depend was done in terms of objects. The ramda or lodash or Elm-using functional programmer writes functions in JavaScript on an engine written in C++. Swift Combine uses a functional pipeline to glue UIKit objects to Core Data objects, all in an Objective-C IDE with – again – a C++ compiler.

Maybe there’s something in that. Maybe the functional thing works well if you’re transforming data from one system to another, and our current phase of VC-backed disruption needs that. Perhaps we’re at the expansion phase, applying the existing systems to broader domains, and a later consolidation or contraction phase will demand yet another paradigm.
Anyway, Object-Oriented Programming famously (and incorrectly, remember) grew out of the first phase of functional programming: the one that arose when it wasn’t even clear whether computers existed or, if they did, whether they could be made fast enough or complex enough to evaluate a function. Smalltalk may have borrowed a few ideas from Simula, but it spoke with a distinct Lisp.

We’ll fast-forward through that interesting bit of history, when all the research happened, to that boring bit where all the development happened. The Xerox team famously diluted the purity of their vision in the hot-air balloon issue of Byte magazine, and a whole complex of consultants, trainers and methodologists jumped in to turn a system learnable by children into one that couldn’t be mastered by professional programmers.
Actually, that’s not fair: the system already couldn’t be mastered by professional programmers, a breed famous for assuming that they are correct and that any evidence to the contrary is flawed. It was designed to be learnable by children, not by those who think they already know better.

The result was the ramda/lodash/Elm/Clojure/F# of OOP: tools that let you tell your local user group that you’ve adopted this whole Objects thing without, y’know, having to do that. Languages called Object-*, Object*, Objective-*, O* added keywords like “class” to existing programming languages so you could carry on writing software as you already had been, but maybe change the word you used to declare modules.
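
To make that concrete, here is a hypothetical sketch (invented names, not drawn from any real codebase) of the kind of “object-oriented” code those languages invited: the C module you were already writing, with a class keyword wrapped around it, and no instances, no encapsulation, no messages.

    /* A hypothetical example of "C with a class keyword": the class is
       just a new spelling for the module, and no object is ever created. */
    #include <cstdio>

    class PayrollModule {
    public:
        /* Exactly the C function we already had, now with extra ceremony. */
        static double calculateTax(double gross, double rate) {
            return gross * rate;
        }

        static void printPayslip(const char *name, double gross, double rate) {
            std::printf("%s: gross %.2f, tax %.2f\n",
                        name, gross, calculateTax(gross, rate));
        }
    };

    int main() {
        /* No instantiation, no polymorphism, no messages: just procedure
           calls with a longer prefix than they had before. */
        PayrollModule::printPayslip("Ada", 2000.0, 0.2);
        return 0;
    }

Rename the class to a namespace and nothing changes, which is rather the point.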

Eventually, the jig was up, and people cottoned on to the observation that Object-Intercal is just Intercal no matter how you spell come.from(). So the next step was to change the naming scheme to make it a little more opaque. C++ is just C with Classes. So is Java; so is C#. Visual Basic .NET is little better.

Meanwhile, some people who had been using Smalltalk and getting on well with quickly developing prototypes that they could edit, while running, into a deployable system had an idea. Why not tell everyone else how great it is to develop prototypes fast and edit them, while running, into the deployable system? The full story of that will have to wait for the Imagined History of Agile, but the TL;DR is that whatever they said, everybody heard “carry on doing what we’re already doing, but plus Jira”.

Well, that’s what they heard about the practices. What they heard about the principles was “Principles? We don’t need no stinking principles; that sounds like Big Thinking Up Front, urgh”, so they decided to stop thinking about anything as long as the next two weeks of work would be paid for. Yes, iterative, incremental programming introduced the idea of a project the same length as the gap between paycheques, thus paving the way for fire and rehire.

And thus we arrive at the ideological void of today’s computering. A phase in which what you don’t do is more important than what you do: #NoEstimates, #NoSQL, #NoProject, #NoManagers…#NoAdvances.

Something will fill that void. It won’t be soon. Functional programming is a loose collection of academic ideas with negation principles – #NoSideEffects and #NoMutableState – but it doesn’t encourage anything new. As I said earlier, it may be that we don’t need anything new at the moment: there’s enough money in applying the old things to new businesses and funnelling more money to the cloud providers.

But presumably that will end soon. The promised and perpetually interrupted parallel computing paradigm we were guaranteed upon the death of Moore’s Law in the (1990s, 2000s, 2010s) will eventually meet the observation that every object larger than a grain of rice has a decent ARM CPU in it, leading to a revolution in distributed consensus computing. Watch out for the blockchain folks saying that means they were right all along: in a very specific way, they were.

Or maybe the impressive capability but limited applicability of AI will meet the limited capability but impressive applicability of intentional programming in some hybrid paradigm. Or maybe if we wait long enough, quantum computing will bring both its new ideas and some reason to use them.

But that’s the imagined future of programming, and we’re not in that article yet.
