The future will be just like the past, right?

I’ve been having a bit of a retro programming session:

Z88 adventure game

The computer in the photo is a Cambridge Z88, and it won’t surprise you to know that I’ve owned it for years. However, it’s far from my first computer.

I was born less than a month before the broadcast of The Computer Programme, the television show that brought computers into many people’s living rooms for the first time. This was the dawn of the personal computer era. The Computer Programme was shown at the beginning of 1982: by the end of that year the Commodore VIC-20 had become the first computer platform ever to sell more than one million units.

My father being an early adopter (he’d already used a Commodore PET at work), we had a brand new Dragon 32 computer before I was a year old. There’s not much point doing the “hilarious” comparisons of its memory capacity and processor speed with today’s computers: the social systems into which micros were inserted and the applications to which they were put render most such comparisons meaningless.

In 1982, computers were seen by many people as the large cupboards in the back of “James Bond film” sets. They just didn’t exist for the majority of people in the UK, the US or anywhere else. The micros that supposedly revolutionised home life were, for the most part, useful for hobbyists to find out how computers worked. Spreadsheets like VisiCalc might already have been somewhat popular in the business world, but anyone willing to spend $2000 on an Apple ][ and VisiCalc probably wasn’t the sort of person about to diligently organise their home finances.

Without being able to sell their computers on world-changing applications, many manufacturers were concerned about price and designed their computers down to a price. The Register’s vintage hardware section has retrospectives on many of the microcomputer platforms of the early 1980s, many of which tell this tale. (Those that don’t tell the tale of focusing on time to market, and running out of money.) The microprocessors were all originally controllers for disk drives and other peripherals in “real” computers, repurposed as the CPUs of the micro platforms. Sinclair famously bought faulty 64Kbit RAM chips, in which only half the capacity worked, to supply the upper 32kB of the ZX Spectrum’s 48kB RAM and get a good price from the supplier.

So the manufacturers were able to make the hardware cheap enough that people would buy computers out of interest, but what would they then make of them? We can probably tell quite a lot by examining the media directed at home computer users. Start with The Computer Programme, which we’ve already seen at the beginning of this post. What you have is Ian “Mac” McNaught-Davis, positioned at the beginning of episode 1 as a “high priest” of the mainframe computer, acting as the Doctor to Chris Serle’s bemused and slightly apprehensive assistant. Serle is the perfectly ordinary man on the perfectly ordinary street, expressing (on behalf of us, the perfectly ordinary public) amazement at how a computer can turn a perfectly ordinary television set and a perfectly ordinary domestic cassette recorder into something that’s able to print poorly-defined characters onto perfectly ordinary paper.

During his perfectly ordinary tenure of ten episodes, Serle is taught to program in BBC BASIC by McNaught-Davis. In the first episode he demonstrates a fear of touching anything, confirming the spelling of every word (“list? L-I-S-T?”) he’s asked to type. If the computer requires him to press Return, he won’t do it until instructed by McNaught-Davis (thus making January 11, 1982 the first ever outing of The Return of the Mac). By the end of the series, Serle is able to get on a bit more autonomously, suggesting to Mac what the programs mean (“If temperature is more than 25 degrees, I would assume…”).
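For a flavour of what he was being shown: here is a reconstruction, in no particular dialect of BASIC, of the sort of conditional Serle was reading aloud. The variable name, threshold message and line numbers are my invention, not a transcript of the programme:

    10 INPUT "TEMPERATURE IN DEGREES"; T
    20 IF T > 25 THEN PRINT "TOO HOT TO WORK"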

Chris Serle suffered his way through nine weeks of BASIC tuition because there was no other way for a freelance journalist to get any use out of a personal computer. Maybe as many as 8,000 hipster programmers would opt for a Jupiter Ace and the FORTH language, but for normal people it was BASIC or nothing. Even loading a game required typing the correct incantation at the BASIC prompt. Feedback was minimal because there wasn’t a lot of ROM in which to store the error messages: “Subscript at line 100” or even the Dragon’s “?BS ERROR” might be all you were told about a mistake. If you didn’t have a handy McNaught-Davis around (perhaps the first user-friendly Mac in the computer field) you could easily lose ages working out what the computer thought was BS about your code.
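To make that concrete: my guess at the smallest program that earns the Dragon’s rebuke is one that indexes past the end of an array. Run this and the machine stops with something like ?BS ERROR IN 20 — and nothing more:

    10 DIM A(10)
    20 A(100)=1

Which array, which subscript, and what “BS” even stood for (Bad Subscript) were all yours to work out.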

Typing errors multiplied when using the era’s common application distribution platform: the printed magazine. Much software was distributed as “type-ins”, often split over two (monthly) issues of a magazine: the program being presented in buggy form in one edition and the errata supplied in the next. When you had typed in not one LOAD command but a few hundred lines of BASIC, only to find that your database program didn’t work as expected, you first had a tedious proof-reading task ahead to check that you’d typed it without error. If you had, and it still didn’t work, then out came the pencil and paper as you tried to work out what mistakes were in the listing.
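The cruellest type-in bugs produced no error message at all. As a hypothetical illustration — not taken from any real magazine — suppose a single variable name got mangled between the printed page and the keyboard:

    100 LET S=0
    110 FOR I=1 TO 10
    120 LET T=S+I
    130 NEXT I
    140 PRINT S

Line 120 should read LET S=S+I. As typed, the program runs to completion without a single complaint and prints 0 instead of the expected 55, and only proof-reading against the listing will tell you why.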

Microcomputers were seriously constrained hardware with limited applications. The ability to get anything done was hampered by the primary interface being an error-prone, cryptic programming language. While the syntax of this language was hailed as simpler than many alternatives, it did nothing to smooth over the complex underlying concepts, or to provide a soft landing when you ran into them.

I’m willing to subject myself to those trials and terrors for the purpose of nostalgia. There are other people, though, who want to revert to this impression of computers as a way to get young people interested in programming. The TinyBASIC for Raspberry Pi announcement enthuses:

we’ve also had a really surprising number of emails from parents who haven’t done any programming since school, but who still have books on BASIC from when they were kids, remember enjoying computing lessons, and want to share some of what they used to do with their kids. It’s actually a great way to get kids started, especially if you have some enthusiasm of your own to share: enthusiasm’s contagious.

Undoubtedly there are some genuine, remembered benefits to programming on these platforms, which modern computer tuition could learn from. There was, as discussed above, no hurdle to jump to get into the programming environment. Try teaching any programming language on widely-available computing platforms today, and you’ve got to spend a while discussing what versions of what software are needed, differences between platforms, installation and so on. Almost anyone with a microcomputer could turn it on and start typing in BASIC code that would, if restricted to a limited subset of commands, work whatever machine they’d bought.
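That common subset was real. Hedging on dialect details, something like the following sketch would run essentially unmodified on a Spectrum, a Dragon, a BBC Micro or most of their contemporaries:

    10 PRINT "WHAT IS YOUR NAME?"
    20 INPUT N$
    30 FOR I=1 TO 5
    40 PRINT "HELLO, "; N$
    50 NEXT I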

The cost of a “tyre-kicking” setup was modest, particularly as you could use your own TV and cassette deck (assuming you had them). Unlike many modern platforms, there was no need to have two computers tethered to program on one and run on the other, and no developer tithe to pay to the platform vendors. Where they were error-free and well documented, the type-ins gave you actually working applications that you could tweak and investigate. Such starting points are better for some learners than a blank screen and a blinking prompt.

Complete applications though these type-ins may have been, they would not satisfy the expectations of modern computer-using learners. There’s an important difference: people today have already used computers. They’re no longer magical wonder-boxes that can make a TV screen flash blue and yellow if you get the numbers correct in a PAPER command. People know what to expect from a laptop, tablet or smartphone: being able to print an endless march of RUMBELOWS IS SHIT to the screen is no longer sufficient to retain interest.
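For reference, the blue-and-yellow trick really was that small. A sketch in Sinclair BASIC — PAPER 1 being blue and PAPER 6 yellow on the Spectrum — flashes the screen between the two forever:

    10 PAPER 1: CLS
    20 PAPER 6: CLS
    30 GO TO 10

In 1982 that flicker was magic; today it wouldn’t hold anyone’s attention.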

It’s not just the users of computers, nor the uses of computers, that have moved on in the last three decades. Teaching has evolved, too. There should probably be a name for the fallacy that assumes that however I was taught things is however everybody else should be taught them. A modern curriculum for novice programmers should reflect not only the technological and social changes in computing in the last thirty years, but also the educational changes. It should borrow from the positives of microcomputer programming courses, but not at the expense of throwing out a generation of evolution.

There are certainly things we can learn from the way microcomputers inspired a generation of programmers. There’s a place for ultra-cheap computers like the Raspberry Pi in modern computing pedagogy. But it would be a mistake to assume that if I gave a child my copy of “Super-Charge Your Spectrum”, that child would learn as much and be as enthused about programming as my rose-tinted model of my younger self apparently was.
