Fiction: The First Flaw

As she left her desk at the grandiosely named United States Robotics, Susan reflected on her relationship with the engineering team she was about to meet. Many of its members were juvenile and frivolous in her opinion, and she refused to play along with any of their jokes.

Even the title they gave her was mocking. They called her “the robopsychologist,” a term with no real meaning as USR had yet to make a single product. They had not even sold any customers on the promise of a robot. All they had to their name was a rented office, some venture capital and their founder’s secret recipe that was supposed to produce an intelligent sponge from a mixture of platinum and iridium.

While Susan might not be the robopsychologist, she certainly was a psychologist, of sorts. She seemed to spend most of her time working out what was wrong with the people making the robots, and how to get them to quit goofing off and start making this company some much-needed profit. Steeling herself for whatever chaotic episode this dysfunctional group was going through, she opened the door to the meeting room. She was waved to a seat by the director of product development, Roger Meadows.

“Thanks for coming, Doctor Ca-”

Susan cut him off. “You’ve called me Susan before, Roger, you can do it again now. What’s up?”

“It’s Pal. We’ve lost another programmer.”

Susan refused to call their unborn (and soon stillborn, if the engineers didn’t buck up) product by its nickname, short for Proprietary Artificial Lifeform. She had at least headed off a scheme to call it “Robot 2 Delegate 2”, which would have cost their entire budget before they even started.

“Well, look. I know it’s hard, but those kids work too much and burn themselves out. Of course they’re going to quit if-”

“No, I don’t mean that. Like I said, it’s Pal. He killed Tanya.”

“Killed?” Susan suddenly realised how pale Roger looked, and that she had probably just gone a similar hue. “But how, no, wait. You said another programmer?”

“Er, yes. I mean, first Pal got Steve, but we thought, you know, that we could, uh, keep that quiet until the next funding round, so-”

The blood suddenly came back to Susan’s face. “Are you telling me,” she snapped, “that two people had to die before you thought to ask for any help? Did you come to me now because you’re concerned, or because you’ve run out of programmers?”

“Well, you know, I’d love to recruit more, but as they say, adding people to a late project…”

Yes, thought Susan, I do know what they say. You can boil the whole programming field down to damned aphorisms like that one. Probably they just give you a little phrasebook in CS101 and test you on it after three years, see if you have them all down pat.

“But what about the ethics code? Isn’t there some module in that positronic brain you’ve built to stop that sort of thing happening?”

“Of course, the One Law of Robotics. The robot may not harm a human being. That was the first story we built. We haven’t added the inaction thing the VCs wanted, but that can’t be it. That mess in the lab was hardly the result of inaction.”

“Right, the lab. I suppose I’d better go down and see for myself.”


She quite quickly wished she hadn’t. Despite her strong constitution, Susan’s stomach turned at the sight of barely recognisable pieces of former colleague. Or possibly colleagues; she wasn’t convinced Roger would have let a cleaner in between accidents.

The robot had evidently launched itself directly at, and subsequently through, Tanya, stopping only when the umbilical connecting it to the workstation had come loose, cutting off its power. Outwardly and, Susan knew, internally, it lay dormant in its new macabre gloss coat.

“I take it you did think to do a failure analysis? Do you know what happened here?”

“If I knew that, Susan, I wouldn’t have gotten you involved.” She believed it, knowing her reputation at USR. “We’ve checked the failure tree and no component could cause the defect mode seen here.”

“Defect mode! Someone’s dead, Roger! Two people! People you’re responsible for! Look, it went wrong somehow, and you’re saying it can’t. Well it can. How did the software check out?”

“I don’t know, the software isn’t in scope for the safety analysis.”

Susan realised she was slowly counting to ten. “Well, I’m making it in scope now. I took a couple of CS classes at school, and I know they’re using the same language I learnt. I’ll probably not find anything, but I can at least take a look before we involve anyone else.”


Hours later, and Susan’s head hurt. She wasn’t sure whether it was the hack-and-hope code she was reading or the vat of coffee she had drunk while she was doing it, but it hurt. So far she had found that the robot’s one arm was apparently thought of as a specific type of limb, itself a particular appendage, in its turn a type of protuberance. She wasn’t sure what other protuberances the programmers had in mind, but she did know the arm software looked OK.

So did the movement software. It had clearly been built quickly, in a slapdash way, and she’d noted down all sorts of problems as she read. But nothing major. Nothing that said “kill the person in front of you,” rather than “switch on the wheel motors”.

She didn’t really expect to see that, anyway. The robot’s ethics module, the One Law that Roger had quoted at her, was meant to override all the robot’s other functions. Where was that code, though? She hadn’t seen it in her study, and now couldn’t find a file called ethics, laws or anything similar. Were the programmers over-abstracting again? she thought. A law is a rule, but there was no rules file either.

Susan finally cursed the programmer mind as she found a source file called jude. Of course. But it was definitely what she was looking for: here was the moral code built into their first and, assuming USR wasn’t shut down, all subsequent robots. Opening it, she saw a comment on the first two lines.

// "The robot may not harm a human being."
// Of course, we know that words like MUST, SHOULD and MAY are to be interpreted in accordance with RFC2119 ;-)

The bloody idiots, she thought. Typical programmers, deliberately misinterpreting a clear statement because they think it’s funny. Poor Pal had never been taught right from wrong. Susan realised that she had used his name for the first time. She was beginning to empathise more with the robot than with the people who built him. Without making any changes, she closed her editor and phoned Roger.

“Meadows? Oh, Susan, did you find out what’s up?”

“Yes, I looked into the software. You can send all the programmers you want in there with Pal now.”

Fiction: The Ouroborus School pt 1

On a warm spring day, the camera follows a butterfly as it wends its coruscating way around a buddleia bush. The plant is growing in the well-kept border surrounding an immaculately manicured lawn, the quadrangle of an imposing Jacobean building. Now the perspective shifts: you zoom in over the bright primary colours of the butterfly’s wings, dancing on the thermals and eddies of the sunlit quadrangle, until you can see through the eyes of the butterfly. This is actually quite easy to arrange, as the butterfly is a Milliard Maps autonomous photography drone and its eyes are internet-connected fisheye camera lenses. As you see the slate roof slowly getting larger due to the butterfly stochastically flapping towards it, the last few words of a university lecturer addressing his students can be heard drifting from an open window.

“So even though the applications were all protected and the kernel was protected, the criminals still found ways to circumvent the system and get their malicious softs to run. Next week, we’ll look at how the firmware itself was enhanced to keep customers safe and increase value. You should all read Chapter 7 before then. Don’t forget that the collections on digirights will be coming up the following week. OK! That’s it for today.”

A “collection”, we have time to reflect upon as the students gather their things and leave the lecture hall, is a mid-term examination at the Ouroborus Inc. School for Higher Education, in the centre of Oxford. It is that institution’s window, under its clinker-like roof, that leaked the words of that professorial denouement. The university isn’t really run by Ouroborus, of course. That company is actually a large manufacturer of consumer electronics over on the far side of the United States of America, and managing an educational institution is not among its core strengths. The company merely owns the building, approves lecturers into the school’s “team”, advertises the courses in its online music store (an unfortunate quirk of history that has never been corrected), and processes the registration of new students. The school is self-governed, and keeps 70% of the tuition fees. In return for its share of the fees, Ouroborus Inc. reviews the syllabus prepared for each course and rejects suggestions by the tutors that would not be in the best interests of the students. These days the curricula tend to get through with only minor tweaks, but back in 2018 the course on Experiences of Computer Softs was rejected for including a lecture on the uses of devices made by the Milliard Internet Searching Company.

The lecture you just caught the end of is from “Introduction to Digirights”, one of the compulsory courses for first-year app sci undergraduates at OISHE. App sci is one of the school’s most widely renowned degrees. Many of its graduates have left five-star reviews on the Diploma Store, and it frequently tops the highest-grossing list in its category. Hiring a softsmith from OISHE’s app sci graduate pool, it is said, is a sure-fire way to build a company whose softs can command five dollars per sale, and some fetch even more. As a result the school receives myriad applications for every place each year, many of them for localised classes presented in other countries. One such localisation is offered at the Ouroborus Institute of Technology in Cambridge, Massachusetts, United States of America. Since the federal government outlawed the teaching of American citizens in British English, the school has found it more cost-effective to let OIT lecturers deliver its classes to American students.

Having contemplated the centuries-old tiles and cupolas of OISHE’s roof, your gaze is directed toward the last pair of students to leave the digirights lecture and enter the quadrangle. Mary Wilkes looks with some concern at her wristwatch. The Ouroborus OWatch (“You’ll love it for its timelessness. And for its time.”) can in fact tell you how far into the day you’ve progressed, but Mary, like many of its users, has other designs on it. Currently it’s showing her that she received three public messages during the lecture, that her digirights notes have synced successfully, and that she has another lecture in fifteen minutes over in the Tutte Building. The OWatch was originally designed to help Ouroborus Inc. sell into Latin America. The theory was that the kind of people who were flashy enough to buy their phones were also sensible enough not to wield them on the streets of São Paulo, and therefore were not actually going to buy the phones. The watch let them show off their allegiance to the marque without as much chance of being mugged. In fact the OWatch quickly gained customers in Europe and North America, and a few more people reported to hospitals in South America with wrist lacerations.

“Sorry, Ivan, I’m going to have to skip that coffee. I’ve got to get over to King Tutte, and I’ll need to get another pack of stylus tips before that in the MISCStore. How about you send me an invite for another time?”

Ivan shakes his head, an action that sets his messy tangle of blond hair and much of the rest of his wiry frame oscillating in sympathetic motion. A playful grin erupts over his face. “No way, Mair, I’m not eventing you. Every time you’re evented you reply ‘maybe’ to the invite, and you never turn up! I’ll see you walking past a ‘Bucks sometime, and I’ll check your profile to see you’ve got nothing else on. Then you’ll have to let me buy you a coffee.” He pronounces her name ‘Mair’, but spells it ‘Mar’: saving a crucial character for those microblog posts.

“You’ll see that I have nothing else on… ‘maybe’,” retorts Mary. The two of them laugh as they turn in opposite directions out of the OISHE main gate, Mary turning left toward the high street and the Tutte Building. Cars automatically slow down as she unconcernedly approaches the road, their computerised drivers detecting her position and speed and extrapolating that she probably means to cross seconds before she actually steps out into their path.

Just as OISHE is not actually run by Ouroborus, neither is the MISCStore actually run by the Milliard Internet Searching Company. “MISCStore” is the nickname given by students to the Williams Mart branch nearest to the school, whose systems are integrated with those of the search giant. Mary’s OWatch vibrates as she enters the store. “You have checked in at Will. Mart!” proclaims a notification. She selects a box of stylus tips and takes it over to the cashier.

“Hi Mary,” the cashier says. While she has shopped at the MISCStore many times, Mary doesn’t recognise this particular employee either from the shop or school (judging by her age, and the fact that she can afford to live in Oxford, Mary guesses that the woman behind the counter is probably a fellow student). She must have read the name from her cash register, which would have received the same notification as Mary’s watch. The register will also be displaying pertinent facts from Mary’s profile; just the useful stuff like her credit rating and recent spending habits. “Just the styluses?”

“Just the styli.” Mary emphasises the last word as one of her few chances to show off her latlang A-level. Latlang is not a popular course and frequently gets one-star reviews from former students who felt that it was too hard and should, as a result, be free. Despite this, a hardcore of students (typically those who did quite well in the exams) has formed a bit of a latlang cult, with correct declension being the password between members. The cashier in the MISCStore is, evidently, not among their number, as the correction goes entirely unremarked.

“Are you sure? It looks like you enjoy Winnesota brand hot chocolate, and we’ve got five percent off kilo packs at the moment. We think you’re really gonna love it.”

“Uh, no thanks. Just the styli, please.”

“Oh.” The cashier looks put out. “Should I go ahead and put that you don’t like Winnesota brand hot chocolate on your profile?”

“No, really, just no, it’s fine. I like the stuff, but I only bought it the once. I don’t really want any right now, it’s a bit pricey.”

“Not with five percent off it isn’t! We think it’d be the best thing to compliment your purchase of replacement touchscreen styluses.” Whether through a gargantuan effort of will or simply not seeing any issue with that sentence, the cashier keeps a straight face throughout the delivery.

“No, I don’t want your hot bloody chocolate! Look, I’m running late, I just need to buy these. O Watch,” this last being directed at her watch, triggering its speech command feature, “post that I’ll be late to the softscraft class. Can anyone take some notes for me?” She looks up again at the cashier. “Look, just the styli, they’re one ninety-nine, right? Here’s my Millicard.”

“Thank you for shopping at Williams Mart. Will you ‘Like’ this transaction on your profile? Liking is a great way to keep up with all the buzz and special deals on offer at Williams Mart!”

“Fuck off will I fucking like this fucking transaction! You’ve made me late with all this crap about chocolate and now you want to know if I like you? Piss off!” Having exhausted her daily allowance of expletives, Mary turns to leave the store. She notices for the first time that a short bald man, a sort of overweight weasel that got its wish to be a real boy, is standing by a door behind the counter. As he isn’t wearing the official cheap polyester Williams/Milliard shirt, he must be the manager.

“Thank you for shopping with us, miss. Company policy requires that after insulting our transaction enabler we must ask you to apologise on your public profile and to ‘Like’ Williams Mart.” Mary is storming out of the store and only barely hears the last part: “It is a federal crime to disregard compliance with any Williams Mart policy.”

Another vibration from her watch. “You have checked out.”