The First Flaw

As she left her desk at the grandiosely named United States Robotics, Susan reflected on her relationship with the engineering team she was about to meet. In her opinion, many of its members were juvenile and frivolous, and she refused to play along with any of their jokes.

Even the title they gave her was mocking. They called her “the robopsychologist,” a term with no real meaning as USR had yet to make a single product. They had not even sold any customers on the promise of a robot. All they had to their name was a rented office, some venture capital and their founder’s secret recipe that was supposed to produce an intelligent sponge from a mixture of platinum and iridium.

While Susan might not be the robopsychologist, she certainly was a psychologist, of sorts. She seemed to spend most of her time working out what was wrong with the people making the robots, and how to get them to quit goofing off and start making this company some much-needed profit. Steeling herself for whatever chaotic episode this dysfunctional group was going through, she opened the door to the meeting room. She was waved to a seat by the director of product development, Roger Meadows.

“Thanks for coming, Doctor Ca-”

Susan cut him off. “You’ve called me Susan before, Roger, you can do it again now. What’s up?”

“It’s Pal. We’ve lost another programmer.”

Susan refused to call their unborn (and soon stillborn, if the engineers didn’t buck up) product by its nickname, short for Proprietary Artificial Lifeform. She had at least headed off a scheme to call it “Robot 2 Delegate 2”, which would have cost their entire budget before they even started.

“Well, look. I know it’s hard, but those kids work too much and burn themselves out. Of course they’re going to quit if-”

“No, I don’t mean that. Like I said, it’s Pal. He killed Tanya.”

“Killed?” Susan suddenly realised how pale Roger looked, and that she had probably just gone a similar hue. “But how, no, wait. You said another programmer?”

“Er, yes. I mean, first Pal got Steve, but we thought, you know, that we could, uh, keep that quiet until the next funding round, so-”

The blood suddenly came back to Susan’s face. “Are you telling me,” she snapped, “that two people had to die before you thought to ask for any help? Did you come to me now because you’re concerned, or because you’ve run out of programmers?”

“Well, you know, I’d love to recruit more, but as they say, adding people to a late project…”

Yes, thought Susan, I do know what they say. You can boil the whole programming field down to damned aphorisms like that one. Probably they just give you a little phrasebook in CS101 and test you on it after three years, see if you have them all down pat.

“But what about the ethics code? Isn’t there some module in that positronic brain you’ve built to stop that sort of thing happening?”

“Of course, the One Law of Robotics: ‘The robot may not harm a human being.’ That was the first story we built. We haven’t added the inaction thing the VCs wanted, but that can’t be it. That mess in the lab was hardly the result of inaction.”

“Right, the lab. I suppose I’d better go down and see for myself.”


She quite quickly wished she hadn’t. Despite her strong constitution, Susan’s stomach turned at the sight of barely-recognisable pieces of former colleague. Or possibly colleagues; she wasn’t convinced Roger would have let a cleaner in between accidents.

The robot had evidently launched itself directly at, and subsequently through, Tanya, stopping only when the umbilical connecting it to the workstation had pulled free, cutting its power. Outwardly and, Susan knew, internally, it lay dormant in its new macabre gloss coat.

“I take it you did think to do a failure analysis? Do you know what happened here?”

“If I knew that, Susan, I wouldn’t have gotten you involved.” She believed it, knowing her reputation at USR. “We’ve checked the failure tree and no component could cause the defect mode seen here.”

“Defect mode! Someone’s dead, Roger! Two people! People you’re responsible for! Look, it went wrong somehow, and you’re saying it can’t. Well it can. How did the software check out?”

“I don’t know, the software isn’t in scope for the safety analysis.”

Susan realised she was slowly counting to ten. “Well, I’m making it in scope now. I took a couple of CS classes at school, and I know they’re using the same language I learnt. I probably won’t find anything, but I can at least take a look before we involve anyone else.”


Hours later, and Susan’s head hurt. She wasn’t sure whether it was the hack-and-hope code she was reading or the vat of coffee she had drunk while she was doing it, but it hurt. So far she had found that the robot’s one arm was apparently thought of as a specific type of limb, itself a particular appendage, in its turn a type of protuberance. She wasn’t sure what other protuberances the programmers had in mind, but she did know the arm software looked OK.

So did the movement software. It had clearly been built quickly, in a slapdash way, and she’d noted down all sorts of problems as she read. But nothing major. Nothing that said “kill the person in front of you,” rather than “switch on the wheel motors”.

She didn’t really expect to see that, anyway. The robot’s ethics module, the One Law that Roger had quoted at her, was meant to override all the robot’s other functions. So where was that code? She hadn’t seen it in her reading, and now couldn’t find a file called ethics, laws or anything similar. Were the programmers over-abstracting again? she thought. A law is a rule, but there was no rules file either.

Susan finally cursed the programmer mind as she found a source file called jude. Of course. But it was definitely what she was looking for: here was the moral code built into their first and, assuming USR wasn’t shut down, all subsequent robots. Opening it, she saw a comment on the first two lines:

// "The robot may not harm a human being."
// Of course, we know that words like MUST, SHOULD and MAY are to be interpreted in accordance with RFC2119 ;-)

The bloody idiots, she thought. Typical programmers, deliberately misinterpreting a clear statement because they think it’s funny. Read their way, “may not harm” meant that harming a human being was merely optional, never forbidden. Poor Pal had not been taught good from bad. Susan realised that she had used his name for the first time. She was beginning to empathise more with the robot than with the people who built him. Without making any changes, she closed her editor and phoned Roger.

“Meadows? Oh, Susan, did you find out what’s up?”

“Yes, I looked into the software. You can send all the programmers you want in there with Pal now.”
