Over the past week we have been consumed by the intense final work phase just before the deadline of a big, complex report. The profanity density has been high, mostly aimed at Google, Microsoft and Apple. Not all of it was deserved, but it brought home the point that designing software carries moral implications.
Time is a non-renewable resource.
Larry Kenyon was the engineer working on the disk driver and file system. Steve came into his cubicle and started to exhort him. “The Macintosh boots too slowly. You’ve got to make it faster!”
Larry started to explain about some of the places where he thought that he could improve things, but Steve wasn’t interested. He continued, “You know, I’ve been thinking about it. How many people are going to be using the Macintosh? A million? No, more than that. In a few years, I bet five million people will be booting up their Macintoshes at least once a day.”
“Well, let’s say you can shave 10 seconds off of the boot time. Multiply that by five million users and that’s 50 million seconds, every single day. Over a year, that’s probably dozens of lifetimes. So if you make it boot ten seconds faster, you’ve saved a dozen lives. That’s really worth it, don’t you think?” —Andy Hertzfeld
Today there are more than three billion internet users.
If each loses one second per day due to slow software, that adds up to 95 person-years lost every single day. A lifetime every day: if somebody went around killing a person every day, year in and year out, they would be the greatest serial killer in history. Even somebody who each day condemned one person to a lifetime of intense annoyance would be regarded as magnificently evil.
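The arithmetic behind both figures is easy to check. A minimal sketch (the function name and the ~1.6-year intermediate figure are mine, not from the original story):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31.6 million seconds

def person_years_per_day(users, seconds_lost_each):
    """Total time lost per day across all users, expressed in person-years."""
    return users * seconds_lost_each / SECONDS_PER_YEAR

# Jobs's Macintosh example: 5 million users, 10 seconds each
mac = person_years_per_day(5_000_000, 10)        # ~1.6 person-years per day

# The modern version: 3 billion internet users, 1 second each
web = person_years_per_day(3_000_000_000, 1)     # ~95 person-years per day
```

Summed over a year, the Macintosh case comes to several hundred person-years, which is the "dozens of lifetimes" Jobs was gesturing at.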
Does the fact that this loss is distributed make it any less? Maybe. After all, stealing a little bit of food from a lot of people seems less bad, arguably, than stealing one person’s entire dinner. Furthermore, in this case, people might find some substitute activity to occupy those seconds so that the time isn’t a total waste. But even if the distribution between and across lives means we shouldn’t simply sum up the harm, and even if the time isn’t entirely wasted, with such astronomical numbers, the time-loss remains highly morally significant.
One could argue that better computing, while wasting some time, provides us with far more valuable things. The old Macintosh enabled much creativity and productivity for its users, and it might even have been more pleasant to use for many than standard computers. Surely those positive hours per day would outweigh the loss of some seconds. But in many cases the loss would have been avoidable: if engineers had only worked a bit harder, those seconds would have disappeared. Not to mention the crashes, which cost far more in valuable time and blood pressure.
Designers and engineers of widely used consumer products have far more impact on the world than they might think. They see a good-enough product tried by a few focus-group users, not the millions who use it every day.
This kind of distributed loss may be easy to shrug off, but design can also cause direct, drastic disasters. Both the Challenger and Columbia shuttle disasters have been laid at the door of PowerPoint.
The problem is that the bulleted list style PowerPoint favors is not good for communicating technical information or complex interdependencies. It makes it easy to overlook doubts and uncertainty when there are reassuring messages in large font size. It is a tool for presentation, not discussion.
The US Army has found itself fighting presentation bureaucracy as an internal enemy.
“It’s dangerous because it can create the illusion of understanding and the illusion of control. Some problems in the world are not bullet-izable.” —Gen. H. R. McMaster
“No one is suggesting that PowerPoint is to blame for mistakes in the current wars, but the program did become notorious during the prelude to the invasion of Iraq. As recounted in the book “Fiasco” by Thomas E. Ricks (Penguin Press, 2006), Lt. Gen. David D. McKiernan, who led the allied ground forces in the 2003 invasion of Iraq, grew frustrated when he could not get Gen. Tommy R. Franks, the commander at the time of American forces in the Persian Gulf region, to issue orders that stated explicitly how he wanted the invasion conducted, and why. Instead, General Franks just passed on to General McKiernan the vague PowerPoint slides that he had already shown to Donald H. Rumsfeld, the defense secretary at the time.” —Elisabeth Bumiller
But although this has been recognized since the early 2000s, the problem remains.
Enemy contacts in Afghanistan and Iraq would go unreported because they required a PowerPoint description after the fact, something some officers felt “was useless … they didn’t want to go through the hassle.” —Kevin Lilley
Spreadsheet programs are another source of disasters and distortions. Spreadsheets are widespread tools in business and elsewhere, but have an astonishingly high rate of errors. According to studies, close to 90% of all spreadsheets contain errors—many with significant effects on business. Seventeen percent of large UK businesses have suffered financial loss due to poor spreadsheets, and far more have wasted time (57%) or made poor decisions (33%) because of spreadsheet problems. The list of horror stories is long and expensive.
Again, used well, spreadsheets are fine. But the evidence is fairly clear that this is rare. Careful planning, systematic inspection, verification, documentation, training, and the right reporting policies have all been shown to reduce the risks—but who does that spontaneously? The grid sits there, inviting you to fill it.
A defender may argue that presentation software and spreadsheets are mere tools, and it is up to their users to use them wisely. But a tool put into millions of hands can be designed to be more or less risky.
Also, tools have affordances: by their shape or capabilities they invite certain actions. Default choices act as powerful nudges. Actions that are easy will be done more often than difficult ones (and, with practice, become easier still). The design of a tool can make us want to use it in certain ways.
How responsible are the software people who made the presentation and spreadsheet software for these pervasive, systemic problems? Unlike the waste of time, here the bad consequences emerge from the interactions of software, how individuals are using it, and how it gets co-opted by bureaucratic pathologies. One might suppose it is beyond anyone to predict the full ramifications: when Dan Bricklin and Bob Frankston wrote VisiCalc in 1979 they could hardly imagine how its descendant would lead to multi-billion dollar losses for JP Morgan.
Yet the enormity of the effects suggests that designers need to be more careful where they can be. Modifying affordances to remove predictable misuse seems both feasible and morally required.
This phenomenon of wasted decades and lives attributable to bad software is, for most of us, an instance of moral blindness. Software engineers trying to shave seconds off boot time, or designing PowerPoint, are presumably not aware of just how morally important their actions are. Instead, it is easy for them (and us) to think that what they do is worthwhile and useful, but not a matter of life and death. Surely most software engineers just don’t feel the same way about their code as doctors do about their decisions.
Ethicists would do well to investigate other, similar kinds of moral blindness—extremely morally significant areas that we simply tend not to notice are morally significant. In particular, we might focus on costs that are small individually but scale massively. Examples could include reducing commute times slightly for tens of thousands of drivers, discovering better drugs to treat mild headaches, or marginally improving a very popular video game. These shouldn’t be the top ethical priorities in the world, of course, but they are a lot more important than they seem at first glance.