On 9 February, human error resulted in 11 deaths and 80 further casualties in a tragic wreck near Bad Aibling, Germany. Despite a supposedly robust safety system, a lone operator was capable of effecting great tragedy. Does this sound suspicious? The phrase “user error” is legal misdirection rather than an honest attempt to find a cause.
People are sometimes to blame, but saying “user error” is a simplification. It fails to capture the circumstances of the situation and the intent of the user. There’s a huge difference between somebody forgetting something and somebody actively working around the technology. In this article I split user error into a few key classes.
Passive omission
Any technical process, from creating a presentation to preparing an aeroplane for take-off, involves a number of steps. Generally it’s a person who drives these steps, often with technical assistance. That assistance can range from a simple checklist to an automated quality system. Ultimately, though, it’s a person who is responsible for performing the steps; even in highly automated manufacturing domains, people still fulfil links in the chain.
Now think of how often you’ve forgotten to do something. Not just at work, but in your life as a whole. It’s not an uncommon occurrence. We forget things all the time. We even forget significant things. This is a large reason why we use technology: to help us not forget the important steps. It’s never perfect though.
Omission is a passive class of human error. Unless something draws our attention to it, we generally aren’t even aware that we’ve forgotten something. There is no way to avoid this limitation: no amount of training will stop us from being forgetful. Technology either has to automate the steps or find a way to check that they were performed. Simply assuming a step has been done will inevitably lead to system failure.
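To make this concrete, here’s a minimal sketch (in Python, with hypothetical step names) of a checklist runner that verifies each step instead of assuming it happened:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    name: str
    run: Callable[[], None]      # performs the step
    verify: Callable[[], bool]   # independently checks that it actually happened

def run_checklist(steps: list[Step]) -> None:
    for step in steps:
        step.run()
        # Never assume the step succeeded; confirm it before moving on.
        if not step.verify():
            raise RuntimeError(f"step not completed: {step.name}")

# Hypothetical example: a two-step release checklist.
run_checklist([
    Step("run tests", run=lambda: None, verify=lambda: True),
    Step("tag release", run=lambda: None, verify=lambda: True),
])
```

The point isn’t the code itself, but the shape: every step carries its own check, so a forgotten or failed step is caught rather than silently assumed.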
Forced delegation
Automated systems have their limits; they cannot always decide what the best course of action is. People are involved at these times to make a decision for the computer. This could be a simple popup prompt asking to save a document, or a warning siren at a nuclear power plant. In both cases the technology has detected something wrong and is requesting that the user do something about it.
Bad decisions play a major role in everyone’s life. We spend our days making them and sit around in the evening lamenting them. The moment a computer asks me to make a decision, it will have to deal with me making the wrong one.
A decision can only be as good as the information one has to make it. It’s not just a matter of the volume of information, but its quality and how it is presented. The user needs a way to access the details relevant to the decision, quickly enough to resolve the situation. This is a hard design problem that nobody has fully solved in any domain.
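As a toy illustration (Python, with hypothetical names throughout), compare a bare yes/no prompt with one that surfaces the details behind the question:

```python
def confirm(question: str, details: dict[str, str]) -> bool:
    """Ask a yes/no question, showing the context the user needs to answer it."""
    for key, value in details.items():
        print(f"  {key}: {value}")
    return input(f"{question} [y/n] ").strip().lower() == "y"

# A bare "Discard changes? [y/n]" forces a blind decision.
# The same question with context lets the user weigh what is at stake.
confirm(
    "Discard unsaved changes?",
    {
        "Document": "report.odt",
        "Unsaved edits": "14 changes over the last 25 minutes",
        "Last saved": "10:42",
    },
)
```

Neither version solves the hard problem, but the second at least gives the user something to decide with.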
This class of error is where training actually does help. A well-trained user can often make up for shortcomings in the technology, being able to make better use of the information presented to them, or knowing how to find more relevant information. Experience also makes people more comfortable with the decisions and less stressed by the time pressure.
Active interruption
Sometimes users go out of their way to interfere with the normal operation of a program. They perform actions knowing full well that they aren’t intended. This happens because automated processes and checklists are simply never enough to cover all the situations that will arise. The technology can also fail, requiring human intervention to get the system back into an operating state.
It’s rare that malicious intent is the driver of this class. Users are generally trying to resolve an issue they have; they aren’t just haplessly pressing buttons because they are bored. It’s important for designers to review these situations and understand what led to the disruptive behaviour.
This is closely related to forced delegation. In both cases the user is making a decision, but in this situation the user is doing it of their own volition, without prompting from the technology.
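Technology can acknowledge this class by giving users an explicit, safe path for intervention instead of leaving them to work around the system. Here is a minimal sketch (Python, hypothetical names) of an interlock with a recorded override:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("override")

class Interlock:
    """A safety check that normally blocks an action, with an explicit override path."""

    def __init__(self) -> None:
        self.overridden_by: str | None = None

    def override(self, operator: str, reason: str) -> None:
        # Make the workaround a first-class action: who did it, and why,
        # is recorded instead of being invisible to the system.
        log.warning("interlock overridden by %s: %s", operator, reason)
        self.overridden_by = operator

    def allows(self, check_passed: bool) -> bool:
        return check_passed or self.overridden_by is not None

# Hypothetical usage: an operator clears a stuck check explicitly.
interlock = Interlock()
interlock.override("operator-7", "sensor reporting stale data")
assert interlock.allows(check_passed=False)
```

The design choice is that the override exists at all: if the only way to intervene is to subvert the system, the system will be subverted invisibly.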
Lack of knowledge
What if the user just doesn’t know how to do something? Let’s assume, first of all, that they are qualified in the domain, but lack some specific knowledge about how the technology works. The more features a piece of technology has, the more likely it is that a user won’t know how to do something.
This is more of a secondary class than a main one. It may be the reason the user failed to do something in the first class. It may be what caused them to make an inferior decision in the second class, because they were unable to carry out their preferred option. It may lead to frustration, causing them to try random things in the program: the third class.
It’s tempting to blame the user here, but it’s a matter of degree. Some programs simply have too many options and can’t be learned entirely. It’s the domain of UI design to create a system that people can use without having to remember everything. That isn’t a solved problem either; indeed, most systems have interfaces that are very far from ideal.
Blame the user!
Those classes are why I consider statements of “user error” to be malarkey. The phrase too readily attributes fault to a person when the technology could just as easily be at fault. Each class represents a different situation and involves a distinct user intent.
Back to the train wreck: it’s clear that the train operator played a role in the crash. From the limited information made available, it sounds like he made a bad decision and gave a wrong signal. But why was he even forced to make this decision? Did he have quality information to make it with? He even recognized the wrong decision and attempted to correct it, but failed to do so. Why?
We can’t just blame users when user experience experts still disagree on how to create interfaces. We can’t give the user bad information and then expect them to make a good decision. We have to expect that, if something is going wrong, the user will actively attempt to fix it, and we have to provide safe ways for them to do so.
Sure, often the user truly is to blame, but that is never an excuse not to fix the technology.