Why Design is Important (re: Hawaii Missile Scare)
by Alex Tillard
A little over a week ago, it was a normal Saturday morning in Hawaii, until about 8 a.m., when an emergency alert was sent to all cell phones: BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.
People were panicking, climbing into their bathtubs, hiding their children in manholes, and praying that they would somehow live through this.
But as it turned out, it was supposed to be a drill.
There was no missile threat, and the message was corrected 38 minutes later (way too long – but that’s not the subject of this post).
Naturally, after something this big and potentially catastrophic, everyone asked, “How did this happen?” And not surprisingly, the blame was placed on the officer who had clicked the wrong button. This person was “reassigned” as a punishment, or consequence, or whatever (although as Stephen Colbert said on the Late Show, “No! You keep him in that job! He is the one person on the planet who will never, ever make that mistake again!”).
But did anyone in the government ask why their employee made this mistake? Some in the news did, and I’m going to share some of that perspective as well as my own take on what happened.
Essentially, the problem and the solution for this would-be tragedy are both found in—you guessed it—design. For instance, here’s a quick look at what the engineer’s menu looked like at the time of the event:
Here’s what the Washington Post said about it:
“The menu, which triggers alerts, contains a jumble of options, ranging from Amber alerts to Tsunami warnings to road closures. Some of them, such as ‘High Surf Warning North Shores,’ are in plain English. Others, including the one for a missile attack, ‘PACOM (CDW)-STATE ONLY,’ use shorthand initials.”
Yep, the menu is ugly, looks like it was made in the early 1990s when code and design were somewhat new and limited, and was never updated after that. So, as much as we want to blame human error, who WOULDN’T click the wrong “button” from this list (especially when it’s not even a button, but just an underlined link)?
That brings us to the real issue here: function over form. I’m not saying that form over function would be correct, either. Actually, I’d argue that both are equally important. I am a graphic designer, but I also do a lot of content creation and organization work. And that (I hope) gives me a pretty good perspective on how both things need to work together. In my experience, it’s very common for software to ignore design. As long as it works, who cares what it looks like, right? WRONG.
Ever heard of UI or UX? Those letters stand for User Interface and User Experience and are usually associated with web and mobile applications. But these fields exist because how a user experiences an app or software matters as much as the product itself. If it’s confusing either functionally or visually, it just doesn’t work as well as a well-organized, well-designed product. All the right pieces are there in the PACOM software, but it’s confusing both functionally and visually.
So let’s talk about content first. In this dropdown menu, none of the choices are worded consistently or are in any kind of clear order. From a content perspective (we’ll get to design in a minute), here’s how I’d have organized and written it:
Note: CAE stands for Child Abduction Emergency; CEM for Civil Emergency Message; CDW for Civil Danger Warning; and PACOM for the United States Pacific Command, based in Hawaii.
First, start with the type of alert (CAE, CDW, etc.); the alerts are now listed alphabetically by type, and then by the detail that follows the type. This makes any given alert easier to find in the list. Then you’d click into each one and choose between Test Alert, Live Alert, or False Alarm (there’s an extra step for the Amber Alert). Test is always first, so if you choose it by mistake, it’s not quite as big of a deal: it’s easier to send a real alert after an accidental test than to send a real alert first and then have to send out a false alarm. False Alarm is at the bottom, because it should be the least used (with an intelligent human and a well-designed system, you should never need it).
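The ordering above can be sketched as a simple data structure. To be clear, this is a hypothetical illustration, not the actual PACOM software; the abbreviations and menu labels are my own invented examples:

```python
# Hypothetical sketch of the reorganized alert menu.
# Top level: alert types, listed alphabetically by abbreviation.
# Second level: Test first, Live second, False Alarm last.

# The same three actions for every alert type, in the "safe mistakes
# first" order described above.
ALERT_ACTIONS = ["Test Alert", "Live Alert", "False Alarm"]

# Invented example entries; abbreviation -> plain-English description.
ALERT_TYPES = {
    "CAE": "Child Abduction Emergency (Amber Alert)",
    "CDW": "Civil Danger Warning (PACOM) - Missile Threat",
    "CEM": "Civil Emergency Message",
    "HSW": "High Surf Warning - North Shores",
    "TSW": "Tsunami Warning",
}

def build_menu():
    """Return the menu as (abbreviation, description, actions) tuples,
    sorted alphabetically by the alert-type abbreviation."""
    return [
        (abbr, ALERT_TYPES[abbr], list(ALERT_ACTIONS))
        for abbr in sorted(ALERT_TYPES)
    ]

for abbr, desc, actions in build_menu():
    print(f"{abbr}: {desc}")
    for action in actions:
        print(f"  - {action}")
```

The point of the structure is that the ordering rules live in one place: the alphabetical sort and the fixed Test/Live/False Alarm sequence are applied uniformly, instead of being hand-maintained per entry.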
Now that the content is organized, let’s add some design elements. First, let’s add some font size/weight differences and spaces.
Now let’s add actual buttons and colors. Color is what our eyes see first, so adding colors and training employees on what each color means will only improve clarity and lower the probability of mistakes. For the main menus, I made Amber Alerts orange (similar to the color of amber), danger messages red (the most serious alert), and weather-related messages blue (like sky, water, rain, etc.). For the type of alert under the main menus, I used traffic light colors because in the United States, those are pretty universally known: red for a real warning (stop what you’re doing and take cover!), yellow for a test (slow down and be prepared), and green for a false alarm (everything is ok!).
Now there is another piece of this puzzle, which is the pop-up confirmation screen. This was supposed to be one more line of defense against sending out a wrong message. I’m not sure what this screen looked like, but it could have said something like, “You are about to send a REAL alert. Are you sure you want to proceed?” or “Please confirm.” It may have even shown the actual alert, but it was obviously too easy to click past (as we typically do with confirmation messages, because we see so many of them).
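One common safeguard pattern for high-stakes confirmations is requiring the operator to retype the name of the action instead of just clicking OK. I don’t know what Hawaii’s actual software did; this is a minimal sketch of the idea, with invented alert names:

```python
# Hypothetical sketch of a stronger confirmation step: a live alert
# only goes out if the operator retypes its exact name, while a test
# stays one click. All names here are invented examples.

def requires_typed_confirmation(action: str) -> bool:
    """Only live alerts need the retype step; tests stay low-friction."""
    return action == "Live Alert"

def confirm_send(alert_name: str, typed_confirmation: str) -> bool:
    """Allow the send only if the operator retyped the exact alert name."""
    return typed_confirmation.strip() == alert_name

# An exact match is required, so muscle-memory clicking can't get past it:
print(confirm_send("PACOM (CDW) - LIVE", "PACOM (CDW) - LIVE"))  # True
print(confirm_send("PACOM (CDW) - LIVE", "PACOM (CDW) - TEST"))  # False
```

The design choice here is friction by intent: tests stay cheap so drills actually get run, while a live alert forces a deliberate, typed acknowledgment of exactly what is about to be sent.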
There are, of course, all sorts of other ways we could improve how this software functions and how the screens look (not to mention creating better safeguards to make it harder to click past the confirmation screen). But for just this menu alone, a panic-inducing false alert message could have been avoided with better design, a little bit of organization, and some clearer writing.
The moral of this story is: not form over function or function over form, but form AND function, working together. Crisis averted.