
Gotcha! False Alarms

By David F. Carr  |  Posted 2004-08-01

Don't be alarmed by false alerts.

Automated systems have a tendency to fill the world with alerts: fire alarms, low-inventory warnings, squawking medical devices in a hospital, and collision alarms in planes or, increasingly, automobiles. Whether you're designing the alarm-generating software or merely deciding which alarms to turn on in a packaged product, the challenge is finding the right balance between too many alarms and too few.

Problem: Too many false alarms train users to ignore valid warnings.

Resolution: You must combat a "better safe than sorry" attitude among systems engineers, says James P. Bliss, a professor at Old Dominion University who has studied this "Boy Who Cried Wolf" problem extensively in the context of aviation, air traffic control and vehicle warning systems. "We all have a limited resource called attention," Bliss says. "When a series of false alarms are generated by a system, it becomes less attention-worthy because of its history of unreliability."

False alarms can often be reduced when systems are designed to respond to multiple warning signs before they sound (for example, a smoke alarm that detects smoke, light and heat, rather than just smoke). Another way to improve user response is to provide more information about why the alarm sounded, making it easier to distinguish critical alarms from minor or false alarms, Bliss says.
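The multi-sign smoke alarm Bliss describes amounts to a simple voting rule: no single sensor can trip the alarm on its own. A minimal sketch in Python, with illustrative thresholds that do not come from any real detector:

```python
# Hypothetical sketch of a multi-signal alarm: require at least two of
# three independent warning signs before sounding, so that one noisy
# sensor (e.g., smoke from burnt toast) cannot trigger a false alarm.
# All threshold values are made up for illustration.

def should_alarm(smoke_level, light_obscuration, temperature_c,
                 smoke_threshold=0.08, light_threshold=0.15,
                 temp_threshold=57.0):
    """Return True only when at least two warning signs agree."""
    signs = [
        smoke_level >= smoke_threshold,
        light_obscuration >= light_threshold,
        temperature_c >= temp_threshold,
    ]
    return sum(signs) >= 2

# Smoke alone is not enough; smoke plus heat is.
print(should_alarm(0.10, 0.01, 25.0))  # smoke only -> False
print(should_alarm(0.10, 0.01, 70.0))  # smoke and heat -> True
```

The trade-off is that demanding agreement can also delay a true alarm slightly, which is why the thresholds for each individual sign can be set more sensitively than they could be in a single-sensor design.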

Problem: Everyday applications allow themselves to become overwhelmed with alerts.

Resolution: Identify where internal applications have gone overboard and sacrificed usability.

Excessive pop-up alerts and interruptions such as legal disclaimers used to be a problem with consumer Web sites, says Bruce "Tog" Tognazzini, a computer-human interaction expert and a principal of the Nielsen Norman Group. He notes that Web retailers who didn't learn to streamline their sites went out of business. But these problems are still prevalent in many business applications for a company's internal users, he says: "The feedback loop is not there the way it is in retail."

Problem: Consulting company lawyers leads to more alerts.

Resolution: Appeal to a higher authority, even the CEO, to weed out lawyer-mandated alerts.

Lawyers tend to err on the side of too many alerts, Tognazzini says. Sometimes the lawyers have legitimate concerns about, for example, the liability of failing to warn users about the limitations of a system's security, reliability or accuracy. But do users need to click to acknowledge this warning every time they log on, or only the first time?

If you have to appeal to a higher authority, Tognazzini suggests translating the impact into dollars. "Executives tend to care about money," he says. For example, time wasted on excessive alarms can be translated into lost productivity and, ultimately, into dollars.

Problem: Critical systems often tolerate a high ratio of false warnings to legitimate threats.

Resolution: Even "unreliable" systems can be useful, according to Mark St. John, a human factors expert at Pacific Design and Engineering Group of San Diego. St. John says that the air warfare systems he is designing for the U.S. Navy detect a large number of potential threats that turn out to be false alarms. But rather than sound blaring alarms, the system color-codes targets on an operator's radar screen based on the probability that they may be real threats. "The system is telling you, 'I'm just dumb automation, but I can tell you when things are likely,'" St. John says.
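St. John's approach can be thought of as mapping a threat probability to a display urgency rather than to a binary alarm. A rough sketch with made-up thresholds and colors, not the Navy system's actual logic:

```python
# Hypothetical sketch of graded alerting: color-code each target by
# the estimated probability that it is a real threat, instead of
# sounding a binary alarm. Thresholds and colors are illustrative.

def threat_color(probability):
    """Map a threat probability in [0, 1] to a display color."""
    if probability >= 0.8:
        return "red"      # likely threat: demands operator attention
    if probability >= 0.4:
        return "yellow"   # possible threat: worth monitoring
    return "white"        # probably benign: stays on screen, no alarm

print(threat_color(0.9))   # red
print(threat_color(0.5))   # yellow
print(threat_color(0.1))   # white
```

Because nothing blares, the many low-probability contacts never train the operator to tune the system out, yet they remain visible if their estimated probability later rises.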

David F. Carr is the Technology Editor for Baseline Magazine, a Ziff Davis publication focused on information technology and its management, with an emphasis on measurable, bottom-line results. He wrote two of Baseline's cover stories focused on the role of technology in disaster recovery, one focused on the response to the tsunami in Indonesia and another on the City of New Orleans after Hurricane Katrina. David has been the author or co-author of many Baseline Case Dissections on corporate technology successes and failures (such as the role of Kmart's inept supply chain implementation in its decline versus Wal-Mart, or the successful use of technology to create new market opportunities for office furniture maker Herman Miller). He has also written about the FAA's halting attempts to modernize air traffic control, and in 2003 he traveled to Sierra Leone and Liberia to report on the role of technology in United Nations peacekeeping. David joined Baseline prior to the launch of the magazine in 2001 and helped define popular elements of the magazine such as Gotcha!, which offers cautionary tales about technology pitfalls and how to avoid them.