The Probability Problem
By Regina Kwon | Posted 2001-12-10
It's Monday morning, and the chief operating officer calls you in for advice.
Can your developers build a standalone application within six weeks, the latest date that would make the project worthwhile?
Numerous cognitive research studies have found that people are consistently overconfident in their judgment by about 20%, putting the likelihood you'll deliver that finance application within six weeks at about 70%. And if you're a manager, your estimate is probably more than twice as generous: In a study of 1,000 American and European managers, business school professors J. Edward Russo and Paul J. H. Schoemaker found that group to be correct only 30% to 60% of the time.
Most risk models rely on estimates made by subject-matter experts, says Douglas Hubbard, the inventor of Applied Information Economics (AIE), a methodology for calculating return on IT projects. Overconfidence can cost a company large sums and create bad blood between business and information technology. To correct for this, Hubbard starts each engagement with a workshop in which he "calibrates" participants to appropriate confidence levels.
A typical calibration workshop consists of a half-day of general-knowledge questions, in which participants provide answers or estimates along with their confidence that the answers they gave are correct.
"After a few sessions, people get a sense of what a 90% confidence level actually feels like," says Hubbard, who recently conducted workshops for research firm Giga Information Group and consulting company Booz Allen Hamilton. "They stop using 90% so much, but when they do, they're usually right."
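The scoring behind such a workshop is straightforward: group answers by the confidence the participant stated, then compare each group's stated confidence against its actual hit rate. The sketch below illustrates the idea; the function name and data format are hypothetical, not part of Hubbard's actual methodology.

```python
from collections import defaultdict

def calibration_report(responses):
    """responses: list of (stated_confidence, was_correct) pairs.

    Returns a dict mapping each stated confidence level to the
    observed fraction of correct answers at that level.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for confidence, correct in responses:
        totals[confidence] += 1
        hits[confidence] += int(correct)
    return {c: hits[c] / totals[c] for c in totals}

# Illustrative workshop results: 10 answers given at "90% sure",
# of which only 7 were right -- the overconfidence pattern the
# article describes.
answers = ([(0.9, True)] * 7 + [(0.9, False)] * 3 +
           [(0.5, True)] * 5 + [(0.5, False)] * 5)
print(calibration_report(answers))  # {0.9: 0.7, 0.5: 0.5}
```

A well-calibrated participant's "90% confident" answers would be right about nine times in ten; the 0.7 hit rate above is the gap the workshop trains people to close.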
For basics on calibration, read Russo and Schoemaker's Decision Traps. For more about AIE and calibration workshops, visit www.hubbardresearch.com.