Bad Software Can “Enronize” Anyone

“Kenny Boy” Lay and his cohorts at Enron sold the world on a badly flawed company, one whose innards were twisted beyond reclamation yet somehow beyond their responsibility. Or so they would claim.

So, in the extreme, can it be with companies producing defective computer software.

What is the difference between peddling a stock and peddling a product, each touted as having value when, in reality, insiders know that what is being sold is unreliable, unsound and unworthy?

There is no ultimate arbiter of the quality of software, except for the experience of the buyer. But the National Infrastructure Protection Center keeps a list of “high risk” software for which there is “no workaround or patch” for an identified flaw. On the list are products from little-known companies such as Avirt, Ball Crew, Black Moon, CDRDAO, Chinput, Clanlib and Enlightenment. Also appearing on the list are Microsoft (nine times), Oracle (four times) and Cisco Systems. And this is a partial list from one week, Jan. 28, 2002.

This doesn’t mean any of these companies are rotten, as Enron proved to be. But the risks have to be removed, if corporations are to buy and use software with the kind of reliability that will allow them to concentrate on the valid mission of “creative destruction” of rivals.

To its credit (although seemingly late in the game), Microsoft is working with a quartet of computer security firms to develop an appropriate policy for providing “full disclosure” of flaws to potential corporate buyers. A limited process, closed to outsiders, would then come up with a method for dealing with those flaws.

The hope is that by disclosing flaws only within a closed community, the risks to that community will be reduced: only those who need to know about flaws will know about them. The reasoning is that full public disclosure can have the adverse effect of “providing blueprints for building (software) weapons [i.e. viruses, worms, etc.].”

The problem here is that limited release of information about flaws may not actually protect companies from those “weapons.” Systems managers, who have neither the ken nor the interest to go hunting for holes in products they just want to use, aren’t likely to be an effective pool of testers when it comes to security flaws. They aren’t practiced hackers.

Chicago computer security consultant William Cummins, for instance, believes corporations need to tap into the hacker underground more, rather than less. He contends the flaws and fixes that do get publicized are old. The useful ones are kept private among hackers. Nonetheless, he says, there are literally thousands of exploits floating around in the “wild,” discussed and exchanged, without security consultants, federal agents, or information systems staff having the slightest clue.

One solution: Put someone on your information technology staff who has a clue about what is happening in the computer programming underground, so your company is not just listening to the feds who generically warn that hackers are causing the computer-sky to fall.

Second solution: Assign personnel on your staff to constantly attempt to uncover flaws and expose holes in programs your company wants to use. Participate with other companies in joint vulnerability tests.
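
To make the second solution concrete: one crude form such probing can take is a simple fuzz test, which feeds a program malformed input and watches for crashes or hangs. The sketch below is only an illustration; the target command ("./parser_under_test"), the input sizes and the iteration count are placeholder assumptions, not references to any real product.

```python
import random
import subprocess

def fuzz_once(target_cmd, max_len=4096):
    """Feed one blob of random bytes to the target's stdin and flag a crash or hang.

    `target_cmd` (e.g. ["./parser_under_test"]) is a placeholder for whatever
    program your staff has been assigned to probe; adjust as needed.
    """
    blob = bytes(random.getrandbits(8) for _ in range(random.randint(1, max_len)))
    try:
        proc = subprocess.run(target_cmd, input=blob, capture_output=True, timeout=5)
    except subprocess.TimeoutExpired:
        return True, blob  # a hang is also a hole worth reporting
    # On POSIX, a negative return code means the process died on a signal
    # (segfault, abort, etc.) -- exactly the kind of flaw to document and report.
    return proc.returncode < 0, blob

if __name__ == "__main__":
    for i in range(1000):
        crashed, sample = fuzz_once(["./parser_under_test"])
        if crashed:
            with open(f"crash_{i}.bin", "wb") as f:
                f.write(sample)
            print(f"Crash or hang on iteration {i}; input saved to crash_{i}.bin")
```

Saving the offending input alongside each report gives vendors (or your own developers) a reproducible test case rather than a vague complaint.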

Third solution: Post a reward—in cash or in computing equipment—inside your company to encourage staff to find software flaws. You will improve company morale as well as the security of the business. Make the results public (after the problem is both identified and fixed). You can’t buy better publicity.

Fourth solution: If you purchase software by the hundreds or thousands of seats, insist that your in-house legal talent find a “workaround” for those crippling “shrink-wrap” provisions that prevent any financial redress for bugs or holes in the software you purchase.

“Full disclosure” (which will actually amount to limited public disclosure) will work only if companies attack pre-release and post-release software with as much vigor as the hackers do.