If It's Automated, It Must Be Accurate
Nearly a quarter-century ago, IBM defined and standardized the personal computer. More than 40 years ago, the company defined and standardized the commercial computer as well.
Dissatisfaction with the computing experience remains fairly high. Personal machines still crash too often, are subject to too many security holes, and force users to puzzle out such inanities as what an F7 key does. You can guess why the letter "F" was chosen for all those cryptic keys at the top of your keyboard.
Corporate systems, for their part, crash too often, are flooded with useless data, worms, viruses and hostile requests, and suffer from too much complexity. Lots of bad data gets input; lots of bad "information" appears in output. This is the long-standing GIGO principle: Garbage in, garbage out.
Efforts to check data can be next to nonexistent, as demonstrated in last month's cover story, "Blur." And efforts to resolve incongruities in centralized systems are tedious, as I can attest. Every airline in this country chooses to spell my name differently, and some do it more than one way, depending on how many databases they keep.
That said, it's easy to forget how much change, mostly for the better, has come in commercial computing.
Which takes us back to the introduction of the IBM 360, the machine Reader's Digest managers insisted on in this month's cover story, "The Longest Goodbye."
Here's how primitive computers were when the 360 was announced in April 1964, according to Mike Kahn, managing director of The Clipper Group in Wellesley, Mass.
There wasn't even agreement on what constituted a single character of digital information. Some machines used six bits (ones or zeroes) to form a byte. Some used seven. IBM insisted on eight, to pack the most information into a single character.
IBM had faith that processing power and storage space would only increase, making it senseless to limit characters to 64 values (2 to the power of 6) when each could represent 256 (2 to the power of 8).
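The arithmetic behind IBM's bet is simple doubling: each extra bit doubles the number of values a character can hold. A quick sketch (my illustration, not anything from the era) makes the 64-versus-256 gap plain:

```python
# Distinct values representable by each byte width discussed above.
for bits in (6, 7, 8):
    print(f"{bits}-bit byte: {2 ** bits} possible characters")
# A 6-bit byte yields 64 values; an 8-bit byte yields 256,
# quadrupling the room for letters, digits and symbols.
```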
Setting the standard in corporate computing was a bet-the-business proposition for IBM. It spent $5 billion on the launch of the 360. At the time, 1963, its annual revenue was $2.1 billion and its profit $290.5 million. The budget in 1963 for the National Aeronautics and Space Administration, about to send someone to the moon, was $2.5 billion.
What has not changed since then? Humans.
Programs still have bugs in them because humans make mistakes. New applications fail to take root because individuals (organizations are just societies of individuals) resist change. Training is often negligible or nonexistent.
Harvard Business School professor F. Warren McFarlan contends, in fact, that computing is not so much about math as it is a field of "applied human psychology."
Now, in 2005, what has really changed is the psychology of computing. Even as unreliability and insecurity persist, companies believe that the information, analysis and results that computers produce are correct. As one Philadelphia CIO puts it, "If it's automated, it must be accurate."
That is what Baseline's technology editor, David F. Carr, calls the new GIGO principle: Garbage in, gospel out.
Not so fast. You can be in awe of how far computing has come since the advent of the IBM 360. But here's a technical standard to live by, even now: Never accept the results of computing with blind faith.