Alan Kay: Beyond The Box

By Allan Alter

Alan Kay believes a quantum leap in personal computing is long overdue. But it will require a whole new way of thinking.

Alan Kay is not a fan of the personal computer, though he did as much as anyone to create it. A winner of the Turing Award, Kay led the Xerox PARC group that pioneered object-oriented programming and the graphical user interface, and he contributed to early work on 3D computer graphics and the ARPANET, the predecessor of the Internet. After helping to create the Alto, the Xerox PARC PC prototype that inspired the Apple Macintosh, he became chief scientist at Atari and later a fellow at Apple Computer, Walt Disney Co. and Hewlett-Packard Co.

While most people regard the personal computer as a modern miracle, Kay sees the PC as a chronic underachiever. To him it's an invention that, like television, has fallen far short of the potential foreseen by its early proponents. Today, at age 66, Kay runs the Viewpoints Research Institute, his own nonprofit research organization in Glendale, Calif. He is busy with several projects involving education and technology, including the "One Laptop per Child" project overseen by MIT's Nicholas Negroponte, which Kay hopes will one day transform the PC into a machine that not only changes the way we work, communicate and entertain ourselves, but also improves how people—especially children—learn and think.
Kay believes the limitations of the PC are due as much to a lack of imagination and curiosity on the part of computer scientists, the unwillingness of users to invest effort in using computers, and the deadening impact of popular culture as to technical constraints. He says the push to make PCs easy to use has also made them less useful; their popularity has stunted their potential. Executive Editor Allan Alter spoke with Kay about the future of the PC. The following is an edited version of their discussion.

CIO Insight: Do you feel PCs and Macs have come close to reaching their potential yet?

Kay: No, I don't think so. Computers are mostly used for static media, basically text, pictures, movies, music and so forth. The Internet is used as a distribution network, so computers are essentially players for this media. This is incredibly useful, but it tends to overwhelm uses that require a much longer learning curve.

When I started in computing in the early sixties, people realized that while the computer could simulate things we understood very well, one of its greatest uses was simulating things that we didn't understand as well as we needed to. This has happened in the sciences; physicists, chemists, biologists and other scientists could not do what they've been doing if they didn't have powerful computer simulations to go beyond what classical mathematics could do. But it's the rare person who quests for knowledge and understanding.
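Kay's point about simulating things we don't fully understand can be illustrated with a minimal sketch (my example, not from the interview): the logistic map, a one-line dynamical system that has no closed-form solution for most parameter values, so iteration on a computer is the practical way to study its behavior.

```python
def logistic_map(r, x0, steps):
    """Iterate x_{n+1} = r * x * (1 - x).

    For most values of r in the chaotic regime there is no closed-form
    expression for x_n, so numerical simulation stands in for classical
    mathematics -- the kind of use Kay describes.
    """
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two nearly identical starting points diverge sharply at r = 3.9,
# a hallmark of chaos that only shows up when you actually run the system.
a = logistic_map(3.9, 0.200000, 50)
b = logistic_map(3.9, 0.200001, 50)
print(max(abs(p - q) for p, q in zip(a, b)))
```

Even this toy system makes the point: the trajectory is easy to compute and hard to predict, which is why the sciences lean on simulation where pencil-and-paper analysis gives out.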


This article was originally published on 2007-02-01
Executive Editor

Allan Alter has been a specialist in information technology management, strategy and leadership for many years. Most recently, he was editor-in-chief and director of new content development for the MIT Sloan Management Review. He has been a columnist and department editor at Computerworld, where he won three awards from the American Society of Business Press Editors. Previously he was a special projects editor, senior editor and senior writer for CIO magazine. Earlier, Alter was an associate editor for Mass High Tech. He has edited two books: The Squandered Computer: A Practical Guide to Evaluating the Business Alignment of Information Technologies and Redesigning the Firm.
