5 to Watch as Baseline Turns Five

As this issue marks the fifth anniversary of Baseline, there’s always the temptation to look back and marvel at how far we have come since 2001. For example, we all knew five years ago that the Internet was going to change everything, but very few of us understood just how much.

But as exciting as the last five years have been, it’s pretty clear that the next five are going to be driven by continuing waves of innovation that will fundamentally alter every aspect of computing as we know it today.

So, with that in mind, let’s identify five key emerging technologies that could lead to a doubling of productivity within the next five years.

Multicore processors. At a time when most I.T. people are still wondering what to do with the dual-core processors that are starting to show up in PCs, the prospect of multicore systems may be mind-boggling. When you look at Windows Vista or the latest operating systems from Apple, it’s pretty clear that we have only begun to tap what can be done in terms of visualization on PCs. In fact, applications such as facial recognition and avatars with rich sets of personality traits are likely to become commonplace across the Web.
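To make the multicore question a little more concrete, here is a minimal sketch, assuming Python and an invented per-frame workload, of the basic pattern involved: a CPU-bound job divided across one worker process per core, so throughput scales with the number of cores rather than with clock speed. None of the names below come from any particular product.

```python
# Minimal sketch: spread a CPU-bound job across all available cores.
# The workload (score_frame) is a stand-in for something like a
# facial-recognition pass; it is invented for illustration only.
from multiprocessing import Pool, cpu_count

def score_frame(frame_id):
    # Burn some CPU per "frame" and return a small result.
    return sum(i * i for i in range(frame_id * 1000)) % 997

if __name__ == "__main__":
    frames = list(range(1, 65))
    with Pool(processes=cpu_count()) as pool:   # one worker per core
        scores = pool.map(score_frame, frames)  # frames processed in parallel
    print(f"processed {len(scores)} frames on {cpu_count()} cores")
```

The point of the pattern is simply that work divided this way gets faster as more cores show up, which is what will separate software that exploits multicore chips from software that merely runs on them.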

Proactive security. Although malware and other types of security threats now consume an inordinate amount of time and resources, the vast majority of the security issues we deal with today stem from design failures going back 10 years or more. Most of the blame for this can be laid at the feet of Intel and Microsoft.

But the industry as a whole is to be applauded for developing a range of proactive security tools and, more important, new security hardware standards. Although it will take time for these standards to be adopted across the range of hardware devices we use today, the end result should be that no piece of code will run on any system without the express permission of that system’s owner. In addition, software vendors should have embraced virtualization to the point where any piece of offending malware can be quickly isolated, and the machine running it automatically rolled back to its previous pristine state.
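To put that permission model in concrete terms, here is a minimal sketch, assuming Python and hypothetical file hashes, of an owner-controlled allowlist: nothing executes unless its fingerprint has been explicitly approved. A real implementation would live in hardware and the operating system rather than in a script, and the rollback piece would be handled separately by the virtualization layer.

```python
# Minimal sketch of "no code runs without the owner's permission":
# check an executable's SHA-256 digest against an owner-approved
# allowlist before it is allowed to launch. The hash below is hypothetical.
import hashlib

APPROVED_HASHES = {
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def may_execute(path):
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in APPROVED_HASHES  # anything unlisted is simply refused
```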

Always connected. There’s a lot of debate today over which wireless technology will ultimately dominate, with carriers favoring a variety of third-generation technologies while Intel champions 802.11, WiMax and, eventually, mesh networks. One school of thought says those data networks will be so pervasive that 3G technologies won’t be necessary, while the carriers argue that the instability of such networks means there will always be demand for their services.

Five years from now this conversation should be moot, however, because most PCs and handheld devices will automatically be able to detect which wireless services are available. The device will then pick the most economically efficient connection on offer, which means we’ll all be connected via a fat pipe anytime, anywhere.
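A minimal sketch of that selection logic, assuming Python and entirely made-up networks and prices, might look like the following: the device surveys what it can see and picks the cheapest option that meets a minimum bandwidth requirement.

```python
# Minimal sketch of picking the most economically efficient connection.
# Network names, speeds and prices are invented for illustration.
AVAILABLE_NETWORKS = [
    {"name": "office Wi-Fi", "mbps": 54.0, "cost_per_mb": 0.000},
    {"name": "metro WiMax",  "mbps": 10.0, "cost_per_mb": 0.002},
    {"name": "3G carrier",   "mbps": 1.5,  "cost_per_mb": 0.010},
]

def pick_network(networks, min_mbps=1.0):
    usable = [n for n in networks if n["mbps"] >= min_mbps]
    return min(usable, key=lambda n: n["cost_per_mb"]) if usable else None

print(pick_network(AVAILABLE_NETWORKS)["name"])  # -> office Wi-Fi
```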

Virtual applications. Virtual operating systems and storage may be all the rage today, but the ultimate goal is to wrest control of our data away from applications, databases and file systems. What’s driving this trend is not only the need to keep any one software vendor from holding an I.T. group hostage, but also the need to maximize a company’s most important asset—its data.

As we look downstream following the adoption of service-oriented architecture, it’s pretty clear that the places we store data today are rapidly becoming open containers that will morph into commodities. This means the value proposition around software will shift from where data is stored to where and how it is used.

We’re seeing the early stages of this trend in the form of composite applications in the enterprise and mashups on the Web. Today, composite applications are still relatively difficult to build, but emerging development techniques such as Ajax, along with presentation-layer technologies from companies such as Citrix and AppStream, will make it possible to combine data sets on demand with a minimal amount of programming.
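As a rough sketch of what combining data sets on demand looks like, consider two invented sources joined at request time rather than inside either system of record; the sources, fields and figures below are illustrative only, written in Python for brevity.

```python
# Minimal sketch of a composite application: a customer profile and an
# order feed, owned by different systems, merged only when a view is built.
customers = {"C100": {"name": "Acme Corp", "region": "East"}}
orders = [
    {"customer": "C100", "amount": 1200},
    {"customer": "C100", "amount": 450},
]

def customer_view(customer_id):
    profile = customers[customer_id]
    total = sum(o["amount"] for o in orders if o["customer"] == customer_id)
    return {**profile, "open_orders": total}

print(customer_view("C100"))  # combined record, built without touching either source
```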

Managed knowledge. As blogging goes mainstream—driven by tools from vendors such as iUpload and KnowNow—companies will discover that they are actually building organic, self-managing knowledge management systems, especially when those tools are coupled with enterprise search and business intelligence applications. This means that, for the first time, companies might actually be able to figure out in real time who within their organizations knows what, and when.
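A minimal sketch of the who-knows-what idea, assuming Python and a few invented posts standing in for an internal blog archive, might look like this: index every post by author and keyword, then answer expertise queries against the index. A real deployment would lean on enterprise search rather than a hand-rolled index.

```python
# Minimal sketch of expertise lookup over internal blog posts.
# The posts and authors are invented for illustration.
from collections import defaultdict

posts = [
    {"author": "lee",   "text": "notes on tuning the data warehouse"},
    {"author": "maria", "text": "rolling out WiMax trials in the field"},
    {"author": "lee",   "text": "warehouse capacity planning for Q3"},
]

index = defaultdict(set)
for post in posts:
    for word in post["text"].lower().split():
        index[word].add(post["author"])

def who_knows(topic):
    return sorted(index.get(topic.lower(), set()))

print(who_knows("warehouse"))  # -> ['lee']
```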

As great as the last five years have been, the really good news is that they simply won’t compare to the next five.

Michael Vizard is editorial director at Ziff Davis Media’s enterprise technology group. He can be reached at [email protected].