Primer: 64-Bit Processing

By David F. Carr  |  Posted 2003-09-10

This isn't new. Digital Equipment Corp. came out with the first 64-bit processor, the Alpha, in 1992, and Hewlett-Packard, IBM and Sun Microsystems soon followed. What's new is the idea that "industry standard" computers, not just Unix boxes, may process information in 64-bit chunks as well.

What changed? In 2001, Intel released its Itanium 64-bit processor, prompting Microsoft to get serious about producing a 64-bit version of Windows. Other significant new 64-bit processors include AMD's Opteron and IBM's Power4.

How does it compare with 32-bit? By definition, a 64-bit processor can process twice as many bits (1s and 0s) of data as a 32-bit chip can in the same number of compute cycles. But because it's the number of possible combinations of bits that really matters, the gain is far more than double. Thirty-two bits works out to 4,294,967,296 possible combinations, which means 4 GB of data can be addressed. Sixty-four bits raises the roof to 18,446,744,073,709,551,616, or 16 exabytes, which is more than 16 million terabytes.
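To make that arithmetic concrete, here is a minimal C sketch that prints the two address-space sizes; the program and its rounded conversions are illustrative only, not taken from any vendor's documentation.

/* Minimal sketch: the 32-bit vs. 64-bit address-space arithmetic. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t addr32 = (uint64_t)1 << 32;  /* 2^32 addresses: 4,294,967,296, i.e. 4 GB */
    uint64_t addr64 = UINT64_MAX;         /* 2^64 - 1, the largest value a 64-bit register can hold */

    printf("32-bit address space: %llu bytes (about 4 GB)\n", (unsigned long long)addr32);
    printf("64-bit address space: about %llu bytes (about 16 exabytes)\n", (unsigned long long)addr64);
    return 0;
}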

Why would my company need it? As businesses grow more dependent on complex data analysis for standard operations, fast processors and plenty of addressable memory become increasingly necessary. Take a supply-chain-optimization application. Such a program might require sorting through every combination of inventory item and stocking location. When thousands of items and locations are involved, the working set can outgrow the memory a 32-bit system can address, forcing data onto disk and making analysis impossibly slow. And while 64-bit is overkill for most desktop users, it may be necessary for graphic artists and animators, something Apple probably had in mind with the G5 processor, a Power4 variant.
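For a rough sense of how quickly such a problem outgrows a 32-bit machine, here is a hypothetical sizing sketch in C; the item and location counts are invented for illustration.

/* Hypothetical sizing exercise: one 8-byte value per item/location pair. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t items     = 100000;  /* invented count of inventory items */
    uint64_t locations = 10000;   /* invented count of stocking locations */
    uint64_t bytes     = items * locations * sizeof(double);

    printf("Item-by-location matrix: %.1f GB\n", bytes / (1024.0 * 1024 * 1024));
    printf("Fits in a 32-bit address space (4 GB)? %s\n",
           bytes < ((uint64_t)1 << 32) ? "yes" : "no");
    return 0;
}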

What are the issues? Having a 64-bit chip doesn't automatically make a computer faster. The theoretical limit may be enormous, but subsystems such as the bus that moves data into and out of memory impose a practical limit. For example, Apple says the 64-bit G5 should eventually support up to 4 TB of memory, but today's systems top out at 8 GB.

A 64-bit chip also requires that the operating system and the applications that run on top of it be recompiled for 64-bit processing. The Itanium does allow 32-bit applications to run unchanged, but with a hit on performance; in fact, 32-bit apps may actually run slower on Itanium than they would on a 32-bit Intel Xeon. AMD took a more evolutionary approach with its Opteron processor, which extends a 32-bit design to handle 64-bit operations, so unmodified 32-bit applications don't suffer the performance penalty they take on Itanium.
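Recompiling is rarely just a matter of flipping a compiler switch. A classic porting pitfall, sketched below in C with invented names, is code that stashes a pointer in a 32-bit integer: it works when pointers are 4 bytes but silently truncates addresses once they grow to 8.

/* Illustrative porting pitfall when moving from 32-bit to 64-bit builds. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    int value = 42;
    int *p = &value;

    /* Unsafe on 64-bit targets: the upper 32 bits of the address are lost. */
    uint32_t truncated = (uint32_t)(uintptr_t)p;

    /* Portable: uintptr_t is defined to be wide enough to hold a pointer. */
    uintptr_t safe = (uintptr_t)p;

    printf("sizeof(void *) = %zu bytes\n", sizeof(void *));
    printf("truncated: 0x%08x, full: 0x%llx\n",
           (unsigned)truncated, (unsigned long long)safe);
    return 0;
}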

How much 64-bit software is available? Unix operating systems like Sun Solaris, HP-UX and IBM AIX have been running on 64-bit processors for years, so Unix software vendors supply 64-bit versions as a matter of course. Windows Server 2003 is the first Microsoft operating system to fully support 64-bit computing (on Itanium). Linux implementations for Itanium also are beginning to appear. Commercial software applications are following slowly. For example, i2 Technologies' Supply Chain Planner for 64-bit Windows on Itanium 2 is currently supported for pilot projects only.

How does Itanium compare with the Xeon? The 32-bit Xeon will almost certainly remain the default Windows server processor for years to come because of its support for existing applications and the fact that many applications don't really need 64-bit power.

However, Itanium allows Windows to break into territory that used to be exclusive to Unix servers, such as large-scale graphics or simulation applications that require a single process to work with more than 4 GB of memory at a time.
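One quick way to see that 4 GB ceiling is to attempt a single large allocation; the sketch below is illustrative only and assumes a machine with enough physical memory or swap to honor the request.

/* Illustrative: a single allocation larger than the 32-bit limit. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(void) {
    unsigned long long request = 5ULL * 1024 * 1024 * 1024;  /* 5 GB */

    if (request > SIZE_MAX) {
        /* On a 32-bit build SIZE_MAX is about 4 GB, so the request is refused here. */
        printf("A 5 GB allocation cannot even be expressed on this build.\n");
        return 1;
    }

    char *buffer = malloc((size_t)request);  /* may still fail if memory is short */
    printf("Requested 5 GB: %s\n", buffer ? "allocated" : "failed");
    free(buffer);
    return 0;
}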

Most Xeon-based servers are limited to eight processors; Itanium can support up to 32.
