Gotcha! Downsides of Using Mainframe Databases

By David F. Carr  |  Posted 2002-08-09

What's the outlook for IBM's IMS and Computer Associates' IDMS?


Though commonly dismissed as legacy systems, IMS and IDMS databases aren't easily discarded

IBM's IMS and Computer Associates' IDMS are legacy technologies in the sense that their roots stretch back to before relational databases and the Structured Query Language (SQL) became corporate standards. IMS is a hierarchical database, and IDMS uses the network database model.

But there are also strong reasons to avoid migration to a relational database, or at least to postpone it. These legacy databases are battle-hardened survivors whose dependability and performance count in their favor. Most of the applications currently running on IMS or IDMS have stayed with these technologies for a reason: typically, they are high-performance transaction-processing applications that demand mainframe power. Usually, the alternative is IBM's mainframe version of DB2, rather than a relational database on Windows or Unix.

Humana CIO Bruce Goodman says health insurance claims-processing is exactly that kind of application. "It took 25 or 30 years to get mainframes to where they were as reliable as they are today," he says. He is gradually migrating toward a DB2-based mainframe platform while continuing to maintain older applications that rely on IDMS.

Peter Armstrong, an IMS expert who serves as director of corporate and technology strategy at BMC, says his company doesn't take sides on the issue of whether to convert. Personally, he sees reasons not to. "A customer in Australia told me it would take them three years to convert. I visited every year, and it was always another three years," he says. Many IMS users operate on the principle of "if it works, leave it alone."

Converts must be prepared for trade-offs

As a rule of thumb, IMS or IDMS applications that are converted to run on DB2 will require at least double the computing horsepower, meaning a substantial mainframe upgrade. The manpower required also tends to be higher.

John Wheeler, an IDMS consultant who says he has been working with the technology "since the day Richard Nixon resigned," tells of one client that wound up needing four times the database administration staff after conversion. Another client was achieving a response time of 1 to 2 seconds on a critical application using IDMS, but the best DB2 could manage was 25 seconds—a difference that killed the migration project. Wheeler recommends using a carefully designed pilot project to identify such problems early.

Questions cloud the future of IDMS

IDMS users express doubt about how much energy Computer Associates will invest in continuing to modernize and support the technology (IMS users don't seem to have the same doubts about IBM). The fear is that an eroding customer base will cause CA to further scale back. On the other hand, IDMS users say these predictions of doom have been in the air for a long time—and the technology hasn't gone away.

Differences in underlying database structure do matter

Relational databases are organized around a mathematical theory, the relational model, that aims to maximize flexibility: any table can be queried or joined on any column, with no predefined access path.
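That flexibility can be sketched with a small example: an ad hoc question the schema was never specifically designed to answer can still be posed as a join. This is an illustration only, using Python's built-in sqlite3 module and hypothetical claims tables, not any actual insurer's schema.

```python
import sqlite3

# In-memory relational sketch with hypothetical member/claim tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE members (id INTEGER PRIMARY KEY, name TEXT, state TEXT);
CREATE TABLE claims  (id INTEGER PRIMARY KEY, member_id INTEGER,
                      amount REAL,
                      FOREIGN KEY (member_id) REFERENCES members(id));
""")
con.executemany("INSERT INTO members VALUES (?, ?, ?)",
                [(1, "Ada", "KY"), (2, "Bob", "OH")])
con.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                [(10, 1, 120.0), (11, 1, 80.0), (12, 2, 40.0)])

# Ad hoc query: nothing in the schema anticipated "claim totals by state",
# yet the engine can answer it by joining on any shared column.
rows = con.execute("""
    SELECT m.state, SUM(c.amount)
    FROM members m JOIN claims c ON c.member_id = m.id
    GROUP BY m.state ORDER BY m.state
""").fetchall()
print(rows)  # [('KY', 200.0), ('OH', 40.0)]
```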

There wasn't as much theory behind IMS, which put a premium on performance over flexibility. Its hierarchical approach puts every item of data in an inverted-tree structure, extending downward in a series of parent-child relationships. This provides a high-performance path to a given bit of data.
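A minimal sketch of that hierarchical idea, in Python rather than IMS's actual DL/I interface, with invented segment names: each record lives under exactly one parent, so a read that knows its full path is a short hop down the tree, while an unplanned question forces a walk of the whole structure.

```python
# Hierarchical sketch: every segment has one parent, forming an
# inverted tree. Names and fields are illustrative, not real IMS segments.
db = {
    "HOSPITAL-A": {                       # root segment
        "WARD-1": {"PATIENT-7": {"dob": "1950-03-01"},
                   "PATIENT-9": {"dob": "1962-11-20"}},
        "WARD-2": {"PATIENT-4": {"dob": "1978-06-15"}},
    }
}

def get(path):
    """Follow a single parent-to-child path: one hop per tree level."""
    node = db
    for key in path:
        node = node[key]
    return node

# Fast when the full root-to-leaf path is known in advance...
print(get(["HOSPITAL-A", "WARD-1", "PATIENT-9"])["dob"])  # 1962-11-20

# ...but an unplanned question ("all patients born after 1970")
# has no direct path and must traverse the entire tree:
late = [p for wards in db.values() for ward in wards.values()
        for p, rec in ward.items() if rec["dob"] > "1970-01-01"]
print(late)  # ['PATIENT-4']
```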

The IDMS network database model allows for more complex, overlapping hierarchies. IDMS can also mimic a relational database, although some of this functionality relies on an add-on product, IDMS Server.
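The network model's overlapping hierarchies can be sketched the same way, again as an illustration rather than IDMS's actual CODASYL interface, with hypothetical record names: a record can be a member of more than one owner "set," so the same claim is reachable from its member and from its provider without being stored twice.

```python
# Network-model sketch: records linked into multiple owner "sets,"
# so one record hangs under two overlapping hierarchies.
# Record and set names are hypothetical, not real IDMS record types.
claims = {"C-10": {"amount": 120.0}, "C-11": {"amount": 80.0}}

# Two owner sets pointing at the same claim records:
member_claims   = {"MEMBER-1":   ["C-10", "C-11"]}
provider_claims = {"PROVIDER-A": ["C-10"], "PROVIDER-B": ["C-11"]}

def walk(owner_set, owner):
    """Follow an owner's set occurrence to its member records."""
    return [claims[c]["amount"] for c in owner_set.get(owner, [])]

# The same claims, reached along two different paths:
print(walk(member_claims, "MEMBER-1"))      # [120.0, 80.0]
print(walk(provider_claims, "PROVIDER-B"))  # [80.0]
```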

While middleware may mask the differences, IMS and IDMS users say it's important to understand that applications based on these technologies have been subjected to years of optimization according to different principles than those that govern DB2—work that tends to be lost in a conversion.

David F. Carr is the Technology Editor for Baseline Magazine, a Ziff Davis publication focused on information technology and its management, with an emphasis on measurable, bottom-line results. He wrote two of Baseline's cover stories on the role of technology in disaster recovery, one on the response to the tsunami in Indonesia and another on the City of New Orleans after Hurricane Katrina.

David has been the author or co-author of many Baseline Case Dissections on corporate technology successes and failures (such as the role of Kmart's inept supply chain implementation in its decline versus Wal-Mart, or the successful use of technology to create new market opportunities for office furniture maker Herman Miller). He has also written about the FAA's halting attempts to modernize air traffic control, and in 2003 he traveled to Sierra Leone and Liberia to report on the role of technology in United Nations peacekeeping.

David joined Baseline prior to the launch of the magazine in 2001 and helped define popular elements of the magazine such as Gotcha!, which offers cautionary tales about technology pitfalls and how to avoid them.