Disaster Recovery Software: Make a Copy, Stay in Business
By Brian P. Watson | Posted 2006-08-07
It could be a looming threat, like a powerful hurricane. Or an unexpected one, like a fire in the data center. When disaster strikes, data replication tools can help a business survive.
Sept. 11, 2001, provided a wake-up call for both information-technology managers and corporate executives who hadn't invested in disaster recovery processes like data replication, which allows companies to copy applications from one computer to another by transmitting data across a local-area or wide-area network.
For years, a large number of companies used tapes as their primary backup, typically having them transported off-site nightly to a separate facility. Businesses then turned to disks and, most recently, to storage-area networks, which, by connecting storage devices, decrease a company's reliance on its local data center.
But today, businesses in hurricane alleys, earthquake corridors and major population centers—and those far from them—are deploying disaster recovery plans complete with software tools that allow them to replicate, or back up, their mission-critical applications to sites away from their primary data centers. In the case of a disaster, companies can transmit vital accounting, project management or transactional systems and records to their disaster recovery facilities, limiting downtime and data loss despite an outage at the primary location.
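The replication idea the article describes, keeping a second, up-to-date copy of critical files at another site and transmitting only what has changed, can be sketched in a few lines of Python. This is a toy illustration only, not how commercial tools such as SnapMirror or WANsync work internally; the directory layout and the checksum-compare strategy are assumptions made for the example.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate(primary: Path, replica: Path) -> list[str]:
    """Copy files from the primary site to the replica site,
    skipping files whose contents already match (checksum compare).
    Returns the names of the files that were copied this pass."""
    replica.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in sorted(primary.iterdir()):
        if not src.is_file():
            continue
        dst = replica / src.name
        if dst.exists() and file_digest(dst) == file_digest(src):
            continue  # replica already holds an identical copy
        shutil.copy2(src, dst)
        copied.append(src.name)
    return copied

# Demo: two temp directories stand in for the primary and DR sites.
site = Path(tempfile.mkdtemp())
primary, replica = site / "primary", site / "replica"
primary.mkdir()
(primary / "claims.db").write_text("policy 123")

first = replicate(primary, replica)   # first pass copies the file
second = replicate(primary, replica)  # second pass finds nothing new
```

A real product adds continuous change capture, compression over the WAN link, and consistency guarantees for open databases; the sketch only shows the core copy-if-changed loop.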
Technology industry giants such as Hewlett-Packard, IBM and Oracle offer replication programs, but a handful of other players—Double-Take Software (formerly NSI Software), EMC, Network Appliance, Veritas (acquired in 2005 by Symantec) and XOsoft (acquired last month by CA)—grew up through aggressive market consolidation and unveiled a steady stream of new products, giving businesses a wide range of tools to tackle even the worst catastrophe.
For some companies, it takes a disaster to make them realize they need a data replication system.
Severe building damage kept New York-based reinsurance broker Folksamerica out of its headquarters at 1 Liberty Plaza, across the street from the World Trade Center, for more than three months following 9/11. At the time, Folksamerica had a client/server environment at Liberty Plaza backed up on tapes. After the attack, it took 10 days to get the tapes loaded and the company up and running.
When Phil Marzullo joined the firm as chief information officer in 2004, Folksamerica decided, as a safety precaution, to move its data center out of the Northeast, where it had remained after 9/11, to Littleton, Colo., and rebuild its storage network. "I think it became apparent that the [disaster recovery] plan was really lacking," he says, adding that he wanted more remote access built into the system.
Folksamerica built a storage-area network with Citrix and Microsoft SQL and Exchange servers, and FAS940c and 960c storage appliances from Network Appliance (NetApp), with matching hardware in New York and Colorado. To transmit vital applications such as accounting, claims processing and e-mail systems from New York to Colorado, Marzullo used NetApp's SnapMirror, a software program that replicates data across the storage system.
Marzullo cited NetApp's willingness to customize its products and the easy-to-manage operating system behind its storage-area network and network-attached storage, a setup that connects multiple computers to storage devices over a network. If another disaster strikes, Marzullo says Folksamerica can recover its entire infrastructure in four hours instead of 10 days. "The whole purpose of this was to put the business in a position where it could continue to operate," Marzullo points out. "It was kind of a no-brainer."
James Zeller rejoined Chaffe McCall, New Orleans' oldest law firm, as senior network manager just days before Hurricane Katrina struck in late August 2005. His first job was helping to assemble backup tapes and disks that held the firm's e-mail records and business and law documents, so that a manager could take them to the firm's Baton Rouge, La., office.
But the drives in Baton Rouge weren't compatible with the tapes at company headquarters. It took Zeller and his team a week to get the firm up and running after Katrina.
Everybody knew the city was below sea level, but no one thought something as horrific as Katrina could actually take place, Zeller says. So, two months after the storm, when the firm reoccupied its headquarters, Zeller and his team investigated options to better protect, back up and access critical applications like document management and e-mail.
The law firm purchased XOsoft's WANsync to copy applications and records to the organization's Baton Rouge office, which Zeller and his team chose as the backup facility. Zeller looked at a few vaulting software tools, which back up and encrypt data at an off-site location, but favored XOsoft because he and his team were able to install the data replication program without any setbacks and learn the commands quickly.
While he hasn't analyzed the firm's return on investment, Zeller says the capability to replicate vital applications off-site helps the firm feel confident it will weather the next storm. "What it removes is the uncertainty of 'what now?'" he says. "The reality of it is, until you're in the shadow of a disaster, things can get cut from budgets and hardware gets older, and your plan starts to fall apart."
Technology managers are usually the first to recognize the importance of—and push for—replication capabilities, says Tom Pettibone, managing partner of Transition Partners, a management consultancy.
But taking on disaster recovery projects isn't just a job for technology managers, he says. Top officers, like the chief executive and chief financial officer, need to take responsibility, as do board directors.
According to Pettibone, a former chief information officer of Philip Morris and New York Life, "The question for CIOs becomes: Are you willing to stand up and defend [the cost]?"