Two-Pronged Challenge: Storing Large Amounts of Data for the Long Term
By Elizabeth Millard | Posted 2008-04-25
Enterprises must find innovative ways to manage file archives in light of increasing compliance, productivity and cost-control needs.
For the enterprises that choose to store data for decades, finding a long-term solution can be challenging, particularly as media changes.
Regulatory mandates like the Health Insurance Portability and Accountability Act (HIPAA) and the Sarbanes-Oxley Act, along with awareness of potential litigation, have already pushed public companies to store data for longer periods and in a secure fashion. But many enterprises are pursuing a strategy that goes above and beyond what is strictly required.
Data archiving management (as opposed to backup, which creates a temporary copy of data that is usually overwritten later) has become a source of concern at companies that want data stored for decades, not just a few years. But the sheer amount of data being produced and the uncertainty of long-term media could present a two-pronged challenge for the future.
According to a recent Enterprise Strategy Group (ESG) report on file archiving, many companies are grappling with explosive growth in file-based (also called “unstructured”) content, such as Web pages, word processing documents, spreadsheets, presentations, multimedia files and other proprietary file formats with industry-specific applications. The research firm notes that enterprises have to move beyond rudimentary approaches to file archives in order to address increasing compliance, productivity, and infrastructure cost-control requirements.
Other ESG research shows that current archive practices usually rely on tape, or else mix archived data with mission-critical data on primary storage systems. Neither approach is ideal. So, what's an enterprise to do with data that needs to be locked down and kept safe until 2020 or beyond?
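The alternative ESG alludes to is moving cold files off primary storage onto a dedicated archive tier. As a rough illustration only (the article names no specific tool or policy), the sketch below shows an age-based selection rule in Python: files untouched for a hypothetical three-year cutoff are flagged and moved to a separate archive directory. Real archiving products layer retention policies, indexing and integrity checks on top of this basic idea.

```python
import shutil
import time
from pathlib import Path

# Hypothetical policy: anything untouched for 3 years is archive-eligible.
ARCHIVE_AGE_SECONDS = 3 * 365 * 24 * 3600


def select_archive_candidates(primary_dir, max_age=ARCHIVE_AGE_SECONDS, now=None):
    """Return files under primary_dir not modified within max_age seconds."""
    now = time.time() if now is None else now
    candidates = []
    for path in Path(primary_dir).rglob("*"):
        if path.is_file() and now - path.stat().st_mtime > max_age:
            candidates.append(path)
    return candidates


def archive_files(candidates, archive_dir):
    """Move cold files from the primary tier to the archive tier."""
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    for path in candidates:
        shutil.move(str(path), str(archive / path.name))
```

The point of the separation is exactly the one ESG raises: archive data gets its own storage tier (and its own media-refresh schedule) instead of competing with mission-critical workloads on primary systems.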
“It’s truly a real problem, and it’s going to become even more of one in the next five to seven years, as the drives start becoming outdated or just die from age,” says Paul Clifford, founder of technology consultancy Davenport Group.
The situation directly addresses several aspects of a company's technology strategy, such as performance requirements, system scalability, data availability and redundancy, says Bob Picardi, COO at RAID Inc., a managed storage service provider. It will also keep the debate about tape technology going strong, he adds. "There's the mindset of tape being the least expensive alternative for this kind of storage, but as disk becomes more widely used, tape could become a thing of the past."
Reliability, in particular, is the biggest issue when talking about archiving since it’s already such a problem when it comes to backups, notes Chris Cummings, senior director of data protection solutions at NetApp. If a company has trouble making a reliable temporary repository of data, then creating a solid, decades-ready data pool will be even more daunting.