Harte Hanks Puts Data in Its Place

By Samuel Greengard  |  Posted 2014-11-24

The marketing services firm turns to Hadoop RDBMS to better scale resources and improve performance, which translates into better results for its clients.


In today's business environment, data is increasingly at the center of everything. For marketing services giant Harte Hanks, which manages and processes data for a Who's Who of leading businesses, transforming data into marketing results is the key to success and bottom-line results.

"Although we don't own data and simply host it for our clients, our ability to eliminate the noise and produce results is critical," says Rob Fuller, managing director of product innovation at Harte Hanks.

The 30-plus-year-old firm, which has about 5,000 employees and boasts annual revenues of approximately $600 million, oversees databases as large as 20 terabytes. In addition, many of the companies that contract with Harte Hanks require real-time access and capabilities in order to run their business and marketing efforts effectively.

"They need to know how to interact with a customer, [including] the offers they have sent, when the customer last interacted with the firm, what they are interested in and what they already own," Fuller explains.

In addition, clients tap data to drive major campaigns, such as a car launch or a major sale, and to conduct sophisticated predictive analytics to understand what is working and on which channel.

"In a perfect world, we would build distinct systems to handle the different aspects of digital marketing," Fuller says, "but that would get extremely expensive, and there's no way to effectively manage the different systems. It would be next to impossible to keep all the data in sync." What's more, current system performance was slowing, and the firm's clients couldn't use their marketing data in all the ways they desired.

"In the past, none of our solutions scaled elegantly," he recalls. "They created too many limitations for clients. For years, we've been experimenting with tools and technologies to get to real-time performance, but we continued to run into technical and practical limitations."

As a result, Harte Hanks turned to Splice Machine, replacing its existing database with the vendor's Hadoop RDBMS. The system enables the company to better scale computing resources, while maximizing performance and cost efficiencies with existing hardware.

The Splice Machine solution provides full compatibility with tools such as IBM Unica and Cognos, as well as SAS and Oracle, along with standard SQL support. Using the Hadoop framework, Harte Hanks can now add nodes to the cluster while remaining fully compatible with the existing infrastructure, and it is able to handle mixed-workload applications (OLAP and OLTP).
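A mixed OLAP/OLTP workload of the kind described above can be sketched with a small, self-contained example. This is illustrative only: it uses Python's built-in SQLite rather than Splice Machine's Hadoop-backed SQL engine, and the table and column names (`interactions`, `customer_id`, and so on) are hypothetical, not Harte Hanks' actual schema. The point is the pattern, with frequent single-row transactional operations and whole-table analytical scans hitting the same data.

```python
import sqlite3

# Illustrative toy "mixed workload" against one datastore. SQLite stands in
# for the cluster-backed RDBMS purely so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE interactions (
        customer_id INTEGER,
        channel     TEXT,
        offer       TEXT,
        ts          TEXT
    )
""")

# OLTP-style: small, frequent writes as customer events arrive.
events = [
    (1, "email", "spring_sale", "2014-11-01"),
    (1, "web",   "car_launch",  "2014-11-10"),
    (2, "email", "spring_sale", "2014-11-05"),
]
conn.executemany("INSERT INTO interactions VALUES (?, ?, ?, ?)", events)

# OLTP-style: a point lookup, e.g. "when did this customer last interact?"
latest = conn.execute(
    "SELECT channel, ts FROM interactions "
    "WHERE customer_id = ? ORDER BY ts DESC LIMIT 1", (1,)
).fetchone()

# OLAP-style: an aggregate scan over all rows, e.g. a campaign report
# of interactions per channel.
by_channel = dict(conn.execute(
    "SELECT channel, COUNT(*) FROM interactions GROUP BY channel"
).fetchall())

print(latest)       # ('web', '2014-11-10')
print(by_channel)   # {'email': 2, 'web': 1}
```

Running both query styles against separate, synchronized systems is the expensive alternative Fuller describes; a single engine that handles both avoids the sync problem entirely.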

The result has been a 3x-to-7x increase in query speeds, and cost savings are approaching 75 percent. There's also been a 10x-to-20x improvement in price performance. The company is now considering tackling even larger data tasks, including handling 100 terabytes of client storage.

"We have achieved qualitative and quantitative benefits that translate directly into better results for our clients," Fuller reports. "They now have a greater volume of data and better data at their fingertips."


Samuel Greengard, a contributing writer for Baseline, writes about business, technology and other topics. His forthcoming book, The Internet of Things (MIT Press), will be released in the spring of 2015.
