Deploying Analytics to Take Advantage of Big Data

Harvesting massive amounts of data doesn’t necessarily translate into business value. It’s what organizations do with the data from an analytics and intelligence standpoint that really matters. Some companies are going beyond simply collecting big data and are finding innovative ways to manage and leverage their information resources to improve customer service and create new opportunities for business growth.

One of those companies is McKenney’s, an Atlanta-based provider of facility construction, operation and maintenance services. The firm recently integrated data analytics technology from Splunk with its business intelligence software so that it can quickly analyze data from building components such as elevators, security doors, light switches, wall-mounted thermostats and air-conditioners.

The firm’s Automation Controls division designs and implements building control systems that automate a building’s operations. Until McKenney’s began using the Splunk technology, the vast amount of operational data collected from these systems required many hours to analyze, says Fred Gordy, technology evangelist in McKenney’s Enterprise Intelligence Group.

“The biggest challenge we faced was being able to line up hundreds of thousands of disparate sensor data sequentially and tag it,” Gordy says. Overcoming this challenge has enabled the firm to better serve its clients. For example, in 2012 McKenney’s was hired to help Gulf Power and its partner, Chevron Energy Solutions, deploy a new energy management system at Eglin Air Force Base (AFB) in Florida, one of the largest military bases in the world.
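
McKenney’s hasn’t published the details of that pipeline, but the general pattern Gordy describes, merging many independent sensor feeds into a single time-ordered stream in which every reading is tagged with its source, can be sketched in a few lines of Python. The sensor names and readings below are invented for illustration.

import heapq
from datetime import datetime, timedelta

def readings(sensor_id, start, step_seconds, values):
    """Yield (timestamp, sensor_id, value) tuples for one sensor feed."""
    for i, value in enumerate(values):
        yield (start + timedelta(seconds=i * step_seconds), sensor_id, value)

start = datetime(2014, 1, 1)
feeds = [
    readings("thermostat-07", start, 60, [68.2, 68.4, 69.0]),
    readings("chiller-2", start, 90, [42.1, 41.8]),
    readings("power-meter-311", start, 30, [118.0, 120.5, 119.7, 121.0]),
]

# heapq.merge interleaves the already time-sorted feeds into one sequential
# stream, so every reading lines up in order and carries the sensor it came from.
for timestamp, sensor, value in heapq.merge(*feeds):
    print(f"{timestamp.isoformat()}  {sensor:<16} {value}")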

The Enterprise Intelligence Group used the Splunk platform to help monitor and analyze tens of thousands of sensors and data inputs from heating, ventilation and air-conditioning systems in more than 100 Eglin buildings. The base uses McKenney’s system to generate dashboards that help its maintenance staff assess building performance and energy efficiency.

“The base is almost the size of Rhode Island, and it has almost 2,000 power meters being collected by a siloed, proprietary monitoring system and control systems in over 700 buildings,” Gordy says. The data gathered on building performance and energy efficiency, along with the associated analytics, has saved Eglin $2.5 million, he adds.
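
Neither Gordy nor the Air Force has detailed how those dashboards are built, but the underlying roll-up they describe, grouping meter readings by building and totaling consumption so outliers stand out, is simple to illustrate. A minimal Python sketch, with invented building and meter IDs:

from collections import defaultdict

# Sample (building, meter_id, kWh for the reporting interval) records --
# the IDs and values here are invented for illustration.
meter_readings = [
    ("bldg-0101", "pm-001", 412.0),
    ("bldg-0101", "pm-002", 398.5),
    ("bldg-0234", "pm-117", 1240.3),
    ("bldg-0234", "pm-118", 1190.8),
]

usage_by_building = defaultdict(float)
for building, meter_id, kwh in meter_readings:
    usage_by_building[building] += kwh

# A dashboard widget could rank buildings by consumption to flag outliers.
for building, total_kwh in sorted(usage_by_building.items(), key=lambda kv: -kv[1]):
    print(f"{building}: {total_kwh:,.1f} kWh")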

Using this technology also helps drive business to McKenney’s other departments. For example, if boilers, chillers, variable air-volume systems or other building maintenance components are not performing optimally, “We can engage our building service department to fix these hardware issues,” Gordy says. “The customer avoids expensive service calls and increases energy efficiencies.”

The data analytics capability has also opened doors to nontraditional business for McKenney’s. Before it adopted analytics technology, the company’s primary business was installing and controlling building management systems.

“We are now working with parking system companies analyzing traffic and usage patterns and analyzing train data for efficiency,” Gordy says.

Leveraging Hadoop Technology

Another company that’s deployed big data analytics is Edo Interactive, a Nashville, Tenn., provider of a digital advertising platform that connects brands with consumers and harnesses billions of data points to better understand customer preferences and behaviors. From there, the company creates personalized offers and automatically preloads them onto consumer credit or debit cards within its network of more than 200 financial institutions.

“In order to do this successfully, we leverage Apache Hadoop technology,” says Jeff Sippel, CTO at Edo Interactive. Combined with a data visualization system and analytical databases designed in-house using Pentaho tools, “Hadoop has enabled us to better understand customer preferences and behaviors by reducing the time we spend on data preparation and analytics processes,” he says.

Edo uses the Pentaho tools to streamline that data preparation and analytics work, extracting, integrating and analyzing nearly 20 terabytes of data.
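
Edo hasn’t disclosed its Hadoop jobs, but a common pattern for this kind of data preparation is a Hadoop Streaming mapper/reducer pair. The sketch below, with a hypothetical script name, field layout and category values, keys each card transaction by cardholder and merchant category in the mapper and totals spend per key in the reducer, the sort of summary that could feed preference modeling.

import sys
from itertools import groupby

def mapper(lines):
    """Emit 'cardholder|category<TAB>amount' for each raw transaction line."""
    for line in lines:
        cardholder, category, amount = line.rstrip("\n").split(",")
        print(f"{cardholder}|{category}\t{amount}")

def reducer(lines):
    """Total amounts per key; Hadoop delivers mapper output sorted by key."""
    pairs = (line.rstrip("\n").split("\t") for line in lines)
    for key, group in groupby(pairs, key=lambda kv: kv[0]):
        total = sum(float(amount) for _, amount in group)
        print(f"{key}\t{total:.2f}")

if __name__ == "__main__":
    # Run the same file as either streaming stage, e.g.
    #   hadoop jar hadoop-streaming.jar -mapper "spend.py map" -reducer "spend.py reduce" ...
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if stage == "map" else reducer)(sys.stdin)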