Companies Grapple With Big Data Challenges

By Samuel Greengard  |  Posted 2013-10-29

As organizations attempt to navigate the information age, executives are discovering that the biggest obstacle isn't collecting data or finding ways to manage and store it efficiently. It's constructing a framework that allows business and IT leaders to connect all the dots and put all the data to maximum use.

"There is a growing need to analyze many different types of data and to use it to make decisions more quickly and for entirely new types of processes and events," says Vincent Dell'Anno, leader of the big data practice at Accenture.

It's no small task. Across a wide swath of industries and businesses, there's a pressing need to make sense of an increasingly complex and chaotic business environment. Yet it's crucial to think beyond vague definitions of what big data means and which tools it requires.

Executives must build a strategy and framework that plugs into the "velocity of analysis and the velocity of actions" required in today's world, says Josh Greenbaum, principal at Enterprise Applications Consulting and author of IEEE's "Computing Now" blog.

This new world of business and IT requires radically different thinking, new processes and job skills, and innovative technology platforms to deliver the desired results and return on investment. Big data is also affected by a number of factors, including mobility, cloud computing, social media and a regular stream of other unstructured data.

There's a strong need to build taxonomies and data governance structures for all the data an organization possesses, Greenbaum explains. "It's important to construct a conceptual framework based on the business problems an organization is attempting to solve," he adds.

Framing Data Challenges

Although the term big data has gained widespread acceptance, Greenbaum believes it's not the best way to frame today's data challenges. "Big data is not a clean-cut technology or space," he says. "The term doesn't help businesses build a strategy or assemble the right collection of tools, technologies and assets required to drive business performance."

At the heart of the problem, he says, is that vendor tools and technologies alone cannot address business needs. An enterprise must identify processes, workflows and methodologies that it can use to deliver improved results. In fact, he prefers to call the task "big analysis."

Some disciplines, among them astronomy, meteorology, oil and gas exploration, and engineering, have long relied on huge data sets to solve problems and build models. But now other organizations are attempting to sift through large and complex data sets in order to glean answers or insights that were once unimaginable.

Along the way, many organizations are looking to build a real-time enterprise that's able to react effectively to changing conditions, events and consumer behavior. They're looking to ratchet up innovation and returns through big data.

Today, sophisticated social listening systems can predict emerging trends and changing tastes in products, services and overall attitudes. In addition, businesses can analyze social conversations to generate leads and marketing strategies.

At the same time, health care providers are turning to sophisticated data analysis systems that gather data from a variety of sources in order to identify patterns. This helps them understand how and when different procedures, therapies and medications work.

To be sure, businesses in a wide range of fields are benefiting from big data initiatives. At Lonza PharmaBiotech, a Swiss global supplier of products and services to the life sciences industry, a growing need to address bottlenecks and latencies in data processing pointed the company toward big data.

The company, headquartered in Basel, Switzerland, wanted to achieve productivity gains through the deployment of self-service business analytics and to enable decision-makers at all levels of the organization to engage in analysis that would create value and deliver cost savings. The firm has approximately 10,000 employees scattered across the United States, Asia and Europe.

In 2011, the company introduced a comprehensive global business analytics program, Adeptia's Enterprise Business Integration Management Suite, to address the challenges in a more sophisticated way. "The goal was to transform Lonza from a company that was rich in data but poor in information," explains Peter Mueller, manager of the business analytics global program. The initiative focused on three primary areas: metrics and operations data, smart information that could guide processes, and insights that could guide major business decisions through predictive analytics.

For example, Lonza now uses big data to understand the effectiveness of an internal Corrective Action/Preventive Action (CAPA) process. It involves a highly structured approach that's designed to identify the root cause of process issues and problems related to nonconformity. (The method is required by certain regulatory agencies.)

In the past, internal analysts were forced to sift through large volumes of data in an attempt to identify issues and potential preventive actions. However, some measurements and analyses simply weren't possible.

That's no longer the case. The company now funnels data from dozens of sources through the analytics software. The results are visible instantly from a series of interactive scorecards and dashboards, and managers and others can take immediate action.

"We can see the effectiveness of our quality system—something that wasn't possible in the past," Mueller says. Moreover, employees using the software can run what-if scenarios and understand how different decisions and actions impact results. "We have the data analysis tools to understand quantitatively what would potentially occur if we decide to make a change," he adds.

The Adeptia system extracts and collects millions of records, both structured and unstructured, from databases, enterprise applications and other sources across the company's facilities, Mueller reports. Employees can view results based on a number of criteria, including how countries or local offices are performing. The system also standardizes reporting and processes so that there's a high level of consistency across the company.

"The data is different, but the scorecards and information assets all look the same," he says.

Brewing Better Data

Green Mountain Coffee in Waterbury, Vt., is another company that's embracing big data. The firm, which has 20 brands and more than 200 beverages, has embarked on an initiative that taps both structured and unstructured data, drawing on audio and text analytics to boost intelligence about customer behavior and buying patterns. The system relies on a Calabrio Speech Analytics solution to glean insights from multiple interaction channels and data streams.

"In the past, when customers called into the contact center, there were questions we couldn't answer," recalls Nate Isham, a level three network engineer. "They included how many people were asking for a specific product, which products they [more] questions about, and which products and categories generated the most confusion." Big data and analytics produced insights along with "solid information to make decisions."

As a result, the company is able to produce materials, Web pages and database entries that help representatives do their jobs more effectively. Moreover, "We are able to catch issues quicker and prevent them from becoming bigger issues for our customers," he says.

The company is now plugging the data into everything from manufacturing and operations to marketing. That could drive changes in areas ranging from product packaging to advertising campaigns. The end goal, Isham says, is to "drive action, not only insight. We are attempting to transform data into information and then knowledge. When we reach that point, we're able to take actions that benefit the customer."

Enterprise Applications Consulting's Greenbaum says that organizations must ultimately adopt a mindset that focuses on business results: "The real issue isn't, What's in my data? It's, What analysis do I want to do?" It's not so much about the size of data sets as it is how to connect them in more innovative and creative ways to answer key business questions. In some cases, this might translate to small data sets; in others, vast data pools.

Within this framework, IT must support the line of business. It's critical to break down silos, ensure data integrity and quality, and construct systems that allow data to flow through the enterprise.

Achieving a real-time enterprise is the end goal, Accenture's Dell'Anno points out. "The idea of data moving through the enterprise in a linear way and landing in a data warehouse is changing," he says. "There's a need to identify choke points and assemble a cross-section of data elements from across the enterprise."

In many instances, this requires a fundamental rethinking of data sources and types. "The issue isn't whether data is structured or unstructured," Dell'Anno says. "It's figuring out all the different data sources to tap internally—and from the Internet and beyond—to help you shape context about your clients and customers."

As organizations grapple with continued data growth, the proliferation of clouds and increasingly tight IT budgets, the task of harnessing big data won't get any easier. Determining which technologies to use, including popular open-source tools, can also prove challenging. In the end, he believes that there's a need to re-examine everything from infrastructure and data collection strategies to the types of skill sets the organization seeks from outside and develops through training.

"The goal is to find the right combination of data, thinking and technology to fundamentally change customer relationships and the business," Dell'Anno concludes.