The Winds of Change
By Samuel Greengard | Posted 2012-01-17
Organizations are awash in data. However, tapping into this valuable business resource to achieve maximum advantage requires a clearly defined strategy, along with the right technology solutions.
A growing array of companies and government organizations are turning to big data to redefine their business models. Tata Consultancy’s Viswanathan says that advertisers are sifting through mountains of data to better understand buying behavior and what actually drives results. Retailers are combining and correlating customer behavior, psychographics and customer lifetime events to create more accurate profiles.
Financial services firms are connecting diverse data points to create new services and to sell existing services more effectively. And health care providers are using big data to improve outcomes and cost structures.
Among the companies sold on the concept is Vestas Wind Systems, a Randers, Denmark-based operator of wind farms that generate electricity for utilities. The company, with more than 44,000 turbines in 67 countries, uses huge data sets to determine where to locate turbines for optimum performance.
The company analyzes 178 parameters, including cloud cover, humidity, solar radiation, satellite imagery, deforestation maps and barometric pressure, notes Lars Christian Christensen, vice president of plant siting and forecasting. What’s more, researchers must examine data parameters hour by hour over a 12-year span. “It’s a huge multidimensional cube of information,” he says.
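The scale of that "multidimensional cube" is easy to underestimate. A back-of-the-envelope calculation, sketched below, shows why hourly readings for 178 parameters over 12 years add up quickly; the figures are illustrative, not Vestas's actual schema.

```python
# Rough scale of the multidimensional cube Christensen describes:
# hourly readings for 178 parameters over a 12-year span, per candidate site.
HOURS_PER_YEAR = 365 * 24   # ignoring leap days for a rough estimate
YEARS = 12
PARAMETERS = 178

cells_per_site = HOURS_PER_YEAR * YEARS * PARAMETERS
print(f"{cells_per_site:,} data points per candidate site")
# → 18,711,360 data points per candidate site
```

And that is before multiplying across the thousands of candidate locations, satellite images and deforestation maps that feed the same analysis.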
Vestas turned to a big data analytics system from IBM to provide insights for a database that is expected to reach 20-plus petabytes within four years. In the past, analysts were forced to sift through mountains of data—a process that could take weeks and devour significant resources.
Today, Vestas is running the IBM BigInsights software on 1,222 connected, workload-optimized System x iDataPlex servers that make up its Firestorm supercomputer. It is capable of 150 trillion calculations per second and can analyze data sets within an hour in order to determine the best locations for turbines.
“We are able to provide answers for our customers quickly to help them build a business case and revenue-generation plan more effectively,” Christensen says. “The system has reduced the complexity of the planning process immensely. We have transformed the way we handle data and the entire analysis process.”
Accenture’s Curtis points out that big data can present some unique challenges. For one thing, it’s necessary to determine how diverse data sets can be combined to produce new insights. This requires analysts and business experts who can think in innovative ways.
For another, an organization must harvest assorted forms of unstructured data, including video clips, audio files and social media feeds. “There must be a way to identify these files and understand what types of data they provide and how it’s possible to use them effectively,” Curtis notes. Although techniques exist—including the use of metadata—it’s an area that is still emerging and evolving.
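The metadata approach Curtis alludes to can be sketched as a simple catalog that tags each unstructured asset with a media type and searchable labels. The file names, tags and fields below are hypothetical examples, not any specific product's schema.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One unstructured file, described by metadata rather than its contents."""
    path: str
    media_type: str              # e.g. "video", "audio", "social"
    tags: set = field(default_factory=set)

# Hypothetical catalog entries for three kinds of unstructured data.
catalog = [
    Asset("clips/store_cam_0117.mp4", "video", {"foot-traffic", "Q1"}),
    Asset("calls/support_0117.wav", "audio", {"sentiment", "Q1"}),
    Asset("feeds/twitter_0117.json", "social", {"sentiment", "campaign-A"}),
]

def find(catalog, media_type=None, tag=None):
    """Return assets matching an optional media type and/or tag."""
    return [a for a in catalog
            if (media_type is None or a.media_type == media_type)
            and (tag is None or tag in a.tags)]

# Analysts can now locate relevant files without opening them.
print([a.path for a in find(catalog, tag="sentiment")])
# → ['calls/support_0117.wav', 'feeds/twitter_0117.json']
```

Even a minimal scheme like this lets analysts discover which files exist and what they might contain before committing processing resources to them.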
In addition, an enterprise must sort out governance issues, particularly those centering on which business lines own and manage data and who should have access to transactions. Financial firms that operate different business lines—such as retail banking, commercial banking, wealth management, brokerage and other services—are particularly prone to challenges in this area.
In some cases, the lines can become blurred because data might reside on servers operated by a business partner or service provider. “It’s critical that the data is managed effectively and that there’s one golden copy,” Curtis says. “Governance issues must be sorted out.”