Uncle Sam Must Get Smart With Big Data

By Samuel Greengard  |  Posted 2013-07-18

Big data is creating both opportunities and challenges for the federal government. The technology has the potential to save $500 billion annually.
A new study, "Smarter Uncle Sam: The Big Data Forecast," found that the effective use of big data could save the U.S. federal government approximately $500 billion annually. In fact, respondents indicated that their agencies could trim an average of 14 percent from their overall budgets by successfully leveraging this technology.

According to the study, which was conducted by Meritalk and underwritten by EMC, nearly seven in 10 respondents believe that big data will help the government work smarter by "increasing efficiency, enabling smarter decisions and deepening insight." The biggest opportunities center on the military; intelligence and reconnaissance missions; combating fraud, waste and abuse; and managing the transportation infrastructure.

Overall, 51 percent of respondents stated that big data could help improve processes and efficiencies; 44 percent said it could aid in enhancing security; and 31 percent believed it would be useful for predicting trends.

The biggest question facing government officials is how to proceed with a big data initiative. Remarkably, only 31 percent of respondents believe their agency has an adequate big data strategy in place. However, "This number should increase as officials realize that big data can help agencies innovate within the current budget scenarios," says Rich Campbell, chief technologist at EMC Federal.

Ultimately, government leaders must gain greater familiarity with big data strategies and learn how to navigate the emerging landscape. The study recommends that officials, including IT executives, research "the art of the possible," take inventory of data assets, identify mission requirements, assess capabilities and current architecture against those requirements, and explore which data assets can be made open and available to the public in order to spur greater innovation.

Campbell recommends reviewing case studies—such as those available in a recent TechAmerica report, "Demystifying Big Data"—and taking a more comprehensive view of data assets within an agency and across multiple agencies. Moreover, "As we think about the technology and infrastructure needed, we also must consider the expertise needed," he says. This involves hiring entirely new types of IT and data specialists. Moving forward, "Organizations will require data scientists with degrees in mathematics, statistics and science, as well as database skills in software such as SQL," he adds.

Finally, federal agencies can benefit by working with the private sector, focusing on one or two key priorities, and executing appropriate pilot projects. "Everyone agrees that there is an enormous opportunity," Campbell concludes. "Unfortunately, less than one-fourth of all government data currently undergoes any type of analysis.

"Agencies must understand their own data resources and how they can [leverage] them to keep up with the pace of change and deliver a mission advantage."

Samuel Greengard is a contributing writer for Baseline.