Minneapolis Uses Analytics to Enhance Services

By Samuel Greengard  |  Posted 2014-08-05
Minneapolis Smarter Cities initiative

The city adopts an advanced analytical solution to understand trends, boost its ability to make data-driven decisions and take a proactive approach to services.

Tight budgets and growing demand for services place enormous pressure on cities and other government agencies, which need deeper insights while having less time and money to achieve results.

One municipality that's tackling the challenge head on is the City of Minneapolis, which has turned to analytics to address a growing array of tasks, including law enforcement, code enforcement and traffic safety. "We are using analytics tools to work with information in a way that mimics how people make decisions, while delivering better and faster results," says Otto Doll, the city's CIO.

Minneapolis uses IBM's Smarter Cities analytics solution to boost its ability to make data-driven decisions. It switched on the analytics software at the beginning of 2014 and is rolling it out across city departments and other groups on an ongoing basis.

The introduction of sophisticated analytics tools provides new opportunities to discover patterns, trends and potential problems, Doll says. The digital platform also delivers better data surrounding metrics, so that city leaders can track performance and increase their ability to meet objectives. "The goal is to turn data into decisions," he explains.

One way the city uses the analytics tool is to identify landlords who aren't adhering to regulations and laws. Minneapolis is able to pull data from multiple databases in different departments that were not previously linked, as well as from citizen complaints. This makes it easier to spot infractions that might have previously gone undetected—or taken longer to detect.
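The article doesn't describe the city's actual data model, but the idea of cross-referencing previously siloed records with complaint data can be sketched in a few lines. Everything below — the property IDs, the schemas, and the flagging rule — is hypothetical, for illustration only:

```python
from collections import Counter

# Hypothetical rental-license records from one department
# (illustrative only; not the city's actual schema or rules).
licenses = {
    101: {"landlord": "A LLC", "license_active": True},
    102: {"landlord": "B LLC", "license_active": False},
    103: {"landlord": "C LLC", "license_active": True},
}

# Hypothetical citizen complaints from another, previously unlinked system,
# recorded as one property ID per complaint.
complaints = [102, 102, 103, 103, 103]
complaint_counts = Counter(complaints)

# Flag properties with an inactive license or an unusual complaint volume.
flagged = sorted(
    pid for pid, rec in licenses.items()
    if not rec["license_active"] or complaint_counts[pid] >= 3
)
print(flagged)  # properties worth an inspector's closer look
```

Once the sources share a common key, an infraction that is invisible in any single database (an expired license here, a cluster of complaints there) surfaces in the joined view.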

Other areas of focus include spotting clusters of crime, identifying dangerous intersections and pedestrian areas, managing health and restaurant inspections, and pulling together diverse data sources. These include scraping third-party websites to better inform citizens about upcoming events, including festivals, block parties and athletic events.

A key element of the software, Doll says, is anomaly detection. For example, when the city examined vehicle collisions with pedestrians by combining 2013 data with historical records and other data sources, analysts discovered that July is a particularly dangerous month.

That led to further exploration of the topic, and the analysts eventually concluded that various city events attracted a greater number of visitors from outside the central city area—many of whom were unfamiliar with its intersections. Ultimately, the information could lead to changes in signage or markings, as well as other adjustments to the streets.
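The article doesn't say how the software's anomaly detection works, but one common approach to surfacing an outlier month like this is to flag counts that sit well above the historical mean. The sketch below uses made-up collision numbers and a simple two-standard-deviation threshold, purely to illustrate the idea:

```python
from statistics import mean, stdev

# Hypothetical monthly pedestrian-collision counts (illustrative numbers,
# not the city's actual data). Index 0 = January ... 11 = December.
monthly_collisions = [14, 12, 15, 16, 18, 17, 31, 19, 16, 15, 13, 14]
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

# Flag any month more than two standard deviations above the yearly mean,
# one simple rule for surfacing outliers in a time series.
avg = mean(monthly_collisions)
sd = stdev(monthly_collisions)
anomalies = [m for m, n in zip(months, monthly_collisions) if n > avg + 2 * sd]
print(anomalies)
```

With these sample figures, only July clears the threshold — the kind of signal that would then prompt the follow-up analysis the article describes, such as correlating the spike with summer event calendars.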

In the end, Doll says, the cloud-based approach suits the city well. It eliminates the need to manage upgrades and patches, and it delivers new features and capabilities on a regular basis. The city hopes to have all departments linked to the analytics system by the end of 2014. In some cases, the analytics tool has also condensed reporting and analysis that previously took days or weeks into minutes, while serving up better results.

"By saving analysts time, we are able to use their knowledge and expertise to address a greater number of challenges more effectively," Doll explains. "We are able to view data and visualizations that, in some cases, transform decision making."

Samuel Greengard is a contributing writer for Baseline.