Seacoast Bank Cashes in on Data Virtualization
By Mike Vizard | Posted 2017-01-23
The bank moved away from isolated physical data warehouses and built a single warehouse that enables users to self-service their business intelligence needs.
The Seacoast Banking Corp. of Florida relies on five commercial banking centers to support 52 branch locations, all of which need access to analytics applications. Like a lot of organizations, the bank wrestles with data integration spanning multiple applications.
To make that process more efficient, according to Mark Blanchette, Seacoast's vice president and director of business technology and data management, the bank decided to move away from isolated physical data warehouses. In their place, it built a single logical warehouse on the Denodo Platform that enables end users to serve their own business intelligence needs.
"It's about the unification of our assets into one logical platform," he explains. "Being able to run analytics on one logical layer provides more flexibility."
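The core idea of such a logical layer can be sketched in a few lines of SQL: define a view that unifies several physical sources so analysts query one place while the data stays where it lives. The sketch below uses Python's sqlite3 purely for illustration; the database names and schemas are hypothetical and this is not Seacoast's or Denodo's actual implementation.

```python
import sqlite3

# Two physical sources, standing in for separate line-of-business databases.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS deposits")
conn.execute("ATTACH DATABASE ':memory:' AS loans")

conn.execute("CREATE TABLE deposits.accounts (branch TEXT, balance REAL)")
conn.execute("CREATE TABLE loans.accounts (branch TEXT, balance REAL)")
conn.executemany("INSERT INTO deposits.accounts VALUES (?, ?)",
                 [("Stuart", 1200.0), ("Orlando", 800.0)])
conn.executemany("INSERT INTO loans.accounts VALUES (?, ?)",
                 [("Stuart", -500.0)])

# The "logical warehouse": a temp view that unifies both sources
# without physically copying or moving any rows.
conn.execute("""
    CREATE TEMP VIEW all_accounts AS
    SELECT 'deposits' AS source, branch, balance FROM deposits.accounts
    UNION ALL
    SELECT 'loans' AS source, branch, balance FROM loans.accounts
""")

# Analysts query the one view; the rows still live in the underlying sources.
rows = conn.execute(
    "SELECT branch, SUM(balance) FROM all_accounts "
    "GROUP BY branch ORDER BY branch"
).fetchall()
print(rows)  # [('Orlando', 800.0), ('Stuart', 700.0)]
```

A commercial data virtualization platform adds query optimization, caching and security on top of this pattern, but the principle is the same: one logical layer, many physical sources.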
Blanchette admits that it took some effort to educate his colleagues on the value of data virtualization, as they were in the habit of using extract, transform and load (ETL) tools to move data into a data warehouse before being able to analyze it.
Thanks to data virtualization, the bank has cut the time required to create a new reporting application from the eight months to a year it took with ETL tools down to four or five months. Just as significantly, the time needed to respond to a new type of reporting request using analytics and reporting tools from SAS Institute and Tableau Software has dropped from two days to less than two hours.
At the moment, Seacoast Bank is supporting 32 virtual data marts, which draw on both internal and external data sources to generate, in near real time, the reports needed for credit administration, risk mitigation, internal operations and even compliance with the Bank Secrecy Act.
Interacting With Data in Real Time
Data virtualization is being more widely adopted these days as IT organizations respond to demands for more agility from business executives. Instead of waiting for IT departments to get them reports that could take a week or longer to create and generate, business leaders now want to interact with data in real time to make better business decisions.
That requires reacting to rapidly changing business conditions and being able to model what-if scenarios based on actual business data. Armed with that data, business leaders can better assess the risks associated with any decision they might make.
Rick Van Der Lans, managing director of R20/Consultancy, says that while data virtualization as a technology has been on a long journey to mainstream adoption, business leaders are starting to realize that more data than ever is potentially available to them. Rather than making business decisions based on historical data, it's now possible to make decisions based on the latest data available from multiple sources.
"The business wants to be able to gain access to data much more quickly," says Van Der Lans. "Most existing data warehouse applications take too long to meet today's business needs."
Of course, not having to physically move data into a data warehouse provides all kinds of benefits to the IT organization, starting with not having to manage and reconcile multiple copies of the same data. That not only eliminates master data management issues, it also removes most of the headaches IT organizations face when trying to prove the chain of custody around any particular set of data.
Add in the benefits of not having to secure additional copies of that data or deploy extra IT infrastructure to manage it, and it becomes clear that data virtualization offers a substantial potential return on investment.
The primary reason most IT organizations have overlooked data virtualization, Van Der Lans says, is that there are few data virtualization platforms that can be independently deployed across multiple application environments. Without that capability, it becomes difficult to justify a data virtualization initiative that can only be applied to a limited number of applications, he adds.
For technology leaders, the ability to more easily apply data virtualization across multiple applications will accelerate a fundamental shift in the role IT organizations play in the management of analytics applications. The goal now is to find a way for IT organizations to provide access to those analytics applications with the least amount of intervention on their part.