Predictive Modeling Changes the Future

 
 
By Faisal Hoque  |  Posted 2010-08-10

Modeling is not a new concept. In fact, everyone does it without thinking. Consider the invention of the spreadsheet. Before the personal computer revolution, Wall Street analysts worked through complex spreadsheet calculations by hand, aided by nothing more than a simple calculator. The process was completely inflexible, prone to mistakes, and thoroughly mind-numbing. To make changes to a model (whether to vary inputs or correct mistakes), analysts had to rework the entire thing, a process that was, needless to say, inefficient.

In 1978, Harvard Business School student Dan Bricklin recognized an opportunity to automate this tedious process using software and the rapidly maturing PC. The VisiCalc spreadsheet he brought to market in 1979 transformed, almost overnight, how financial analysts worked. The obvious advantage of Bricklin’s innovation was efficiency: complex models that once took hours to update could suddenly be modified with a few keystrokes.

Not surprisingly, spreadsheets like VisiCalc became the de facto standard for financial modeling, and frustrated business school students and financial analysts quickly put the new technology to use. The demand for spreadsheets was so overwhelming, in fact, that they are frequently credited with creating the initial boom market for business PCs. 

But the real revolution that the spreadsheet kicked off wasn’t just about efficiency and automation. By unburdening analysts from the heavy lifting of manual calculations, spreadsheets lowered the marginal costs of evaluating new scenarios from thousands of dollars to near zero. This in turn encouraged experimentation and creativity, and the same employee who once spent days perfecting a single model could suddenly produce several alternatives in a single afternoon.

Spreadsheets kicked off an industry-wide movement toward experimentation that revolutionized how analysts, and the financial services industry as a whole, worked. By allowing workers to easily create multiple scenarios and analyze their impact, spreadsheets and predictive modeling encouraged a culture of rapid prototyping and innovation, known as impact analysis, that is as applicable today to converging business and technology as it is to the financial world.

Effective scenarios and models must be accompanied by impact analyses, which enable decision-makers to alter input factors, create multiple output scenarios, evaluate the end-to-end impact of each scenario, and finally select and implement the optimal solution. This stands in direct opposition to conventional, linear problem-solving techniques, in which decision-makers analyze sub-problems at each logical step along the way and then assume that the combined effect of those local choices is the best overall outcome.
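To make this loop concrete, here is a minimal sketch in Python; the scenarios, figures, and rules are invented purely for illustration. Every candidate is pushed end to end through the same rules, and the outputs are compared side by side rather than settled one step at a time.

    # Toy end-to-end impact analysis: vary inputs, apply the same
    # rules to every scenario, then compare the resulting outputs.
    # All names and figures here are hypothetical.

    def evaluate(scenario):
        """Rules that translate a scenario's inputs into outputs."""
        revenue = scenario["units"] * scenario["price"]
        cost = scenario["units"] * scenario["unit_cost"] + scenario["fixed_cost"]
        return {"revenue": revenue, "profit": revenue - cost}

    scenarios = {
        "aggressive": {"units": 12000, "price": 9.0, "unit_cost": 6.5, "fixed_cost": 20000},
        "baseline":   {"units": 10000, "price": 10.0, "unit_cost": 6.5, "fixed_cost": 20000},
        "premium":    {"units": 7000, "price": 14.0, "unit_cost": 7.5, "fixed_cost": 25000},
    }

    results = {name: evaluate(s) for name, s in scenarios.items()}
    for name, out in results.items():
        print(f"{name:10s} revenue={out['revenue']:>10,.0f} profit={out['profit']:>9,.0f}")
    print("selected:", max(results, key=lambda n: results[n]["profit"]))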

As with modeling in general, impact analysis can be used to address a broad range of activities. For example, it is often used in supply chain planning for advanced, data-driven calculations that optimize a particular function (such as inventory costs) given unique inputs and constraints (such as market demand, logistical restrictions and manufacturing capabilities).
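A stripped-down version of that kind of calculation, again in Python and with purely illustrative demand figures, costs, and capacity, might brute-force candidate order quantities through a simple inventory simulation and keep the cheapest:

    # Toy inventory optimization: minimize holding plus shortage cost
    # for a per-period order quantity, subject to a capacity constraint.
    # Forecast, costs, and capacity are illustrative assumptions.

    DEMAND_FORECAST = [900, 1100, 1000, 1200]  # units demanded per period
    CAPACITY = 1500        # manufacturing limit per period
    HOLDING_COST = 2.0     # per unit left in stock at period end
    SHORTAGE_COST = 10.0   # per unit of unmet demand

    def total_cost(order_qty):
        """The rules: simulate stock levels across the forecast horizon."""
        stock, cost = 0, 0.0
        for demand in DEMAND_FORECAST:
            stock += order_qty
            served = min(stock, demand)
            cost += SHORTAGE_COST * (demand - served)
            stock -= served
            cost += HOLDING_COST * stock
        return cost

    # Vary the input within its constraint; compare the scenario outputs.
    best = min(range(0, CAPACITY + 1, 50), key=total_cost)
    print(f"best order quantity: {best} units, cost: {total_cost(best):,.0f}")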

Impact analysis can address much simpler problems as well. On Dell’s build-to-order website, potential buyers test multiple PC configurations until they find a good match between the features they want and the price they are willing to pay.
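The same pattern works with qualitative inputs. A sketch of a configurator in that spirit, with invented components and prices rather than Dell’s actual catalog, simply enumerates every combination, prices each one by rule, and filters to what fits the budget:

    # Toy build-to-order configurator: qualitative inputs (component
    # choices) mapped to a quantitative output (price) by simple rules.
    # Component names, prices, and the budget are hypothetical.
    from itertools import product

    OPTIONS = {
        "cpu":     {"base": 0, "fast": 150},
        "memory":  {"8GB": 0, "16GB": 80},
        "storage": {"256GB SSD": 0, "1TB SSD": 120},
    }
    BASE_PRICE = 600
    BUDGET = 800

    # Vary the inputs: every combination of component choices.
    for cpu, mem, disk in product(OPTIONS["cpu"], OPTIONS["memory"], OPTIONS["storage"]):
        price = BASE_PRICE + OPTIONS["cpu"][cpu] + OPTIONS["memory"][mem] + OPTIONS["storage"][disk]
        if price <= BUDGET:  # keep only scenarios the buyer can afford
            print(f"{cpu:4s} | {mem:4s} | {disk:9s} -> ${price}")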

In both of these cases, individuals vary inputs, rules translate those inputs into outputs, and decision-makers compare the impact of multiple scenarios to choose the solution that best fits their needs.

In order for impact analysis to work, the scenario being modeled should conform to three guidelines:

Easily identified inputs, rules and outputs - Impact analysis requires defined sets of inputs that are linked to outputs using predefined rules. These inputs and outputs are often quantitative (as in our supply chain optimization problem), but they can also be qualitative (such as our PC configuration options). To produce good results, these criteria, and the rules that link them, must accurately reflect the real-world problem.

Multiple configuration options and decision factors - Problems that contain only a few inputs and outputs aren’t suited to impact analysis because the effect of altering inputs is often obvious. When the outputs are less intuitive, impact analysis helps decision-makers identify good solutions.

Low-design, high-implementation cost - Scenarios that are inexpensive to design but difficult to implement are ideally suited to impact analysis. It’s unrealistic to contract a builder to construct five houses so that you can choose the one you like most. It’s entirely possible, however, to commission an architect to draft five blueprints, compare the plans, choose a favorite, and hand it to the contractor. This is where the synergy between modeling and impact analysis really comes into play: predictive modeling is a powerful tool for lowering design costs, and thus a crucial driver for impact analysis.

Disconnects between business, process, and technology are often introduced when individual decisions have unforeseen effects on the blueprint as a whole. Enterprises tend to decompose the problem several levels deep and then commit to a course of action suggested by the initial round of analysis, without revisiting its end-to-end impact.

A few obstacles need resolution before any company can get started with modeling and impact analysis. The most obvious is the common perception that the time it takes to develop a model during the design stage is better spent on implementation. This is due, in part, to previous experiences with models that were frighteningly inaccessible to all but the most die-hard experts. Since non-specialists (a group that frequently includes managers and other authority figures) couldn’t experience their value first-hand, they assumed that the models were a waste of time.

The shorthand solution to this concern is to make the modeling environment friendly enough for a broad range of people to pick it up and experiment at their own level of comfort. The spreadsheet again points the way: a financial model’s inner workings may be exceedingly complex, yet its overall purpose can be clearly communicated to a non-technical audience.

In extreme cases, modeling can be a waste of time. This happens when people get stuck in an endless design loop: by continuously tweaking the model in the quest for a perfect solution, they never get around to actually implementing it. The way to counter this impulse is to link a system of real-time monitoring to the metrics, goals, and objectives established at the beginning of the project. This implies a link to both project and performance management that is crucial to any type of modeling.
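One way to picture that guardrail, as a sketch under assumed thresholds rather than a prescribed method: a stopping rule that ends the design loop once the model hits the goal agreed at kickoff, exhausts its iteration budget, or stops improving meaningfully.

    # Toy stopping rule tying the design loop to pre-agreed metrics.
    # The goal, tolerance, and iteration cap are illustrative assumptions.

    GOAL = 0.95             # target model-quality score set at kickoff
    MIN_IMPROVEMENT = 0.01  # smallest gain worth another design pass
    MAX_ITERATIONS = 20

    def should_stop(history):
        """Decide whether to implement the current model or keep tweaking."""
        if history[-1] >= GOAL:
            return True, "goal met"
        if len(history) >= MAX_ITERATIONS:
            return True, "iteration budget spent"
        if len(history) >= 2 and history[-1] - history[-2] < MIN_IMPROVEMENT:
            return True, "diminishing returns"
        return False, "keep refining"

    scores = [0.70, 0.82, 0.88, 0.885]   # quality after each design pass
    print(should_stop(scores))           # -> (True, 'diminishing returns')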

The other obstacle that stands in the way of modeling and impact analysis is the set of gaps that exist between multiple models, and between models and the real world. These gaps are referred to as white space, and they are familiar culprits in cases where modeling hasn’t been successful. Typically, the tools available to technology employees for modeling the business, its processes, and its technology are disjointed, and so they tend to exacerbate rather than overcome the white space problem.

Most tools are geared either to a particular task (process modeling, object modeling, or knowledge management) or to broad horizontal activities (word processing, drawing, or spreadsheets). A consequence of these disjointed offerings is that companies tend to use multiple tools and environments to develop their models. When changes are made in one environment (say, a process diagram), they aren’t automatically reflected in the others (a requirements document or a business strategy memo). Without integrated tools, decision-makers must proactively anticipate ripple effects to keep their models aligned.

Once the current business model is understood, enterprise executives can begin to create the business scenario models that form the basis for end-to-end impact analysis. Each scenario represents a to-be alternative for accomplishing the firm’s goals. The structured and visual nature of models makes it easy for the team to compare these scenarios and eventually combine the best elements of each into what is best for the organization overall.
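As a final hedged sketch, that comparison step might look like this: score each to-be scenario against weighted goals, then note which scenario wins on each criterion as raw material for the combined plan. Scenarios, criteria, scores, and weights are all invented.

    # Toy scenario comparison: weighted scoring of to-be alternatives,
    # plus a per-criterion view suggesting which elements to combine.
    # Scenario names, criteria, scores, and weights are hypothetical.

    WEIGHTS = {"cost_savings": 0.40, "time_to_market": 0.35, "risk": 0.25}

    SCENARIOS = {  # each score runs 0-10, higher is better
        "outsource": {"cost_savings": 9, "time_to_market": 5, "risk": 4},
        "build":     {"cost_savings": 4, "time_to_market": 6, "risk": 8},
        "partner":   {"cost_savings": 6, "time_to_market": 9, "risk": 6},
    }

    def overall(scores):
        return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

    for name in sorted(SCENARIOS, key=lambda n: -overall(SCENARIOS[n])):
        print(f"{name:9s} overall={overall(SCENARIOS[name]):.2f}")

    for criterion in WEIGHTS:  # strongest scenario per criterion
        best = max(SCENARIOS, key=lambda n: SCENARIOS[n][criterion])
        print(f"best on {criterion}: {best}")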

----------------------------------

Faisal Hoque is the founder and CEO of BTM Corporation. A former senior executive at GE and other multi-nationals, Faisal is an internationally known entrepreneur and thought leader. He has written five management books, established a non-profit institute, The BTM Institute, and become a leading authority on the issue of effective interaction between business and technology. © 2010 Faisal Hoque