Modeling is not a new concept. In fact, everyone does it without thinking. Consider the invention of the spreadsheet. Before the personal computer revolution, Wall Street analysts worked through complex spreadsheet calculations by hand, armed with nothing more than a simple calculator. The process was completely inflexible, prone to mistakes, and thoroughly mind-numbing. To change a model (whether to vary inputs or correct mistakes), analysts had to rework the entire thing, a process that – needless to say – was inefficient.
In 1978, Harvard Business School student Dan Bricklin recognized an opportunity to automate this tedious process using software and the rapidly maturing PC. The VisiCalc spreadsheet he brought to market in 1979 transformed, almost overnight, how financial analysts worked. The obvious advantage of Bricklin’s innovation was efficiency: complex models that once took hours to update could suddenly be modified with a few keystrokes.
Not surprisingly, spreadsheets like VisiCalc became the de facto standard for financial modeling, and frustrated business school students and financial analysts quickly put the new technology to use. The demand for spreadsheets was so overwhelming, in fact, that they are frequently credited with creating the initial boom market for business PCs.
But the real revolution that the spreadsheet kicked off wasn’t just about efficiency and automation. By unburdening analysts from the heavy lifting of manual calculations, spreadsheets lowered the marginal costs of evaluating new scenarios from thousands of dollars to near zero. This in turn encouraged experimentation and creativity, and the same employee who once spent days perfecting a single model could suddenly produce several alternatives in a single afternoon.
Spreadsheets kicked off an industry-wide movement toward experimentation that revolutionized how analysts – and the financial services industry – worked. By allowing workers to easily create multiple scenarios and analyze their impact, spreadsheets and predictive modeling encouraged a culture of rapid prototyping and innovation – a practice known as impact analysis – that is as applicable today to converging business and technology as it is to the financial world.
Effective scenarios and models must be accompanied by impact analyses, which enable decision-makers to alter factors, generate multiple output scenarios, evaluate the end-to-end impact of each, and ultimately select and implement the optimal solution. This stands in direct opposition to conventional, linear problem-solving techniques, in which decision-makers optimize sub-problems at each logical step along the way and then assume that the sum of those local choices produces the best overall outcome.
As with modeling in general, impact analysis can be applied to a broad range of activities. For example, it is often used in supply chain planning for advanced, data-driven calculations that optimize a particular objective (such as inventory cost) given unique inputs and constraints (such as market demand, logistical restrictions, and manufacturing capacity).
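To make the supply chain case concrete, here is a minimal sketch of that input–rules–output loop in Python. It is not drawn from any real planning system: the cost figures, demand forecast, and capacity limit are all invented for illustration, and the "optimization" is a simple brute-force sweep over feasible order quantities.

```python
# Minimal sketch of supply chain impact analysis: sweep over order quantities
# to find the one that minimizes inventory cost. All numbers are hypothetical.

HOLDING_COST_PER_UNIT = 2.0    # cost to hold one unsold unit (assumed)
STOCKOUT_COST_PER_UNIT = 15.0  # cost of one unit of unmet demand (assumed)
EXPECTED_DEMAND = 480          # forecast demand for the period (assumed)
PLANT_CAPACITY = 600           # manufacturing limit per period (assumed)

def scenario_cost(order_qty: int) -> float:
    """Rule set: translate an input (order quantity) into an output (total cost)."""
    overstock = max(order_qty - EXPECTED_DEMAND, 0)
    shortfall = max(EXPECTED_DEMAND - order_qty, 0)
    return overstock * HOLDING_COST_PER_UNIT + shortfall * STOCKOUT_COST_PER_UNIT

# Impact analysis: evaluate every feasible scenario, then select the best one.
scenarios = {qty: scenario_cost(qty) for qty in range(0, PLANT_CAPACITY + 1, 20)}
best_qty = min(scenarios, key=scenarios.get)
print(f"Best order quantity: {best_qty} units at cost {scenarios[best_qty]:.2f}")
```

Because every scenario is scored by the same rule set, swapping in a different demand forecast or cost structure and re-running the sweep is exactly the kind of cheap what-if experiment that impact analysis depends on.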
Impact analysis can address much simpler problems as well. On Dell Computer’s build-to-order website, potential buyers test multiple PC configurations until they find a good match between the features they want and the price they are willing to pay.
In both cases, individuals vary inputs, rules translate those inputs into outputs, and decision-makers compare the impact of multiple scenarios to choose the solution that fits their needs.
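The configuration case follows the same pattern. The sketch below is a hypothetical illustration – the option names, prices, and budget are invented, not Dell’s actual catalog or pricing rules – of how a buyer (or the configurator itself) might enumerate every scenario and compare features against price.

```python
from itertools import product

# Hypothetical option catalog: each input dimension and its priced choices.
OPTIONS = {
    "cpu":     {"base": 0, "fast": 150},
    "memory":  {"8GB": 0, "16GB": 90},
    "storage": {"HDD": 0, "SSD": 120},
}
BASE_PRICE = 700  # assumed starting price for the base system

def price(config: dict) -> int:
    """Rule: translate a configuration (inputs) into a price (output)."""
    return BASE_PRICE + sum(OPTIONS[part][choice] for part, choice in config.items())

# Enumerate every scenario and keep those that fit a hypothetical budget.
BUDGET = 900
for choices in product(*OPTIONS.values()):
    config = dict(zip(OPTIONS, choices))
    if price(config) <= BUDGET:
        print(config, "->", price(config))
```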
In order for impact analysis to work, the scenario being modeled should conform to three guidelines:
Easily identified inputs, rules, and outputs – Impact analysis requires defined sets of inputs that are linked to outputs by predefined rules. These inputs and outputs are often quantitative (as in our supply chain optimization problem), but they can also be qualitative (as in our PC configuration options). To produce good results, these criteria – and the rules that link them – must accurately reflect the real-world problem.
Multiple configuration options and decision factors – Problems with only a few inputs and outputs aren’t suited to impact analysis because the effect of altering the inputs is usually obvious. When the outputs are less intuitive, impact analysis helps decision-makers identify good solutions.
Low design cost, high implementation cost – Scenarios that are inexpensive to design but expensive to implement are ideally suited to impact analysis. It’s unrealistic to contract a builder to construct five houses so that you can choose the one you like most. It’s entirely feasible, however, to commission an architect to draft five blueprints, compare the plans, choose a favorite, and hand it to the contractor. This is where the synergy between modeling and impact analysis really comes into play: predictive modeling is a powerful tool for lowering design costs, and thus a crucial enabler of impact analysis.