Exxon

By Mel Duvall  |  Posted 2006-05-06

Rocketing oil prices are driving the nation's big oil companies to record profits, and ExxonMobil stands out in the industry by investing in proprietary systems and ensuring that technology is tightly aligned with its top goals of finding more oil and gas.

Exxon's Balanced Methodology
Exxon is proud of its innovation record, Comstock maintains, but he acknowledges that new technology developments are closely monitored by management teams and almost always pursued within the context of a business goal. A new system that monitors the energy consumed by a drilling rig as it digs a new well, for example, is undertaken with the underlying goal of reducing the time it takes to drill that well.

In fact, Exxon says a system called the Fast Drill Process, unveiled in 2005, which uses sensors placed behind the drill bit to measure factors such as weight on the bit, rotary speed and torque, has allowed the company to trim the time it takes to drill a well by 35%.
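The article doesn't say how those downhole measurements are combined, but a widely published drilling-efficiency metric built from exactly these inputs is mechanical specific energy, the energy expended per unit volume of rock removed. The sketch below is a generic illustration of that calculation, not Exxon's proprietary code; the function name and the sensor values are invented for the example.

```python
import math

def mechanical_specific_energy(wob_lbf, rpm, torque_ftlb, rop_ft_hr, bit_diameter_in):
    """Mechanical specific energy in psi: energy spent per unit volume of rock
    removed. Lower MSE for a given rock strength means more efficient drilling."""
    bit_area = math.pi * (bit_diameter_in / 2.0) ** 2            # bit face area, in^2
    thrust_term = wob_lbf / bit_area                             # contribution of weight on bit
    rotary_term = (120.0 * math.pi * rpm * torque_ftlb) / (bit_area * rop_ft_hr)
    return thrust_term + rotary_term

# Hypothetical snapshot of sensor readings taken behind the bit
print(mechanical_specific_energy(wob_lbf=30_000, rpm=120,
                                 torque_ftlb=5_000, rop_ft_hr=100,
                                 bit_diameter_in=8.5))
```

In published accounts of this kind of surveillance, a value climbing well above the rock's compressive strength is the cue that the bit is wasting energy and drilling parameters should be adjusted.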

One of the methods Exxon uses to gauge alignment is the Balanced Scorecard. Developed in the 1990s by Robert Kaplan and David Norton, this measurement system requires every action to answer to an established corporate vision or set of goals. It's based on the concept that you can't improve what you can't measure.

Metrics are therefore developed around business priorities, such as improving drilling rates by 35% or reducing refinery operating costs by 10%. Feedback loops are established to determine whether those goals are being met, and to adjust strategy to make up for shortfalls.
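As an illustration of how such scorecard metrics and their feedback checks might be represented, here is a minimal sketch; the metric names, baselines and targets are invented, and this is not a description of Exxon's actual scorecard tooling.

```python
from dataclasses import dataclass

@dataclass
class ScorecardMetric:
    name: str
    baseline: float    # starting value when the goal was set
    target: float      # desired value tied to a corporate goal
    actual: float      # latest measured value

    def target_met(self) -> bool:
        # Targets set below the baseline are treated as "lower is better".
        if self.target <= self.baseline:
            return self.actual <= self.target
        return self.actual >= self.target

# Hypothetical metrics echoing the priorities mentioned in the article
metrics = [
    ScorecardMetric("days to drill a well", baseline=40, target=26, actual=29),
    ScorecardMetric("refinery operating cost index", baseline=100, target=90, actual=88),
]

for m in metrics:
    status = "on track" if m.target_met() else "shortfall - revisit strategy"
    print(f"{m.name}: {m.actual} vs target {m.target} -> {status}")
```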

Having defined financial goals allows Exxon, and other companies that have adopted Balanced Scorecard, to better set technology priorities, notes Robert Gold, vice president of technology for the Balanced Scorecard Collaborative, an organization that promotes the use of the system.

Another key method of ensuring alignment at Exxon is by placing people from the departments that will be using the finished applications on the development teams. "We physically take people out of their departments and send them to the research laboratory to help write the code," Comstock says. "That ensures it stays on target and it's usable at the desktop."

Tom Teague, who spent 18 years at Exxon leading various teams in the development and support of some 20 to 30 engineering programs used in the upstream organization, says software development was very much a collaborative process. It could be initiated by the business unit, which brings forward a specific need, or by the research department, which may believe it has conceived a better way of performing a specific operation.

Teague left Exxon in 1999 to form Protesoft, a Houston company specializing in the integration of software in heavily automated industries.

During his 18 years with Exxon's research arm, Teague says he maintained a network of contacts in the company's various upstream business units, so he could regularly have conversations on the challenges they might be facing.

In one such conversation, Teague learned that the company's reservoir engineers were having difficulty determining the most cost-effective way to increase production from older wells. Those sites might not necessarily justify the use of a supercomputer-powered reservoir simulator, which uses computer algorithms and accumulated knowledge about how oil deposits behave to determine how best to produce a deposit.

As natural pressure decreases in older oil or gas wells, engineers will usually deploy some method of artificial lift, such as placing a pump at the bottom of the well or injecting gas.

Teague formed a software development team to first determine the business need for the reservoir simulator application, then to determine the requirements. The most important factor, he says, is that an engineer from the affected department, a "champion," worked directly with the team in the development of the software code.

"Ideas came from all over, but the common thread is that they had a business focus," Teague says. "We didn't work on things just because they were interesting to work on. We worked on things because they could provide a business benefit to Exxon."

That focus on business requirements extends beyond the design and development phase, according to Comstock. Once an application goes into production, it is closely monitored to determine who is using it, when and for how long. Most important, it is measured in dollars and cents.

"We can say, a-ha, it's resulting in this much more production, or they're getting this many more barrels out of the ground," Comstock says. "We take those results and feed them back into the system."

Comstock is responsible for some 30 core applications that allow Exxon to discover new sources of oil and gas, map out the best method to tap those pools, and figure out the optimum rate for pumping the resource to the surface to maximize production and profits.

Included in that group are three key information systems, specialized business intelligence applications developed through proprietary research efforts at the company's upstream research center in Houston, that Exxon believes give it an edge over the competition.

Chief among the various tools the company uses to locate new sources of oil and gas is a technology known as 4D seismic imaging.

In the early days of oil exploration, most companies drilled their wells in places where oil or gas was thought to be located by looking for surface signs.

The wells were shallow, and luck played a big part in success. Explorers usually drilled 10 "dry" holes for every successful find. The first big change occurred in the 1930s when oil companies began using sound waves to better understand the geology below the surface.

In 1967, Exxon took the technology a step further, adding depth to the picture by shooting the world's first 3D seismic survey over an oilfield near Houston. Advancements by IBM, which introduced its 360 line of mainframes in 1964, provided Exxon with the computing power to generate three-dimensional images of the subsurface. Continued advancements in computing power and the algorithms used to interpret the seismic data have greatly improved the detail and accuracy of the images produced by the 3D applications.

The state of the art in seismic imaging can be seen at what Exxon refers to as its Center for Interactive Interpretation (Cii) in Houston. There, geologists and engineers can literally walk through 3D projections of seismic data, powered by supercomputers and advancements in parallel processing. Using a 3D pointer, a device that lets you manipulate objects in a virtual environment, a new well can be jointly planned, and the effects and changes to the reservoir seen immediately.

The latest improvement to the powerful system is the addition of time-lapse, or 4D, seismic. By combining seismic data taken in different months or years, geologists and engineers can essentially monitor changes to a reservoir.
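In its simplest form, the time-lapse idea amounts to subtracting one co-registered seismic volume from another and looking for where the amplitudes have changed; real 4D processing involves far more careful matching and normalization of the two surveys. The toy sketch below, using synthetic data, only illustrates that differencing step.

```python
import numpy as np

# Two hypothetical 3D seismic amplitude volumes (inline x crossline x depth)
# shot over the same reservoir a few years apart, already co-registered.
rng = np.random.default_rng(0)
survey_2001 = rng.normal(size=(50, 50, 200))
survey_2005 = survey_2001.copy()
survey_2005[20:30, 20:30, 100:140] += 0.8   # pretend fluid movement altered amplitudes here

# The 4D (time-lapse) signal is the difference between vintages;
# large absolute differences point to parts of the reservoir that changed.
difference = survey_2005 - survey_2001
changed = np.abs(difference) > 0.5
print("cells flagged as changed:", int(changed.sum()))
```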

Combined, these advanced technologies have played a significant role in improving Exxon's rate of drilling success, Comstock says, a clear measurement of technology alignment. In fact, Exxon says it now finds oil in just over half of its exploration wells, whereas the industry average during the past 10 years has been closer to 35%, according to the U.S. Energy Information Administration.

When you consider that it costs as much as $60 million to drill one deepwater well, that greater rate of success is a huge advantage. It also means Exxon can attempt to drill for oil in places where the underlying geology is much more complex and difficult to assess.
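A back-of-the-envelope calculation using the figures quoted in the article (as much as $60 million per deepwater well, roughly 50% versus 35% success) shows the scale of that advantage; the sketch below simply divides well cost by success rate to get an expected exploration cost per discovery.

```python
# Figures quoted in the article: up to $60 million per deepwater well,
# ~50% success for Exxon versus ~35% for the industry.
COST_PER_WELL = 60e6

def expected_cost_per_discovery(success_rate):
    # Expected wells drilled per discovery is 1 / success_rate.
    return COST_PER_WELL / success_rate

for label, rate in [("Exxon (~50%)", 0.50), ("industry (~35%)", 0.35)]:
    print(f"{label}: ${expected_cost_per_discovery(rate) / 1e6:.0f}M per discovery")
# Roughly $120M versus $171M: about $50M less exploration spending per discovery.
```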

"We end up doing things that are absolutely spectacular," Comstock adds, noting such places as Russia's Sakhalin Island, where the company has drilled horizontally from a land-based platform to pockets of oil located six miles offshore.

Contributing Editor
Mel Duvall is a veteran business and technology journalist, having written for a variety of daily newspapers and magazines for 17 years. Most recently he was the Business Commerce Editor for Interactive Week, and previously served as a senior business writer for The Financial Post.