
Taking the Measure of Real Hot Spots

By Baselinemag

Three Mile Island taught this technologist what it takes to get to the core of problems.

Twenty-five years ago last month, I was at Three Mile Island. I was 21 years old and it was three days after the partial meltdown of the reactor core.

I was a nuclear engineer at a nearby company, an expert in measuring radiation. I'd done work to determine where nuclear materials were disappearing, because the industry historically hadn't kept track as accurately as one would expect. We knew we'd lost a lot of material—a scary amount in fact. But when we said "missing," we assumed much of it was measurement error, something we needed to get under control. But no one knew where to look or how to count it up.

I helped develop some of the technologies to gauge nuclear materials inside large, irregularly shaped containers. Certain substances within the containers shield the radiation, masking how much material is actually there, but by taking readings from two different locations, you can roughly triangulate where the material is and how much of it there is. With effort, you can decrease the measurement error by an order of magnitude or two—a significant drop. You don't usually see that level of error removed from an equation.
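The idea behind the two-location measurement can be sketched with a toy model. This is not the actual instrumentation or method used; it assumes an idealized point source with simple inverse-square falloff and no shielding, and the function name and numbers are made up for illustration. Two readings taken along a line are then enough to solve for both where the source is and how strong it is:

```python
import math

def locate_source(d, i1, i2):
    """Locate a point source on the line between two detectors a
    distance d apart, given intensity readings i1 and i2.
    Assumes ideal inverse-square falloff and no shielding."""
    k = math.sqrt(i1 / i2)   # ratio of distances: r2 / r1
    r1 = d / (1 + k)         # distance from detector 1 (source between them)
    strength = i1 * r1**2    # apparent source strength
    return r1, strength

# Example: detectors 10 m apart. A source of strength 400 sitting 2 m
# from detector 1 gives readings 400/2**2 = 100 and 400/8**2 = 6.25.
pos, s = locate_source(10.0, 100.0, 6.25)
# pos == 2.0, s == 400.0
```

Real containers are messier: shielding attenuates the signal unevenly, which is exactly why it took dedicated technology to push the error down.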

I was at Three Mile Island for a week straight, working twelve-hour shifts, around the clock. Measurement was the challenge. No one knew precisely what was inside the reactor building. They didn't even know what it was they didn't know. They were just absolutely scared.

When we first got to the site, we were trying to figure out ways to measure all the different spots. We sent people in to collect water and air samples, then analyzed the results on gamma-ray spectrometers to see what was in them—to tell whether the radiation was coming out of the core.

We wanted to trace not just where the material was, but what was happening in the core—someplace we couldn't reach ourselves. Were the fuel rods actually being damaged? Was that material now melting rather than being contained?

At one point, I went into the reactor building, but not where it was highly contaminated. A suit keeps you from becoming contaminated, but radiation still goes right through you. We were too busy trying to do too many things as fast as we could; we didn't take the time to think about that.

Business runs on basically the same precepts. You're often in a crisis, dealing with information buried within that needs to be reached and measured—and acted upon.

When I first began working at Commonwealth Edison, the company had a massive store of underutilized data. The existing systems were great at processing transactions, but really bad at analytics. There was a core of data that needed to be assessed. I put together a dozen people to very quickly pull data out of the mainframe and put it into a more maneuverable series of 60 SQL Server data stores on Intel-based platforms. The team delivered valuable visibility into corporate data, such as real-time views of reactor production. Word that we could deliver applications in days rather than many months spread rapidly. Soon we were building applications throughout the company.
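The pattern described above, copying transactional data into a separate store built for queries, can be illustrated with a minimal sketch. Everything here is hypothetical: the fixed-width record layout, the field names, and the use of an in-memory SQLite database standing in for the SQL Server stores the article mentions.

```python
import sqlite3

# Hypothetical fixed-width mainframe extract:
# plant id (4 chars), date (8 chars), net MWh (8 chars)
extract = [
    "P0012004040100012500",
    "P0022004040100009800",
]

conn = sqlite3.connect(":memory:")  # stand-in for an analytic data store
conn.execute("CREATE TABLE production (plant TEXT, day TEXT, mwh INTEGER)")

for line in extract:
    plant, day, mwh = line[0:4], line[4:12], int(line[12:20])
    conn.execute("INSERT INTO production VALUES (?, ?, ?)", (plant, day, mwh))

# Analytic queries now run against the copy, not the transaction system.
total = conn.execute("SELECT SUM(mwh) FROM production").fetchone()[0]
# total == 22300
```

The design point is the separation itself: the transaction system keeps doing what it is good at, while reporting queries hit a store shaped for them.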

At Pacific Gas & Electric, the first thing I did was interview everyone in the technology department. Literally, everyone: all 1,500 people over the course of two years—ten people at a time, several times a week. The critical information there was within these people—and I needed to get it out. Their insight enabled us to improve service, as measured by executive satisfaction surveys, while reducing overall technology spending by $120 million annually. And we were finally able to replace the customer and billing system. Five previous attempts had failed, mainly because utility billing systems are as complex as any piece of code. PG&E's, with about 15,000 function points, was 35 years old and was written in Assembler and COBOL, for which much of the source code had been lost. Deregulation meant various companies were billing for generation, delivery, marketing and sales of electricity, sometimes on separate bills, sometimes as line items on one bill. Through all that, we still had to send out eight million bills a month.

-written with Joshua Weinberger

This article was originally published on 2004-04-04
eWeek
