First, Re-evaluate Existing Processes

By Thomas Boyce  |  Posted 2008-02-21
When Thomas Boyce arrived at the National Institutes of Health in May 2005
as a senior fellow at the Council for Excellence in Government and program manager for the Electronic Research Administration, he faced the monumental task of transforming a paper-based organization that processed up to 3 billion forms a year into a humming digital machine. Achieving that goal required major cultural changes as much as it did adopting and developing new technologies. Boyce reflects upon his challenges and resolutions for taming the paper beast.  

In 2004, the NIH noticed that substantial increases in the eRA program budget were not yielding expected results. In fact, the frequency of both missed software delivery deadlines and system outages was increasing at a disturbing rate.

Despite these concerns, senior management announced in August 2005 that NIH would lead the government’s migration to an all-electronic grant-application process. This new system, one piece of the larger eRA program, was dubbed eSubmission. It required converting our systems from a paper-based process, which relied on data entry and scanned applications, to a system that would accept the equivalent of billions of pieces of paper annually via an XML data stream.
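To make the shift concrete, the core of such a system is accepting an application as structured data rather than scanned paper: parse the incoming XML and verify required fields before routing it on. The sketch below is illustrative only; the element names are hypothetical and not the actual eSubmission schema.

```python
# Hypothetical sketch of validating an electronically submitted application.
# Field names are invented for illustration, not the real NIH schema.
import xml.etree.ElementTree as ET

REQUIRED_FIELDS = ("title", "principal_investigator", "institution")

def validate_application(xml_text: str) -> list:
    """Return the required fields missing from the application (empty means acceptable)."""
    root = ET.fromstring(xml_text)
    return [field for field in REQUIRED_FIELDS if root.find(field) is None]

sample = """<application>
  <title>Study of X</title>
  <principal_investigator>Dr. Smith</principal_investigator>
</application>"""

print(validate_application(sample))  # "institution" is missing from this sample
```

The point of the design is that a malformed or incomplete submission can be rejected automatically at intake, whereas a scanned paper form required manual data entry before any such check was possible.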

My staff assured me they had been planning for years for the advent of electronic applications. In fact, the system had already accepted almost 200 applications electronically via a pilot program. Nevertheless, with the first production round of 1,800 or more applications only three months away, bugs were still common and substantial portions of software required rewriting.

The need for a fundamental process change quickly became evident. At one of our weekly status meetings, I asked about interdependencies and the master project schedule. After the meeting, a lead staff member took me aside and whispered, “We don’t have an integrated project schedule. In fact, we don’t really have individual project schedules. We create these status reports from staff input on a weekly basis.” When I asked how this was possible, given the size and complexity of the program, the staff member said he had asked the same question when he joined the program, and that I would “get over it,” as he had.

That’s when I began to examine program procedures and delve into the processes being followed. I discovered that the program did have a procedures guide, but there were two major problems. One, it was overly complex. Two, it existed mainly as shelfware: Few people were aware it existed, and no one used it regularly.

I immediately implemented a simple yet effective risk management approach using Excel spreadsheets: each risk was ranked by probability and likely impact, each rated high, medium or low. It was essential to engage the staff in identifying areas that needed immediate attention and to introduce more structured approaches to running the program.
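The ranking scheme described above amounts to a classic probability-times-impact risk matrix. A minimal sketch of the idea follows; the risk names and scoring weights are illustrative assumptions, not the contents of the actual spreadsheets.

```python
# Illustrative risk register: rate probability and impact high/medium/low,
# combine them into a score, and surface the highest-scoring risks first.
# Entries and weights are hypothetical examples.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_score(probability: str, impact: str) -> int:
    """Combine the two high/medium/low ratings into one sortable score."""
    return LEVELS[probability] * LEVELS[impact]

def rank_risks(risks):
    """Order risks so the items needing immediate attention come first."""
    return sorted(
        risks,
        key=lambda r: risk_score(r["probability"], r["impact"]),
        reverse=True,
    )

register = [
    {"name": "missed release deadline", "probability": "high", "impact": "medium"},
    {"name": "outage during submission round", "probability": "medium", "impact": "high"},
    {"name": "staff turnover", "probability": "low", "impact": "medium"},
]

for risk in rank_risks(register):
    print(risk["name"], risk_score(risk["probability"], risk["impact"]))
```

The virtue of this approach is exactly its simplicity: a three-by-three ranking is coarse, but it gives a large staff a shared vocabulary for deciding what to fix first without requiring any new tooling beyond a spreadsheet.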

In addition, I announced that, within several months, all staff members were to begin using Microsoft’s Enterprise Project Server to record the time they spent on various projects and to verify contractor invoices against tasks and time recorded. I was struggling to identify all the program activities, and felt we could take a huge step in the right direction by getting all 300-plus employees to record their activities.

Contractors helping us implement this new policy said my execution timetable was unreasonable. They argued that they needed time to identify the most likely activities across the entire program and develop project templates before we could move forward. Nevertheless, we managed to produce the first iteration of Project Server on time.

Furthermore, I placed limits on the scope and schedule of our software releases. Until this point, my team and the NIH had boasted about their ability to release new software daily, claiming that this release rate reflected their flexibility and agility. I was convinced—and was later proven correct—that this daily rate was symptomatic of the lack of process control.

We used the eSubmission project as a pilot for the implementation of these new procedures. Although the scope of the software revisions was extensive, the project team quickly began to see fewer bugs and was forced to deal with far fewer changes. The success of the eSubmission project led other departments to begin adopting some of the new practices.



Boyce is senior fellow at the Council for Excellence in Government and program manager for the Electronic Research Administration. He is transitioning to a new job as deputy chief information officer and director of the Office of Information Systems at the Nuclear Regulatory Commission.
 
 
 
 
 
 
