Feds Search for Efficiency
Federal government agencies are under constant pressure to keep costs down and run processes more efficiently, so many are deploying IT initiatives to help deliver cost-effective operations that are sure to please taxpayers.
“The federal government is challenged by significant budget constraints that will necessarily impact the role of IT in almost every agency,” says Andrea Di Maio, vice president and distinguished analyst at research firm Gartner, who covers government IT.
“This is similar to what happened in some of the states, such as Florida and California, after 2008, and to what is likely to happen, albeit with greater severity, in Europe as a consequence of the situation with sovereign debts,” he says. “IT will have to demonstrably contribute to increasing productivity and to ensuring the sustainability of government services and operations.”
At the same time, Di Maio adds, IT needs to remain affordable, so cost optimization will be essential. This can be achieved through better acquisition and delivery models, as well as tighter project and risk management.
“This is a very different scenario than the ones we have seen in the past, where the focus was either on reducing IT costs or investing in IT to reduce business costs,” he says. “The future is about doing both at the same time.”
EPA’s Virtualization Efforts
Some federal agencies, such as the U.S. Environmental Protection Agency, have launched virtualization efforts to improve IT efficiency. The EPA is a physically decentralized organization with 25 major facilities, including its headquarters in the Washington, D.C., area, 10 regional offices and 13 major research centers.
The agency operates 78 data centers and server rooms located in 66 buildings across 48 cities in 31 states and territories. Since 2007, the EPA has taken a series of “optimization initiatives” to consolidate data centers, use industry best-management practices and deploy virtualization across its IT infrastructure.
“Virtualization is a key component of the EPA’s effort to reduce total server count by almost 50 percent,” says David Updike, acting director of the EPA’s National Computer Center. “Virtualization provides the foundation for resource sharing and consolidation within server rooms and across data centers.”
In 2009, the EPA began migrating its x86-64 servers to virtualized platforms, including VMware, Dell servers and a variety of storage platforms, such as HP 3PAR, EMC and EqualLogic. “These virtualization efforts are paired with infrastructure-refresh efforts so they can be financed within existing operating budgets to maximize return on investment,” Updike says.
The EPA has achieved substantial gains in virtualization. Nine percent of its physical servers are virtual machine hosts, and 32 percent of its servers are virtual machines. By 2015, the agency plans to increase virtual hosts to 30 percent of physical machines, with 60 percent of its servers operating as virtual machines.
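The scale of those targets can be illustrated with back-of-the-envelope arithmetic. The fleet size below is a hypothetical round number for illustration only; the article reports only the percentages:

```python
# Illustrative consolidation math based on the percentages cited above.
# The fleet size is a hypothetical round number, not an EPA figure.
physical_servers = 1000

# Current state: 9 percent of physical servers act as VM hosts.
current_hosts = int(physical_servers * 0.09)

# 2015 target: 30 percent of physical machines serving as VM hosts.
target_hosts = int(physical_servers * 0.30)

# If 60 percent of all servers are virtual, the remaining 40 percent are
# physical, so there are 1.5 virtual machines per physical server on average.
vm_to_physical_ratio = 0.60 / 0.40

print(current_hosts, target_hosts, vm_to_physical_ratio)
```

The jump from 9 percent to 30 percent of physical machines hosting VMs is what allows the nearly 50 percent cut in total server count that Updike describes.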
Another key element of the consolidation effort is network optimization, because bandwidth is a critical risk factor for server migration, Updike says. The agency’s network optimization project involved moving its WAN and Internet access services to commercial cloud services provided under the U.S. General Services Administration’s (GSA) Networx contract. The EPA completed the initial transition in March 2011, and it continues to expand its use of those cloud services.
Another effort involves email optimization, which will consolidate email from more than 180 Lotus Notes servers distributed across 45 locations into a private cloud infrastructure across the agency’s four primary data centers. The foundation infrastructure for this setup is VMware clusters hosted on Dell servers with HP 3PAR storage, EMC Data Domain backup, Cisco switching and F5 BIG-IP load balancing.
“This initiative modernizes, standardizes and improves EPA’s email service,” Updike says, adding that it will also result in substantial reductions in servers, storage and energy consumption, while facilitating migration to external cloud services beginning in 2014. When completed, the consolidation effort will eliminate 150 physical servers, reduce storage by 50 percent and eliminate an estimated 350,000 kWh a year in energy usage.
The EPA has also launched enterprise continuity of operations (COOP) and disaster recovery (DR) initiatives, which provide remotely accessible data and applications to support continued operations and emergency response to EPA regions and field offices. The goal of these projects is to provide for COOP and DR using shared services hosted in the four primary data centers.
COOP and DR services are currently provided using site-specific solutions, Updike says. He notes that the initial provisioning of enterprise COOP and DR at the four primary data centers by 2013 is a key component of the agency’s data center optimization and server reduction strategies.
Turning to the Cloud
Meanwhile, other government entities have begun migrating to cloud-based services in an attempt to reduce costs and improve agility. Idaho National Laboratory (INL), a U.S. Department of Energy nuclear research and development facility, with help from IT-services provider Unisys, will soon begin transitioning thousands of users to the cloud-based Google Apps for Government messaging service.
“We want to be flexible, nimble and cost effective as we develop capabilities for lab employees,” says CIO Brent Stacey. “The lab’s goal is to invest in a capability rather than a technology. We don’t want to manage the infrastructure.”
INL’s move to a managed service is intended to support the organization’s future use of messaging and video conferencing and potentially create new ways to collaborate, Stacey says. The move is also intended to support redundant backup efforts.
But moving to the cloud doesn’t come without difficulties. “As we began this journey, we recognized the fact that there would be a number of challenges to overcome as we moved toward a cloud-based solution,” Stacey says. These include cultural change and the need to address security issues. INL has formed a cross-organizational team to address these challenges.
INL is also using social media tools—such as Flickr, Facebook, LinkedIn, Twitter and YouTube—for recruiting, media communications and engaging the public. “The lab is looking at the future of messaging, which means adoption of a mobile environment that supports a flexible workforce, rather than a static infrastructure,” Stacey says.
IT initiatives are also creating new efficiencies at the Department of Defense’s (DoD) Defense Information Systems Agency (DISA), a combat support agency that provides network, computing infrastructure and enterprise services to support information sharing and decision making among U.S. military forces.
To that end, DISA has developed the National Senior Leadership Decision Support Service (NSLDSS), a secure, Web-based thin-client system that provides senior leaders in the DoD with rapid access to, and visualization of, data. The tool—JackBe Presto, a mashup service for composing services and aggregating data—allows users to more easily collaborate and make informed decisions related to a variety of missions and events, says Carlos Vera, advanced concepts engineering chief at DISA.
The system includes open-source service infrastructure implementations, such as the Attribute-Based Access Control (ABAC) service. It leverages existing DoD services, such as Net-Centric Enterprise Services’ (NCES) messaging and the National Geospatial-Intelligence Agency’s mapping services.
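Attribute-based access control grants or denies a request by evaluating attributes of the subject and resource against policy rules, rather than relying on fixed role lists. A minimal sketch of the idea follows; the attribute names and sample policy are hypothetical illustrations, not the schema of the DoD’s actual ABAC service:

```python
# Minimal attribute-based access control (ABAC) sketch.
# Attribute names and the sample policy are hypothetical, for illustration.

def abac_decide(subject, resource, action, policies):
    """Return True if any policy rule permits the request."""
    for rule in policies:
        subject_ok = all(subject.get(k) == v for k, v in rule["subject"].items())
        resource_ok = all(resource.get(k) == v for k, v in rule["resource"].items())
        if subject_ok and resource_ok and action in rule["actions"]:
            return True
    return False  # deny by default

policies = [
    {
        "subject": {"clearance": "secret", "role": "analyst"},
        "resource": {"classification": "secret"},
        "actions": {"read"},
    },
]

user = {"clearance": "secret", "role": "analyst"}
doc = {"classification": "secret"}

print(abac_decide(user, doc, "read", policies))   # permitted by the rule
print(abac_decide(user, doc, "write", policies))  # no rule allows write
```

The deny-by-default loop is the essential property: access is granted only when attributes match an explicit policy rule.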
The NSLDSS services run on virtual machines with a Red Hat Enterprise Linux operating system. They are load-balanced between several Defense Enterprise Computing Centers (DECCs) to ensure the constant availability of services.
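Balancing a service across multiple data centers for constant availability typically pairs a distribution policy with health checks, so traffic is steered away from an unavailable site. A simplified round-robin sketch follows; the site names and policy are illustrative placeholders, not actual DECC locations or DISA’s implementation:

```python
# Simplified health-check-driven load balancing across data centers.
# Site names and the round-robin policy are illustrative placeholders.
import itertools

class SitePool:
    def __init__(self, sites):
        self.health = {site: True for site in sites}
        self._cycle = itertools.cycle(sites)

    def mark(self, site, healthy):
        """Record the result of a health check for a site."""
        self.health[site] = healthy

    def next_site(self):
        """Return the next healthy site in round-robin order."""
        for _ in range(len(self.health)):
            site = next(self._cycle)
            if self.health[site]:
                return site
        raise RuntimeError("no healthy sites available")

pool = SitePool(["decc-east", "decc-west", "decc-central"])
pool.mark("decc-west", False)  # simulate a site outage

# Requests are distributed across the remaining healthy sites.
assigned = [pool.next_site() for _ in range(4)]
print(assigned)
```

Because unhealthy sites are simply skipped, service continues without interruption as long as at least one data center remains available.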
DISA deployed the system to meet the growing need for rapid access, visualization and mashup of data required for operational decisions. “The Defense Department needed a way to make sense of this data in supporting senior decision-makers during events of national significance,” Vera says. “DoD senior leadership required a dynamic situational awareness solution. This [system] can dynamically aggregate data into a user-defined operational picture of an event to be shared with operations centers.”
NSLDSS can now deliver real-time information, “resulting in improved military response times, improved global situational awareness and decision quality, and enhanced operational effectiveness,” Vera says. In addition, the virtualization of these services has allowed DISA to quickly deploy updates without interrupting service, to provide greater availability, and to more easily monitor services to meet end-user needs.
Pursuit of Value
Federal agencies will continue striving to meet user needs while keeping IT costs under control. Di Maio of Gartner expects to see a “more mature approach” toward the adoption of cloud computing, driven less by the need to comply with federal mandates and more by the pursuit of value and reduction in the cost of IT service delivery.
“There will be greater focus on information management and social network analysis to extract value from information in order to better close the productivity gap, as well as greater focus on collaboration across agencies and with external stakeholders as a way to find more effective solutions to problems under significant resource constraints,” Di Maio says.
There will also be much more attention paid to the use of open data, which so far has served mostly to comply with the open government directive across agencies, as well as to the use of social media, both externally and internally, Di Maio adds.
“The adoption of consumer technology—such as end-user devices, some mobile apps and social media platforms—will increase to more effectively support teleworking, but also to empower employees and encourage them to become active participants in the innovation process that is required to ensure government sustainability,” Di Maio concludes.