Regulation Redux

Are you ready to declare your company secure against attacks from cyberterrorists?

If you’re not, get moving. The odds are increasing that in the not-so-distant future, legislators will make corporate America adhere to yet-to-be-defined best practices in cybersecurity.

Just as the Sarbanes-Oxley Act of 2002 is designed to assure investors that financial records of a corporation are properly prepared and accurate, and the Health Insurance Portability and Accountability Act mandates better procedures for maintaining and exchanging information on medical patients, the processes by which you secure your data and computing resources may be the next facet of your operations to face compliance legislation.

Rep. Adam Putnam (R-Fla.) last fall drafted the Corporate Information Security Accountability Act of 2003, which would require companies to button down their information systems. The bill has not yet gone before the House of Representatives, but many of the proposals in Putnam’s draft as well as other recommendations are being batted about in a working group created by the subcommittee Putnam chairs, the Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census.

In the name of protecting national infrastructure, you may be asked to conduct annual security audits, produce an inventory of key assets and their vulnerabilities, carry cybersecurity insurance and even have your security measures verified by independent third parties, if the core features of the proposed legislation make it to the floor of the House.

The work is proceeding. In April, the working group submitted 23 suggestions to the subcommittee, including a provision that would shield companies from large, punitive lawsuits over security breaches.

What’s at stake, in Putnam’s view, is domestic security. Not only could terrorists take down your systems, they could also use your computing resources to attack federal, state and local computer networks.

Putnam’s subcommittee resides under the Committee on Government Reform, headed by Rep. Tom Davis (R-Va.), who sponsored the Federal Information Security Management Act of 2002 (FISMA), which requires federal agencies to identify information security risks and fix problem areas.

Putnam’s staff is evaluating the proposals and will kick them back to the working group, which includes representatives of 22 trade associations, ranging from the National Association of Manufacturers (NAM) to the Business Software Alliance to the U.S. Chamber of Commerce. Few of the members have actually implemented technology systems and security controls.

No timeline has been set on when legislation could reach the House floor, but Chrisan Herrod, a professor at the National Defense University, a joint military-educational facility, says she doesn’t expect Putnam to push a bill before the November election. Putnam says his timeline is “short,” but doesn’t define it. Meanwhile, groups such as the Information System Security Association (ISSA) are working to identify best-practices guidelines for corporations, and hope companies will adopt them out of self-interest.

But, if they don’t, “we reserve our right to legislate,” says Bob Dix, staff director of Putnam’s subcommittee. “What did it take to get corporate America motivated about Y2K? It took a Securities and Exchange Commission requirement to include a readiness statement in the annual report.”

Herrod, along with security experts such as Darwin John, the former chief information officer of the Federal Bureau of Investigation, sees more regulation as inevitable. Why? Corporations aren’t going to voluntarily adopt best practices and revamp security systems when the returns on investment are murky.

At Fannie Mae, Herrod helped ensure that the mortgage company matched its business partners’ compliance with the Gramm-Leach-Bliley Act of 1999, which requires financial data privacy. At GlaxoSmithKline, her projects revolved around compliance with Food and Drug Administration rules. “The only reason I got any money to implement was regulation,” Herrod says.

Putnam’s effort is the latest to beef up the nation’s cybersecurity. President Clinton issued a directive on information security in 1998, outlining basic requirements such as antivirus protection and authentication. President Bush followed up with a plan that urged a public-private partnership to secure the Internet. That plan, penned in 2003 by Richard Clarke, former special advisor to the president for cyberspace security, has had little impact so far.

Meanwhile, cybersecurity is getting worse. In the last six weeks, source code from Cisco Systems was leaked on the Internet, the Sasser worm wreaked havoc on corporate systems and Gartner reported that consumers lost $1.2 billion in 2003 due to “phishing attacks.”

Despite the lack of success from the government’s previous plans, security experts are taking Putnam’s legislation push seriously because Congress was able to pass FISMA two years ago. Why not expand a cybersecurity edict to the private sector? “Ultimately the government is going to have to stand up and have clear requirements,” says AMR analyst Lance Travis, adding that the private sector is unlikely to follow information best practices in unison because of costs.

Clarke says he doesn’t favor additional regulation to govern cybersecurity, but would like current mandates to be more specific on information security. He also advocates a series of steps—avoid software vendors with insecure applications, require two-factor authentication, benchmark the security of applications, diversify software vendors, and so on—that both the public and private sectors can take.

In any case, the clock is ticking. Recent cyber-attacks will only get worse unless the public and private sectors cooperate to beef up information security. One problem: Companies don’t consider their networks part of the national infrastructure. Since all networks are interconnected, however, technology executives need to realize that their corporate networks could easily become a staging area for a cyberterrorism attack.

“What we see today is the tip of the iceberg of what could happen if a terrorist set out to do something,” says Clarke. “As long as [an attack] is possible, you run the risk that somebody will do something more significant.”

So what can you do today to get ahead of a cybersecurity regulation?

For starters, track developments from Putnam’s subcommittee (http://reform.house.gov/TIPRC/). Under the draft’s key provisions, companies would be required to:

  • Perform a security audit to assess the risk of unauthorized access, disruption, modification and destruction of information and information systems.
  • Investigate cyber-risk insurance. Putnam says the insurance industry should cut prices for companies that meet best practices.
  • Take an inventory of critical infrastructure assets such as stray routers, servers and areas where there’s easy access to networks. Herrod says inventory is the most underrated security chore.
  • Develop risk mitigation, incident response and business continuity plans, and test these procedures quarterly to annually, depending on best practices for each area.
  • Submit to an information security audit by an independent third party.

Four of these five practices are considered by security experts to be no-brainers. The final one—an information security audit—could be stickier. For starters, it’s unclear whether a newly created or existing agency would oversee the audits. Putnam’s draft puts information security verification under the SEC, but analysts such as Forrester Research’s Michael Rasmussen say such monitoring is “out of scope” for the agency.

According to David Peyton, director of technology policy for NAM, the biggest issue surrounding any cybersecurity legislation is the lack of generally accepted practices. “Computer security audits are 80 to 90 years behind financial audits,” Peyton says.

Some of the minimal best practices listed by ISSA include setting up a security policy with baseline expectations for security procedures and guidelines, establishing accountability for information access, cataloging types of information and correlating the level of risk with the value of the data.