The Need for Speed
As agencies rush to adopt the latest technology tools, they're leaving the traditional approach to information security behind.
Everywhere you look, there's evidence that the pace of innovation is picking up in the information technology industry. Smartphones, social media, cloud computing: all of these developments point to always-on connectivity and the need to move a mind-boggling amount of information among agencies and constituents.
Federal chief information officers are under more pressure than ever to use the latest devices, applications and features to help agencies accomplish their missions. More and more, time does not allow for the traditional model of defining requirements, running a pilot project and checking for security risks before full-scale deployment.
In many cases, speed to market has become the main driver of IT development. That means everything else, including security, comes second. "Consumer technologies are outpacing the government's ability to adopt them with the policy and regulatory and security framework that's in place," says John Sindelar, client industry executive with HP Enterprise Services and a former General Services Administration official.
Sindelar says the Obama administration's push to adopt social media tools, including YouTube and Facebook, as well as shared platforms such as cloud computing, has gotten ahead of any discussion of the security implications of these technologies.
Given this reality, he advocates a more proactive approach to information security based on real-time situational awareness and performance metrics rather than compliance checklists or reports. "You have to be an advocate for front-end detection of intrusions. You need to have better training. You need to trust but verify," he says. "It really is a common-sense approach."
The idea that information security should take a back seat is controversial, given the nature and volume of hacker attacks agencies face each day. But many IT specialists are operating under a new paradigm in which they focus primarily on speed, while trying to mitigate security risks.
"Government needs to be much more agile, much more proactive and much flatter, and there are a host of factors that are pushing government to change in that fashion. The push for openness and transparency is one of those," says Alan Balutis, director and distinguished fellow with Cisco Business Solutions Group and the former CIO of the Commerce Department.
"But at the same time you have the need and desire to be more secure and, the better term is, more resilient," he adds. "With security, you can't keep everything out and be perfect. The issue is when something happens, how quickly can you come back and how quickly can you cope with it."
Information First
Some IT experts argue that the traditional approach to security doesn't fit the current technology environment.
"The problem is: How do you deal with security issues when you have rapid change and multiple decision points?" says Dan Mintz, chief operating officer at technology firm Powertek and the former CIO of the Transportation Department. "We need to rethink how we deal with security, because it's not working."
According to Mintz, when balancing between sharing data and protecting it, agencies must choose sharing. He points to the intelligence agencies' Intellipedia wiki application, an open platform that allows authorized users to add and change content.
"The power that the intel community gains by sharing information is more important than the danger-the incremental risk-of losing information," he says. "If you have to choose between information sharing and information protection, the organization that wants to survive will have to choose information sharing and figure out how to be secure in that environment."
Agencies must replace the old-fashioned view of after-the-fact security reviews with upfront risk analysis, Mintz says. "Security has always been secondary, and these new technological advances are making it more secondary," he says. "I didn't say security is unimportant. Security is very important. But you have to figure out how to be sufficiently secure when you look at information sharing first."
Agencies that don't figure out how to share and protect information at the same time risk being irrelevant, Mintz warns. "For governments to be economically competitive and allow their people to be successful, they have to allow information sharing," he says.
William J. Bosanko, director of the Information Security Oversight Office at the National Archives and Records Administration, agrees. "The way we are working with information is evolving at a faster pace than the policy involved," he says. "We need to find ways to make the process more lean and agile because very often it is an impediment. And it's broader than just security. From a records management policy perspective, we want to have the policy enable rather than hinder the work of government."
Bosanko says the federal government needs to modify records management policies so agencies can be quicker to adopt new technology. "It depends on the information at issue, but in general, I think if you look at this from the broadest possible perspective, there's greater harm to not advancing openness and greater accountability," he says.
The Archives, for example, is reviewing how agencies handle information characterized as Controlled Unclassified, so it can streamline and standardize the process of marking documents as sensitive. Agencies currently use 100 different markings to restrict information. Bosanko leads a task force that is developing a common framework to make it easier to release information.
Having so many markings "clearly is not efficient and clearly is not effective," Bosanko says. "The issue is how to change the policy across many agencies that have different procedures, habits and cultures . . . to come up with solutions that are scalable."
Bosanko recommends that agencies adopt a risk management approach to information sharing. They need to set a much higher bar for Classified information than they do for Controlled Unclassified information, he says. But they also need to be willing to accept an occasional leak. "You can't always achieve perfection," he says. "Moving toward that goal is better than being paralyzed by the need to address every possible scenario."
The Defense Department's approach, under which need-to-share has replaced need-to-know in the dissemination of information, should spread to civilian agencies, Bosanko argues. "The paradigm of a need-to-share kind of environment does introduce risk, but it's got to be balanced against the risk of not sharing information or not advancing technology," he says.
But in the absence of specific policies establishing that balance, government officials tend to be leery of sharing information. "The system is risk-averse," Bosanko says. "The line person in an agency is looking at all the security requirements that are out there and looking at all the situations where information went somewhere it wasn't supposed to, and they don't want to be responsible for something going somewhere it shouldn't. It does have a chilling effect."
Security First
Even in a world that emphasizes speed and information sharing, some IT experts say security must come first, just not in the traditional way.
"Right now, when a federal CIO says 'security,' it means he has to wait for a team of consultants to write a report before he can deploy a new application," says Alan Paller, director of research at the SANS Institute. "The report says a whole lot of boilerplate that makes the consultant between $50,000 and $5 million. It has no impact on the security of the system, but it slows everything down."
Paller says agencies need a new model, in which security is designed into applications first rather than tacked on at the end. "There is a way to bake security in at the beginning without causing it to delay deployment," he says. "In this way, security comes first. But security reports will always come second."
The trick, Paller notes, is to stop wasteful tasks such as gathering data and filing reports, which verify whether agencies are complying with federal information security requirements but don't necessarily show that their security systems work. "The security guy's job is to make the cost of security close to zero and the inconvenience negligible," Paller says. That rules out any discussion of trade-offs.
As agencies rapidly ramp up applications using open source code and out-of-the-box development tools, they must take security into account to avoid putting constituents' personal information at risk, says Ray Bjorklund, senior vice president at FedSources. "Security has to be extremely important," he says, arguing that security standards should be set at the beginning of the development process. Then operations teams can figure out the specific information that must be secured.
New Approach
All sides agree the federal government needs a new approach to security. The 2002 Federal Information Security Management Act sets basic standards for government networks and requires agencies to submit reports on their compliance with processes to check for system vulnerabilities. But many say it has become little more than a paper-pushing exercise, and that agencies should instead focus on developing mechanisms to detect and respond to attacks, along with metrics to measure the success of the response.
IT specialists back FISMA's goal, but not the way the regulation has been implemented.
"The conventional wisdom is getting away from FISMA being a policy, reporting and compliance type of thing," Balutis says. "It has not done what it should to actually increase security."
Balutis says security should be at the forefront, but that procedures must accommodate rapid development. Real-time security analysis is "a better way to achieve that and to provide some certification and assurance without being so consumed in paperwork, reporting and compliance that you don't have money at the end to do what you need to do to make yourself secure," he says.
The Obama administration and Congress are pushing for changes to FISMA that would require vendors to build security into federal IT systems at the front end. The 2010 Federal Information Security Amendments Act would reduce the reporting burden on agencies and require them to deploy automated tools to measure the vulnerability of their networks.
One fan of this change is federal Chief Information Officer Vivek Kundra.
"Recognizing that the best security is baked in to information technology investments and not added in separately or well after the investments have been deployed, OMB needs to determine where, in the life-cycle development of systems, agencies are spending their resources," he told the House Oversight and Government Reform Subcommittee on Government Management, Organization and Procurement in March.
Don't Blame the CIO
Regardless of where they stand in the debate over speed versus security, observers say CIOs should not be solely responsible for making trade-offs. Accountability, they say, should span job titles from top to bottom at agencies.
"Whether security can come second, that's up to the folks that are delivering the mission. It's not up to the tech community to evaluate the appropriate level of risk," says Sam Chun, cybersecurity director for HP Enterprise Services.
According to Chun, as agencies adopt open source code to rapidly develop applications, they need to weigh the business risks of deploying an application more slowly against the consequences of accidentally leaking information. "Interoperability, cost, risk and mission value. You need to balance these four things," he says.
It's impossible to consider all the possibilities before launching a new application, says Tom Hughes, director of strategic services at Computer Sciences Corp. and a former CIO of the Social Security Administration. "If you wait until you address 100 percent of all the security risks, or even a high percentage of all the security risks, your project is not going to come online," he says.
Security breaches don't happen just with Web-based applications, Hughes notes. He points to an electronic application that allows citizens to apply for unemployment or disability benefits. How can the agency know the person is who he says he is? You could ask the same question about a paper form.
"One of the challenges with the Internet is the volume of transactions, the mass amounts of data involved," he says. "That's where it takes information security to another whole level." All top executives-not just the CIO-must understand the dangers, Hughes says.
"By putting some of this stuff online . . . there's always the risk of exhibiting a Social Security number accidentally," he says. "Agencies need to come up with new risk mitigation models to understand the trade-offs."
"You can't just blame the CIO and the IT people for IT security policy," Hughes says. "You have to push it down into the business."
Carolyn Duffy Marsan is a high-tech business reporter based in Indianapolis who has covered the federal IT market since 1987.
Intellipedia
The intelligence community has used this wiki-based system for sharing Top Secret, Secret and Sensitive-But-Unclassified information since a prototype was created in April 2006.
Time to market: 6 months
Rapid Access Computing Environment
The Defense Information Systems Agency upgraded its cloud-based RACE environment from a self-provisioned testing service for military developers to a production-ready environment that was released in October 2009.
Time to market: 15 months
Recovery.gov
The Obama administration's official Web site for tracking spending related to the 2009 American Recovery and Reinvestment Act went online in February 2009, the same day the law was signed.
Time to market: 1 month
Data.gov
The General Services Administration launched this collection of machine-readable government data available from executive branch agencies in May 2009.
Time to market: 4 months
Apps.gov
In September 2009, the General Services Administration launched this portal for purchasing cloud-based applications, including business, productivity and social media tools.
Time to market: 9 months
Climate.gov
In February 2010, the National Oceanic and Atmospheric Administration unveiled this portal to display all of its data, products and services related to global climate change.
Time to market: 6 months