Authentication and Identity Management

Introduction

More than 20 years after publication of the dog-on-the-internet cartoon, identity verification and authentication still present major challenges to large organizations, including federal agencies.

If anything, these challenges have become more difficult. That’s attributable to several factors.

• Organizations are simply doing more – and more critical – functions online. A far larger percentage of transactions, both external and internal, take place in a paperless environment. Internally, for instance, functions such as human resources, finance, contract management, and case management have mostly gone online. Externally, there’s exponentially more online interaction with constituents such as tax filers, Social Security beneficiaries and regulated entities.
• Ubiquitous computing has taken hold, with users accessing agency resources from all manner of mobile devices powered by a variety of operating systems.
• Agencies are operating more hybrid environments, with virtual workloads spread across in-house and external cloud data centers that might be geographically dispersed and accessed on more than one network.


It all points to the need for an updated approach to identity and access management that stays flexible, incorporates mobility and cloud computing, and takes the user need-to-know model to a much more granular level.

What’s at stake

Overlaying the new computing environment is the increasing sophistication of cyber criminals. Motivated by money, ideology, or the desire to undermine national security, they realize the extent and richness of data held by the federal government. These actors may operate from China or an Eastern European nation, beyond the reach of federal law enforcement. Or they might be secretly disgruntled employees down the hall.

Clearly, new thinking is required for how to best apply some of the tried-and-true technologies and strategies for user identification and authentication. Passwords, multi-factor authentication, public-key encryption – they all have their place. But agency security and IT staffs need to make sure they’re deployed in the best way and using the most up-to-date iterations. All without driving users nuts, or their productivity to zero.

At a recent roundtable discussion hosted by Federal News Radio and sponsored by RSA and Advanced Computer Concepts, federal panelists acknowledged the criticality of federal information and the need to safeguard it. Panelists were:

  • Kirit Amin, chief information officer of the International Trade Commission;
  • Chris Boehnen, program manager at the Intelligence Advanced Research Projects Activity, Office of the Director of National Intelligence;
  • Kevin Brownstein, general manager for Federal Systems Engineering at RSA;
  • Tom Clancy, identity and access management lead for the Defense Department’s CIO office;
  • Zach Goldstein, CIO of the National Oceanic and Atmospheric Administration;
  • Curtis Levinson, U.S. cyber defense advisor to NATO;
  • Stephen Ellis, public sector marketing leader for RSA.

For federal agencies, an important goal of cybersecurity is making critical data both safe and accessible. Rare is the situation where data gathered is kept and used by only a single agency.


“Life and property are at stake in our systems,” says NOAA’s Goldstein. His challenges center on maintaining data integrity in times of emergency, giving access to legitimate first and second responders while keeping data safe from alteration. For example, denial of service or alteration might hinder a tsunami warning.

In NATO’s case, the security of 28 nations – with varying degrees of technical sophistication – depends on systems availability and integrity. Its crucial networks are air-gapped from the internet, Levinson says.

Even at small agencies like the ITC, Amin says, management needs to commit to an up-to-date cybersecurity program. Even though ITC is not a cabinet-level entity, he says it was important to him on arrival to ensure the commission adopted the standards of Homeland Security Presidential Directive 12. A new version of the nation’s harmonized tariff system incorporates embedded two-factor login authentication.

A modern approach to two-factor

But what of two-factor authentication? Its efficacy depends on what the two factors actually are. As Clancy points out, some factors may resist guessing by cracking programs but carry the danger of what he called “replayability.” That is, if a fingerprint or other biometric is reduced to a hash, it retains its uniqueness but remains subject to exfiltration just like any other stored data.

“If you can grab the hash, you can do whatever you want,” Clancy says.
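
Clancy’s point can be illustrated with a toy verifier. The names and data below are hypothetical and the model is drastically simplified, but it shows why a protocol that accepts a stored hash is replayable, while a stolen password at least remains changeable:

```python
import hashlib

# Toy verifier: stores only the SHA-256 hash of an enrolled fingerprint
# template. (Hypothetical, simplified model -- real biometric systems
# are far more complex.)
STORED_HASH = hashlib.sha256(b"alice-fingerprint-template").hexdigest()

def verify_fingerprint(template: bytes) -> bool:
    """Legitimate path: hash a freshly captured template and compare."""
    return hashlib.sha256(template).hexdigest() == STORED_HASH

def verify_hash_directly(submitted_hash: str) -> bool:
    """Flawed path: a protocol that accepts the hash itself can be replayed."""
    return submitted_hash == STORED_HASH

# A real capture authenticates...
assert verify_fingerprint(b"alice-fingerprint-template")
# ...but an attacker who exfiltrated STORED_HASH needs no finger at all.
assert verify_hash_directly(STORED_HASH)
```

And unlike a password, the underlying fingerprint can never be rotated after the hash leaks.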

That can be a worse problem for a user and the organization: while you can change a password, you can’t change a fingerprint or iris pattern. A better place to start, he says, is a public key encryption setup combined with a PIN or password, as with DoD’s common access card, to yield a cryptographically unique authentication each time. Clancy stresses the importance of an external, third-party PKI to better ensure interoperability within the federal government and with external partners.
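
The property Clancy describes — a fresh, unique authentication each time — comes from challenge-response. A real CAC signs a server-issued challenge with a private key on the card; the sketch below substitutes an HMAC over a shared secret so it stays stdlib-only, and all names are illustrative:

```python
import hashlib
import hmac
import secrets

# Simplified challenge-response sketch. A shared secret and HMAC stand in
# for the smart card's private-key signature; the anti-replay logic is the
# same: every login answers a different random challenge.
SHARED_SECRET = secrets.token_bytes(32)  # stands in for the card's key

def issue_challenge() -> bytes:
    """Server sends a fresh random nonce for every login attempt."""
    return secrets.token_bytes(16)

def card_respond(challenge: bytes) -> bytes:
    """'Card' side: proves key possession without revealing the key."""
    return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

c1, c2 = issue_challenge(), issue_challenge()
r1 = card_respond(c1)
assert server_verify(c1, r1)      # fresh response succeeds
assert not server_verify(c2, r1)  # replaying r1 against a new challenge fails
```

Because the response is bound to a one-time challenge, capturing it on the wire buys an attacker nothing the next time.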

Boehnen of IARPA points out that any user ID and authentication system must balance usability with security. For example, the still-academic field of homomorphic encryption, in which data is encrypted not only at rest and in motion on the network, but also when used by a computational process, can potentially solve the exfiltration threat and keep cloud-executed processes private. But its computational intensiveness makes it a practical impossibility for now.
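
The core idea — computing on data without decrypting it — can be shown with a toy. Textbook RSA without padding happens to be multiplicatively homomorphic, so the snippet below (using the classic insecure textbook parameters, purely for illustration) lets a server multiply two ciphertexts “in the blind”:

```python
# Toy demonstration of a homomorphic property: unpadded RSA is
# multiplicatively homomorphic, so ciphertexts can be multiplied without
# decryption. Illustrative only -- tiny primes and no padding are completely
# insecure, and practical homomorphic schemes are far heavier, which is
# exactly Boehnen's computational-cost caveat.
p, q = 61, 53
n, e, d = p * q, 17, 2753  # classic textbook RSA parameters

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 9
blind_product = (enc(a) * enc(b)) % n       # server never sees a or b
assert dec(blind_product) == (a * b) % n    # yet the result decrypts to 63
```

Fully homomorphic schemes extend this to arbitrary computation, at the computational cost Boehnen notes.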

He says agencies should view identity and access management through a three-way lens – people, access control and monitoring. That suggests techniques such as presentation attack detection (is it a real fingerprint and not a Gummy Bear?), monitoring abnormal network behavior more carefully, and otherwise encrypting data to limit wholesale losses from successful breaches. Agencies must also consider more fine-grained, hierarchical approaches to user access than most typically apply now.
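
The monitoring leg of that three-way lens can be as simple as comparing a user against their own baseline. The sketch below is a minimal, invented example — flagging a day whose download volume deviates sharply from a user’s history — not a description of any agency’s tooling:

```python
from statistics import mean, stdev

# Minimal behavior-monitoring sketch: flag a user whose daily download
# volume deviates sharply from their own baseline. Data and the z-score
# threshold are illustrative.
def is_anomalous(history_mb: list, today_mb: float, z_cut: float = 3.0) -> bool:
    mu, sigma = mean(history_mb), stdev(history_mb)
    if sigma == 0:
        return today_mb != mu
    return abs(today_mb - mu) / sigma > z_cut

baseline = [40.0, 55.0, 38.0, 60.0, 47.0]
assert not is_anomalous(baseline, 52.0)  # within the user's normal range
assert is_anomalous(baseline, 900.0)     # a wholesale pull stands out
```

Real deployments layer many such signals, but the principle is the same: breaches often look normal to the perimeter and abnormal against the user’s own history.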

RSA’s Brownstein says the SecurID Access product – its hardware token-based two factor authentication system – remains relevant and in wide use but continues to evolve. The company’s SecurID Suite product family builds on this with features designed for mobility. For example, at the lower-security end, a mobile user might use her password, token second factor, and then a shake of the device to indicate the person is in fact in possession of the smart phone.

Brownstein explains that the process of authentication step-up can, for example, require the user to state the one-time password aloud alongside entering it. Location services and geographic information can also strengthen authentication, flagging conditions such as a user password entered in a different location than the mobile device.
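
Step-up authentication of the kind Brownstein describes amounts to a risk policy: as suspicious signals accumulate, more factors are demanded. The rules, signal names and factor names below are invented for illustration, not RSA’s actual product logic:

```python
# Hedged sketch of step-up authentication: require stronger factors as risk
# signals accumulate. All rule names and thresholds are hypothetical.
def required_factors(password_location: str, device_location: str,
                     resource_sensitivity: str) -> list:
    factors = ["password", "token"]           # baseline two-factor
    if password_location != device_location:  # password typed far from phone
        factors.append("spoken_otp")          # step up: say the OTP aloud
    if resource_sensitivity == "high":
        factors.append("device_possession")   # e.g., shake the registered phone
    return factors

assert required_factors("DC", "DC", "low") == ["password", "token"]
assert "spoken_otp" in required_factors("DC", "Kyiv", "low")
```

The payoff is that low-risk logins stay friction-free while the geographic mismatch Brownstein flags triggers extra proof.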

It’s also important for agencies to consider the lifecycles and governance models for credentials in order to maintain security. Lifecycle management asks questions such as, if an individual moves from engineering into management, does he or she need the same access to detailed drawings or other raw data? The governance approach permits auditing of user activities to determine whether they fit changing roles and responsibilities.
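
The lifecycle question in that paragraph — does the engineer-turned-manager still need raw drawings? — reduces to diffing current grants against the new role’s entitlements. Role and entitlement names here are hypothetical:

```python
# Illustrative lifecycle check: on a role change, diff current grants
# against the new role's allowed set and flag the excess for review.
ROLE_ENTITLEMENTS = {
    "engineer": {"detailed_drawings", "raw_test_data", "email"},
    "manager":  {"summary_reports", "personnel_records", "email"},
}

def excess_access(current_grants: set, new_role: str) -> set:
    """Grants the user holds that the new role does not justify."""
    return current_grants - ROLE_ENTITLEMENTS[new_role]

grants = {"detailed_drawings", "raw_test_data", "email"}
# Moving from engineering to management leaves two grants to re-justify:
assert excess_access(grants, "manager") == {"detailed_drawings", "raw_test_data"}
```

Governance then closes the loop: periodic audits ask the same question even when no formal role change was recorded.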

Cloud and identity, authentication

Cloud computing presents special challenges to federal agencies operating under a cloud-first mandate.

NOAA’s Goldstein says that, besides applications, information services such as mobile device management are moving to the cloud. That makes systems operators nervous. To some extent, service level agreements and FedRAMP certification mitigate worries about data loss and improper access. But Goldstein and others advise retaining ID and access management and related applications in house. Alternatively, security staff should press cloud service providers for logon, access and data flow logs pertaining to their particular cloud instances, delivered in real time on a dashboard. That’s the only way to understand user behaviors, and whether credentials are being employed correctly, while you can still do something about it.

Clancy points out the need for a common, normalized identity and access architecture across agency and cloud data centers. User access methodologies should remain constant regardless of which resources users access or where those resources reside. He also advises offering users single sign-on, hiding the complexity of what are often multiple application instances in the cloud. If authentication becomes too difficult, users tend to devise dangerous work-arounds.
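
The single sign-on pattern Clancy endorses boils down to one identity service minting a signed assertion that every application trusts instead of running its own login. The HMAC-signed token below is a deliberately bare-bones, hypothetical stand-in; real federal deployments use standards such as SAML or OpenID Connect:

```python
import base64
import hashlib
import hmac
import json

# Sketch of single sign-on: one identity service mints a signed assertion;
# any application sharing the verification key accepts it. Key and claims
# are illustrative only.
IDP_KEY = b"shared-verification-key-demo"

def mint_token(user: str) -> str:
    claims = base64.urlsafe_b64encode(json.dumps({"sub": user}).encode())
    sig = hmac.new(IDP_KEY, claims, hashlib.sha256).hexdigest()
    return claims.decode() + "." + sig

def verify_token(token: str):
    """Return the subject if the signature checks out, else None."""
    claims, _, sig = token.rpartition(".")
    expected = hmac.new(IDP_KEY, claims.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return None
    return json.loads(base64.urlsafe_b64decode(claims))["sub"]

token = mint_token("zgoldstein")
# In-house and cloud applications all verify the same token:
assert verify_token(token) == "zgoldstein"
assert verify_token(token + "0") is None  # tampering breaks the signature
```

The user authenticates once, strongly; every downstream application checks the token rather than prompting again.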

Where to go next

Beyond location and cloud versus in-house, better ID and authentication management will come from an updated approach to user accounts and an improved authorization process. Even if a person is identified with a high degree of confidence, today’s processes remain weak when it comes to knowing whether the person’s current attributes justify the authorization. People may incur arrest warrants, lose their security clearance or fall into financial hardship, none of which is likely known to the authorization system. A more solid system would also know whether people have completed privacy and security training.
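
Such attribute-aware authorization can be sketched as a policy that consults current attributes, not just identity. The attribute names and rules below are invented for illustration:

```python
# Hedged sketch: authorization that checks current attributes alongside
# identity. Attribute names and policy rules are hypothetical.
def authorize(user: dict, resource_level: str) -> bool:
    if not user.get("security_training_current"):
        return False                  # lapsed training blocks everything
    if resource_level == "classified":
        return user.get("clearance_active", False)
    return True

alice = {"security_training_current": True, "clearance_active": False}
assert authorize(alice, "routine")         # identity plus attributes check out
assert not authorize(alice, "classified")  # clearance lapsed: deny
```

The point is that the deny decision can happen the day an attribute changes, not at the next annual account review.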

No perfect identification and authentication system exists. But by continuously monitoring network activity, keeping up with the latest technologies in multi-factor authentication, and adjusting methodologies to get past basic account- and role-based authentication, agencies can make serious progress in protecting their all-important data.