Federal managers working on ‘culture change’ for cross-component data sharing

The Department of Health and Human Services and the National Counterterrorism Center are two federal components balancing opportunities and limitations on big data.

Government data can help prosecutors communicate with a jury or sharpen the details of a homegrown terrorism incident, and behind that storytelling are agency experts constantly looking for new ways to improve those narratives.

Speaking during a panel at the April 25 Cloudera Government Forum in Washington, Caryl Brzymialkiewicz credited her data analytics team with helping the Justice Department uncover a $1 billion Medicare scheme last year.

“Three co-conspirators had figured out how to try to hide in the data,” said Brzymialkiewicz, chief data officer at the Health and Human Services Department’s Office of Inspector General. “But it was really about using data analytics and partnering with DOJ and the FBI to uncover the money laundering, to understand in the data what was happening, really understand between the provider and all of the networks really what was going on.”

Brzymialkiewicz said she has several teams that work on data: one for data governance, quality and operations; another for planning and performance management; and an advanced analytics team.

The $1 billion scam was enabled through access to the Centers for Medicare & Medicaid Services (CMS) system, Brzymialkiewicz said. The case was an example of how data can be looked at in different ways, but there still needs to be a way to pull that analysis together to answer the “so what” of the data, she said.

“Making sure that you are expressing your data in a way that your customer can understand,” Brzymialkiewicz said. “People talk about data analytics, they’ll use that buzz phrase, they’ll talk about data visualization. If the team is pulling together all this robust analysis, and then we’re not figuring out how to enable our agents to talk to the prosecutors to then convince a jury, then we have failed them.”

At the National Counterterrorism Center’s Office of Data Strategy and Innovation, Director Vimesh Patel’s job is helping to pull together data and intelligence from other agencies to paint a picture of potential terrorist threats.

“It’s never all in one place, it never will be for a variety of reasons,” Patel said. “So we have to look at solutions on how to work with that and still do it in a really smart way.”

Another tricky issue is that NCTC doesn’t own any of the data it receives, Patel said.

“We have this really complicated regulatory requirement and making sure that we are protecting this data and using it only for the right reasons,” Patel said. “That gets a little bit tough sometimes. That’s one of the big focuses of our data strategy: how can we do that.”

Patel said the center has evolved its strategy to “democratizing data” to make sure it’s available to analysts and that “they’re not afraid to use it.”

“When you overlay things like policy and accepted use of data, those aren’t really baked into systems today,” Patel said. “We have to figure out how to overlay that on top and the way we do that today is through a great, close collaboration and partnership across everyone in NCTC, but specifically my organization.”

Agencies can tout collaboration, but it’s not always easy to share data across components, said Jessica Kahn, director of the Data and Systems Group (DSG) in the Center for Medicaid and CHIP Services at the Department of Health and Human Services.

Kahn said she gives her data to Brzymialkiewicz because she wants the watchdog to “catch the bad guys.”

“That’s not a data discussion, that is a program policy discussion, about what we’re trying to do with our work,” Kahn said. “Therefore I give Caryl [Brzymialkiewicz] my data so that she can do that, and we talk about what’s there and not there. I’m not giving it to her just because she’s the OIG and she has the right to ask me.”

Kahn said CMS put all of its data in the cloud, including, for the first time, protected health information (PHI).

“We put it in the cloud because I want people to use it,” Kahn said. “We separated the storage from the utilization discussion. We want people to use it. If it takes you a week to get a query, I’ve lost that policy analyst. She’ll never log back into my dashboard ever again. I need things that can move real-time very quickly, and with the size of data we’re talking about, from all of the states — just our friends in California give us 900 million records a month, never mind everybody else — we need to be able to move that rather quickly.”

Kahn said it would be “no small feat” to arrange a data transfer to pair up health care data with consumer and financial information.

“There are agencies that do this better than us, but we should have a data exchange first policy,” Kahn said. “That means my default position is if he needs the data, the Medicaid data, then I should default to yes. Then if it turns out that there’s privacy or security or other reasons that I can’t, then so be it. But that’s very different than the way it operates now, which is typically a default to no; there’s 100 reasons why that’s hard or I can’t do it, and that requires a culture change.”
