The Pentagon’s first round of formal testing on the military’s new electronic health record has concluded that the $4.3 billion system is “neither operationally effective nor operationally suitable” for any further deployments across DoD’s global health care system, at least for the time being, Defense officials disclosed on Friday.
The findings came in an April 30 report delivered by the department’s independent Director of Operational Test and Evaluation (DOT&E). It pointed to dozens of problems involving cybersecurity, long login times, inadequate user training and concerns about whether the system, MHS Genesis, can effectively scale to serve the more than 600 military treatment facilities around the world.
The critiques follow an earlier, preliminary assessment by the testing office, which said that users at all four of the Pacific Northwest sites where Genesis has been fielded thus far rated it “low” for usability and that the system contained hundreds of cybersecurity vulnerabilities.
In a statement, DOT&E also suggested the problems are somewhat understandable, considering the scale and complexity of the project.
“The program management office continues to address this complexity with a robust test program, and has rapidly incorporated lessons learned from testing,” test officials said. “The PMO has developed a strong relationship with the users of MHS Genesis by setting up a senior board, with representatives from all of the military services, to help identify ways to quickly improve functionality and usability. This strong, mutually supportive PMO-to-user relationship is a critical element that will enable the PMO to continue to improve operational effectiveness and suitability.”
However, the initial problems discovered by DoD’s operational testers at three of the four sites were significant enough to convince the program office to delay testing at Madigan Army Medical Center, the largest of the initial deployment locations, until Genesis officials and vendors could fix the issues that had already been uncovered.
Stacy Cummings, the department’s program officer for health care management systems, said testing will resume this spring, and that the initial operational test and evaluation process is expected to conclude by summer.
As of now, DoD does not expect the test results to impact its current schedule for Genesis deployments, which calls for the next round of deployments — at hospitals and clinics along the West Coast — to begin in 2019, with the full, worldwide deployment still scheduled to be finished by 2022.
“We’ve always had time built into our schedule to gather user feedback,” Cummings said. “Out of all the individual information that was gathered by the test community, I can’t think of a thing that surprised us, because we were so in lockstep with and communicating with our [initial operating capability] sites.”
And officials and rank-and-file users at those sites have been more than willing to offer feedback, to the point that requests for fixes have overwhelmed Genesis’s help desk.
At one point, the help desk had 14,000 separate unresolved trouble tickets in its queue. Although roughly 9,000 have since been resolved, 6,000 are still awaiting attention, said Col. Thomas Cantilina, the Defense Health Agency’s chief health informatics officer.
“We do agree that it overwhelmed them because we did not adequately design the flow of the tickets to get to the right person fast enough,” he said. “We’re literally going through each one of them to make sure we resolve the issue that the user was concerned about, and we’re hiring some expertise to help sort through those last 6,000 to make sure we can get it down to as close to zero as is reasonable.”
And Col. Michael Place, the commander of Madigan Army Medical Center, argued that a large number of trouble tickets isn’t necessarily a bad thing, from the perspective of an organization that’s been put in charge of test-driving a brand-new system.
“We take a certain perverse pleasure in finding all of the different things that need to be fixed and the ways we think the system could be better,” he said. “As an IOC site, we think it’s our role to do that so we can inform the process. I think the responsiveness of the process to fix the problems is a better measure of the effectiveness of the program than the total number of trouble tickets.”