In upcoming pilot, Navy to test delivery of new IT capabilities to ships within 24 hours

The Navy will test a new architecture that relies on web services and shared IT infrastructure in a combination of ashore and afloat clouds.



Starting this spring, the Navy plans to begin testing a concept that might eventually transform the way data is moved to and from its afloat vessels, the way its sailors use that data to operate underway and the way it develops new applications. Among other things, the approach could let the Navy deploy new software capabilities in under 24 hours, rather than on the 18-month timeline that’s commonplace today.

The pilot, called “Compile to Combat in 24 Hours,” will be based on web services and a new cloud architecture the Navy is testing, including a “micro cloud” aboard the vessel and a more robust commercial cloud ashore, equipped with machine learning capabilities that can help “stage” and pre-package data before it’s synchronized with the ship.

In that way, instead of “pushing” endless gigabytes of information that may or may not be useful to commanders afloat over precious, constrained satellite links, those commanders could order various types of data to be “pulled” at pre-set intervals of their choosing in an efficiently compressed XML format, refreshing the data stores already available in their onboard cloud.
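To make the pull model concrete, here is a minimal sketch of what such a scheduled pull might look like. The shore endpoint URL, the pull interval, the XML record layout and the use of gzip as a stand-in for the W3C Efficient XML Interchange encoding are all illustrative assumptions, not details from the pilot.

```python
"""Illustrative sketch only: a shipboard client that "pulls" data at
pre-set intervals instead of having it pushed over the satellite link.
Every name here is hypothetical."""

import gzip
import time
import urllib.request
import xml.etree.ElementTree as ET

SHORE_ENDPOINT = "https://shore.example.mil/feeds/logistics"  # hypothetical
PULL_INTERVAL_SECONDS = 6 * 60 * 60  # interval set by the commander

local_store = {}  # stands in for the ship's "micro cloud" data store


def pull_once() -> None:
    """Fetch one compressed XML snapshot and refresh the local store."""
    with urllib.request.urlopen(SHORE_ENDPOINT) as resp:
        compressed = resp.read()
    # Decompress and parse; a real implementation might decode the
    # W3C Efficient XML Interchange format rather than gzipped XML.
    root = ET.fromstring(gzip.decompress(compressed))
    for record in root.iter("record"):          # assumed document shape
        local_store[record.get("id")] = record  # refresh, don't append


def run() -> None:
    """Pull on the commander's schedule; keep serving cached data if a
    pull fails (for example, when the satellite link drops)."""
    while True:
        try:
            pull_once()
        except OSError:
            pass  # disconnected: operate from the data already on board
        time.sleep(PULL_INTERVAL_SECONDS)
```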

“What we’re trying to do is to break down the way applications and content are delivered to the fleet right now, which is very old school,” said Rear Adm. Danelle Barrett, the Navy’s chief information officer, in an interview for Federal News Radio’s On DoD. “It’s these sort of big monolithic applications that require a constant reach back to the shore, and when you lose that satellite link, you’re kind of dead in the water. So we want to break down the data from the application or the business logic or the presentation. And then you would use shared infrastructure afloat to store the data there, have most of your business logic already hosted on the ship and then use that information on the ship without having to reach back off every time to get more information. If you lost that satellite link, which is critical to us, you would still at that point have probably about 80 percent of what you needed to continue to operate, along with what you could bring in from organic sensors and UAVs and [tactical data links].”
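Barrett’s description boils down to a local-first pattern: run the business logic against data already on the ship, and treat reachback ashore as optional rather than required. A rough sketch of that idea follows; the link-status check, the shore call and the store contents are hypothetical stand-ins, not the Navy’s actual design.

```python
"""Minimal sketch of the tiered, local-first pattern described in the
quote. All names are illustrative assumptions."""

from typing import Optional

# Stands in for the shared shipboard data store ("micro cloud").
local_store = {"fuel_state": {"pct": 72}}


def satellite_link_up() -> bool:
    """Hypothetical link-status check; assume the link has dropped."""
    return False


def fetch_from_shore(key: str) -> Optional[dict]:
    """Hypothetical call to a shore-side web service."""
    return None  # stub: would issue an HTTP request in practice


def query(key: str) -> Optional[dict]:
    # 1. Run the business logic against the onboard store first.
    record = local_store.get(key)
    if record is not None:
        return record
    # 2. Reach back ashore only when the data is missing AND the
    #    satellite link is up; otherwise degrade gracefully and work
    #    with what's on board (plus organic sensors, UAVs, TDLs).
    if satellite_link_up():
        return fetch_from_shore(key)
    return None


print(query("fuel_state"))  # served locally even with the link down
```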

For the pilot, the Navy plans to test the exchange of data for four different types of systems: logistics, hydrology and meteorology, personnel, and an unspecified combat system. The goal is to prove that the concept works for various flavors of information.

It will rely on the existing Consolidated Afloat Networks and Enterprise Services (CANES) network for the shared shipboard infrastructure. For one thing, Barrett said, that’s what’s available on the Navy’s vessels today. For another, the service wants to get out of the business of having to test, certify and carry around a new piece of hardware every time it wants to deploy a new software capability.

“If I construct applications in a different way, where I use the shared infrastructure of the ship, whatever that specific goal is, you’re just doing a web service, not a big monolithic application,” she said. “Then I can inherit all of the Risk Management Framework controls and all of the accreditation that’s already been done on that shared infrastructure. And if I use standard web ports and protocols, if I standardize on my data in an XML format, then what happens is I can automate the RMF process for the web service, because it becomes just testing that containerized piece of additional code — you don’t have to go back and retest the environment. If I’m using standard development language, it can be a lot like how Apple says, ‘Hey, you develop this way, we’ll let your app into the store and you can get it out to users.’ We’re saying we should be able to do that within 24 hours — from compile to combat — if you do it right.”
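In other words, each new capability becomes a small, self-contained web service speaking standard protocols and exchanging XML, so that only the containerized service itself needs fresh testing. A toy illustration of the shape of such a service follows; the port number and payload are illustrative assumptions, not code from the pilot.

```python
"""Sketch of the kind of small, self-contained web service the quote
contrasts with "big monolithic applications": plain HTTP and a
standardized XML payload, nothing else. Purely illustrative."""

from http.server import BaseHTTPRequestHandler, HTTPServer


class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Standard web ports/protocols and a standardized XML format
        # are what let the shared infrastructure's RMF accreditation
        # be inherited; only this containerized piece needs testing.
        body = b'<?xml version="1.0"?><status ok="true"/>'
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Port 8080 chosen for illustration only.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```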

Barrett said the fundamental architecture the Navy’s now trying to employ isn’t exactly new: it was first mapped out in the early 2000s, under the service’s vision of a “Web Enabled Navy” (indeed, then-Lt. Cmdr. Barrett wrote about it during that era for the Navy Department’s internal IT magazine, CHIPS).

“What is new is people’s ability to understand this kind of architecture because of how micro web services are done in the commercial world and in the smartphone in your pocket, frankly,” she said. “But also the technology and the standards have come a long way. For example, the Naval Postgraduate School has pushed the efficient XML piece [the Efficient XML Interchange standard] and gotten it approved by the World Wide Web Consortium. So it’s a perfect storm of opportunity, a convergence of technology and standards and processes to be able to do this now. And what I’m hoping to learn is that if this architecture is right, then this is the architecture we could use across the whole enterprise. If it will work end-to-end in our lowest common denominator, which is the shipboard environment because of the bandwidth issues and where you can be disconnected at any time, then it can work anywhere. If it does, that’s how we should do all of our content delivery, afloat and ashore.”

