Originally published by DefenseScoop as “Data and integration will be ‘core’ of Army’s Next-Gen C2”: https://defensescoop.com/2025/06/06/data-and-integration-will-be-core-of-armys-next-gen-c2/
The Army is also looking at how much compute and storage is needed at the tactical edge.

ABERDEEN PROVING GROUND, Md. — A data and integration layer and data ingestion will be critical components for the Army’s next generation of command and control, according to top service officials.
Next Generation Command and Control is one of the Army’s highest priorities as it aims to provide commanders and units a new approach to managing information, data, and command and control with agile, software-based architectures.
Army officials have said NGC2 is composed of a horizontal operational design that involves a technology stack that goes from a transport layer to an integration layer to a data layer to an application layer, which is where soldiers interact with it. That application layer is also where the Army has broken down the silos of individual warfighting functions — such as intelligence or fires — into applications that ride on the same integrated backbone.
The integration layer is where streams of information — internal and external to what a unit generates — are fed, using artificial intelligence and machine learning for triage, into a more sanitized data layer.
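The article does not describe any implementation, but the triage flow it attributes to the integration layer — heterogeneous feeds scored by AI/ML, with only vetted records passing into a sanitized data layer — can be sketched as a toy pipeline. Every name here (`FeedRecord`, `triage`, the confidence threshold) is a hypothetical stand-in, not anything the Army has disclosed:

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class FeedRecord:
    # Hypothetical shape for one message from an internal or external feed.
    source: str        # e.g. "organic-sensor", "external-intel"
    payload: str
    confidence: float  # stand-in for an ML triage score in [0, 1]

def triage(records: Iterable[FeedRecord], threshold: float = 0.5) -> list[dict]:
    """Toy triage: drop low-confidence records and normalize the rest
    into a uniform schema for the 'sanitized' data layer."""
    sanitized = []
    for rec in records:
        if rec.confidence < threshold:
            continue  # triaged out; a real system might queue these for review
        sanitized.append({
            "source": rec.source,
            "payload": rec.payload,
            "score": round(rec.confidence, 2),
        })
    return sanitized

feeds = [
    FeedRecord("organic-sensor", "track A", 0.9),
    FeedRecord("external-intel", "noise", 0.2),
]
print(triage(feeds))  # only the high-confidence record survives
```

The point of the sketch is the shape of the flow, not the scoring: whatever model does the triage, the output of this layer is a single normalized schema that every warfighting-function application upstack can read.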
“The data integration layer is the absolute core of this. That’s what’s going to make this work. We have to be able to integrate that data in a [denied, disrupted, intermittent, and limited], contested environment, against an enemy that’s going to be” sophisticated, Col. (P) Mike Kaloostian, the incoming director of the C2 cross functional team for Army Futures Command, said at the Army Technical Exchange Meeting at Aberdeen Proving Ground on May 30. “That’s the dilemma … If you’re not thinking through as industry partners what’s going on in Ukraine and how quickly they’re having to adapt to be able to try to gain an advantage. Think about where we are just the last 12 months. Start looking at what are they doing now.”
The Army tested what it deemed a successful prototype of the system in March at Project Convergence Capstone 5 at the National Training Center at Fort Irwin, California. It was the first experiment in the dirt with a unit on the classified network, and the system was outfitted to a real battalion as well as higher headquarters elements.
Now, having validated the approach, the Army will continue to refine that prototype, scaling it to the division, while the program office works to select vendors for the official program of record.
Kaloostian added that if that integration layer is the core of NGC2, then the Army has to start thinking about assuring the usability, accessibility and security of that data for commanders.
Low-latency, high-capacity transport will be needed to assure that, he said.
“In order for us to be able to get the commercial and the government to synthesize those data feeds, make them so they’re sensible for the commander and usable for the commander immediately at the point of need, that’s going to require a transport that is extremely diverse and able to withstand what the enemy is trying to do to [it],” Kaloostian said.
The other piece of the data challenge is data ingestion and being able to organize it properly for commanders. Officials have maintained that the key to NGC2 is the ability for commanders to do “more, better, faster.”
“The data ingestion is something that I’m very, very focused on. That’s the secret sauce for Next Gen C2, like we’ve talked about, is how we pull in all of these disparate feeds and then there are different data streams and then [quality control] them and get them organized,” Maj. Gen. Patrick Ellis said at the technical exchange meeting on his last day as director of the C2 team.
The Army must be able to effectively triage the volume of data units will have coming in, or it will risk missing valuable data needed for operations or overwhelming operators with data.
Intelligent and even autonomous networks will also be paramount for withstanding complex environments against sophisticated adversaries and ensuring commanders have access to the data and capabilities they require.
That intelligent network must be able to adapt and understand what the enemy is doing, along with what friendly signatures look like.
“Fully autonomous network. That’s what we need to be moving towards: a software-defined network that is fully autonomous … then the ability to quickly process and synthesize data at the edge — edge compute will continue and will always be a part of our ecosystem because we’re going to need it,” Kaloostian said.
Compute and store
Following the Project Convergence experimentation, the Army is going to have to determine how much compute and storage it needs and at what levels.
The Army has shifted its thinking a bit on compute and storage over the last few years. At one point, government and industry leaders touted deploying edge computing services as the buzzword of the day. Now, the service is beginning to take a slightly different view of who will need these technologies and how feasible it will be to deliver them, given the speed of future war.
Part of the reason for that is lessons from Ukraine regarding how contested the information environment is. In a congested and contested electromagnetic spectrum, the flow of data back and forth from the edge to the cloud will be extremely strained and limited. Thus, forces will need to figure out how much computing and data storage they can rely on at their own level without having to pull from a central cloud, and then, once connectivity returns, how to plug that data back in and make sense of it.
“We’re going to have sensors everywhere in the battlefield. We saw it during NTC because everyone’s bringing their new capabilities out there. They all have sensors, they all have feeds and they all need some place to put their data. That’s Next Gen C2. That’s our data integration. We have to figure out, okay, if all those sensors are out there, how are we going to do it? You’re not bringing that data back to the cloud for analysis. It’s not going to happen,” Kaloostian said. “We need to be able to do that point of need and we need to share that effectively.”
Ellis said he’d like to see exercises where monitors are placed with each staff section’s systems so the Army can have better information on how much and what type of data they all need.
That way, they can say, “in an exercise, this is generally what data is that you’re using, so then that’ll help us understand how much we need to put where,” he said. “Do you need hours worth? Do you need days worth? Do you need lines of code worth? How much of that stuff you put down there? … How much do you really need and how much are commanders comfortable with? How much risk are they willing to assume?”
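The sizing exercise Ellis describes amounts to simple arithmetic: sample each staff section’s data volume during an exercise, average it, and scale by the retention window a commander is willing to assume risk against. The section names, sample values, and function below are invented purely to illustrate that math:

```python
from collections import defaultdict

# Hypothetical per-hour data-volume samples (GB) logged per staff section
# during an exercise — the kind of measurement the monitors would produce.
samples = [
    ("intel", 12.0), ("fires", 3.5), ("intel", 15.0),
    ("logistics", 1.2), ("fires", 4.0),
]

def storage_estimate(samples, hours_retained: float) -> dict:
    """Average each section's hourly volume, then scale by the number of
    hours of data the commander wants held at the edge."""
    totals, counts = defaultdict(float), defaultdict(int)
    for section, gb in samples:
        totals[section] += gb
        counts[section] += 1
    return {s: round(totals[s] / counts[s] * hours_retained, 1) for s in totals}

# e.g. provision 24 hours' worth of data at the edge
print(storage_estimate(samples, hours_retained=24))
```

Changing `hours_retained` is the knob Ellis gestures at with “Do you need hours worth? Do you need days worth?” — the same measurements answer both questions.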
Then, the next challenge becomes: If forces start using more cloud-based capabilities such as voice-over-IP, and their devices become disconnected, what does the Army do?
“If everyone’s using a smart device and [gets] disconnected, how do you continue to have a voice capability as part of your [primary, alternate, contingency and emergency communications] plan, or put a server on the edge? I don’t think we’ve solved those problems yet,” Ellis said. “I think that’s going to be part of the fun as we start to prototype this out.”
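The PACE construct Ellis references is, in essence, an ordered fallback list: when the primary path drops, communications degrade to the next path that still works. A minimal sketch, with the path names and availability set invented for illustration (no actual Army comms plan is implied):

```python
# Toy PACE (primary/alternate/contingency/emergency) selector: walk the
# ordered plan and fall back to the first path still available.
PACE_PLAN = ["voip-cloud", "local-server-voip", "tactical-radio", "runner"]

def select_comms(available: set[str]) -> str:
    for path in PACE_PLAN:
        if path in available:
            return path
    raise RuntimeError("no communications path available")

# Cloud VoIP is cut off; the plan degrades to a hypothetical local edge server.
print(select_comms({"local-server-voip", "tactical-radio"}))
```

The unsolved part Ellis points to is not the selection logic but the middle entries of the list: fielding an edge server or other alternate that keeps voice alive when the cloud-backed primary disappears.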