Palantir · AIPCon 9

Multi-Domain AI: The Future of Command and Control

Cameron Stanley, CDAO — Department of War

10 min 25 sec
Watch on YouTube ↗

Key Takeaways

5 insights from Cameron Stanley's AIPCon 9 keynote

Decision Advantage is the Third Offset

The US military's strategy for winning the 21st century: make better decisions faster than adversaries. AI is the enabling technology, not an end in itself.

AI in Warfighters' Hands Was the Wrong Hypothesis

Project Maven proved that deploying AI to individual operators without transforming the surrounding workflow only solved one link in the chain. The bottleneck was always the process.

The Decision-Centric Approach: 9 Questions

Before any AI deployment, define the decision, map the current process, identify required data, plan user interaction, set success metrics, and define the iteration plan. Workflow first; technology second.

Two Flywheels Must Turn Together

Technology development and process improvement are separate but coupled flywheels. Delivering the right technology at the wrong phase of process change — or vice versa — breaks the synergy.

From Hours to Minutes: The Kill Chain Compressed

Maven Smart System collapsed a targeting workflow requiring 8–9 separate systems into a single platform completing the kill chain in minutes. Seven years of integration work made this possible.

Talk Structure

A 10-minute walk through 10 years of AI in defense

0:00

The Third Offset: Decision Advantage

Project Maven began with a hypothesis in 2016: if we get AI to warfighters, we win. Secretary Carter's Third Offset strategy identified AI-driven decision advantage as the key. The Algorithmic Warfare Cross-Functional Team (AWCFT) was stood up to prove it.

Why This Section Matters

  • Decision advantage as doctrine — Secretary Carter's Third Offset named decision-making speed as the 21st-century military edge, making AI a strategic necessity, not just a tech experiment.
  • Hypothesis-first thinking — The AWCFT was launched to validate a theory: getting AI to warfighters wins. This scientific framing meant the team would be willing to reject the hypothesis when evidence demanded it.
  • Focus on perception, not cognition — The initial bet was on computer vision for UAV feeds — automating the most fatiguing, error-prone human task in ISR workflows before tackling decision processes.
THERE ARE NO SECRETS — AIPCon 9 stage backdrop
AIPCon 9 — "There Are No Secrets"
1:28

Project Maven in the Field

Computer vision models were deployed worldwide to automate ISR analysis from UAV feeds — detecting cars and people so analysts didn't have to stare at screens for 16 hours. The models were best-in-class and integrated everywhere possible.

What the Technology Actually Did

  • Best-in-class models, narrow impact — The CV models were genuinely strong and were integrated everywhere possible. The problem was that they solved only one link in a much longer decision chain, leaving the downstream workflow completely untouched.
  • ISR PED as proof of concept — UAV surveillance analysis (processing, exploitation, and dissemination) was chosen because the human cost was measurable: 16-hour screen-staring shifts and fatigue-induced misses. Computer vision directly attacked this bottleneck.
  • Universal deployment masked the deeper failure — Because the technology was integrated everywhere, the underlying workflow failures were harder to see. Detection accuracy metrics masked downstream process collapse.
Palantir aerial surveillance intelligence platform showing facility overhead imagery
Palantir MSS — aerial surveillance integration
2:45

The Scarlet Dragon Revelation

An exercise with the 18th Airborne Corps exposed the real problem. The ops center was full of whiteboards and PowerPoint. AI detections were being pushed into manual workflows that constrained what the technology could deliver. The hypothesis was wrong — the bottleneck was workflow, not AI.

The Pivot Moment

  • Whiteboards in a digital age — The ops center image was the diagnostic: static pictures and PowerPoint alongside cutting-edge AI detections meant the workflow had not changed. AI output was being consumed by an entirely manual process.
  • One link ≠ the chain — Automating detection solved one node in the kill chain. Every other step — target board, COA generation, asset tasking, BDA — remained manual, fragmented across 8–9 systems.
  • Rejecting the hypothesis was the breakthrough — Rather than doubling down on the original bet, the team called the hypothesis wrong. This intellectual honesty enabled the pivot to the decision-centric framework.
SCARLETT DRAGON I — ops center photo showing soldiers working at laptops
Exercise Scarlet Dragon — Theater Operations Center showing the workflow bottleneck
3:38

CDAO 2.0: The Decision-Centric Approach

A new framework: 9 questions to ask before deploying any AI. Start with the decision, not the technology. Map data flows, user interaction, and success metrics. The goal: help decision-makers work better, not replace them.

The 9-Question Framework in Practice

  • Workflow before technology — The framework forces mapping the current decision process before touching AI. This prevents the Project Maven mistake of deploying technology into an unexamined, unchanged workflow.
  • Success metrics built in from day one — Questions on reduction in human input, measuring success, and the iteration plan ensure improvement is defined and measurable before any deployment begins.
  • Augmentation, not replacement — The framework's explicit goal is to "make decision-makers better" — a positioning choice that directly addresses the adoption barrier of senior leaders fearing AI will replace their judgment.
CDAO 2.0 Decision Centric Approach slide listing 9 questions
CDAO 2.0 — The 9-question Decision-Centric framework
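
To make the framework easier to operationalize, here is a minimal sketch of the nine questions as a pre-deployment checklist. The questions are the ones Stanley lists in the transcript below; the class name, field names, and the `unanswered()` helper are illustrative assumptions, not any CDAO artifact.

```python
# Hypothetical pre-deployment checklist mirroring the nine questions from the talk.
# Field names and the helper are illustrative; only the questions come from the transcript.
from dataclasses import dataclass, fields

@dataclass
class DecisionReview:
    decision: str = ""               # 1. What decision are we improving?
    current_process: str = ""        # 2. How is the decision made today?
    step_accelerated: str = ""       # 3. What part of the process are we accelerating?
    data_required: str = ""          # 4. What data is required to make the decision?
    data_delivery: str = ""          # 5. How will the data arrive?
    user_interaction: str = ""       # 6. How will the user interact with the data?
    human_input_reduction: str = ""  # 7. What is the reduction in human input?
    success_metric: str = ""         # 8. How are we measuring success?
    iteration_plan: str = ""         # 9. What is the iteration plan?

    def unanswered(self) -> list[str]:
        """Questions still blank; a non-empty list means workflow-first thinking isn't done."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]
```

In the talk's terms, a review with unanswered questions is still a technology-first deployment; the checklist exists only to force the workflow conversation before any code ships.
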
5:33

The Dual Flywheel

Technology development and process improvement are two separate but interlocked flywheels. Synchronize them so technology arrives at the right phase of process change, and operational feedback improves technology at the right phase of development.

The Synchronization Insight

  • Two independent clocks must tick together — Technology development has its own cadence (sprints, releases). Process improvement has its own cadence (training, organizational change). Delivering the right technology at the wrong process phase wastes both efforts.
  • Operational feedback as fuel — The flywheel model requires that field operators feed observations back to technology developers at the right phase — not years later through requirements documents. The feedback loop is itself the product.
  • Most organizations only run one flywheel — Tech teams focus on software delivery; process teams focus on workflow. Coupling these — not just running them in parallel — is what creates compounding operational improvement.
Approach to Enterprise AI Application slide showing two coupled flywheels
Approach to Enterprise AI Application — two coupled flywheels
6:25

Maven Smart System: Live Demo

One platform integrates all data sources. A single visualization tool replaces 8–9 systems. The full targeting workflow — detection, target board, COA generation, asset tasking, BDA — runs end-to-end in minutes, not hours.

Seven Years Compressed Into Minutes

  • 8–9 systems → 1 platform — The MSS kill chain demo collapses geospatial intelligence, FMV analysis, target nomination, target board, COA generation, asset tasking, execution, and BDA into a single visual context. Each system handoff that previously required manual data re-entry is automated.
  • Hours → minutes, but the data ontology is the real foundation — The speed improvement required seven years of integration work: defining a common data ontology so every upstream detection speaks the same language as every downstream action system.
  • Human in the loop — but only for decisions — The system does not fire autonomously. It recommends: right-click to nominate, approve the AI top asset match, approve the COA. Cognitive load is reduced to yes/no at each decision gate, not search-and-fill across nine screens.
Palantir Maven Smart System geospatial map with threat cones and military symbols
Maven Smart System — multi-domain geospatial intelligence view
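
The human-in-the-loop pattern described in the bullets above — automation between decision points, a yes/no call at each one — can be sketched as a pipeline of gates. Everything below (stage names, signatures, the toy state dictionary) is a hypothetical illustration of that pattern, not Maven Smart System code.

```python
# Hypothetical decision-gate pipeline: the platform automates work between gates,
# and the operator's input collapses to approve/reject at each gate.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Gate:
    name: str                          # e.g. "Nominate to board", "Approve asset match"
    prepare: Callable[[dict], dict]    # automated step the platform runs before the gate

def run_kill_chain(detection: dict, gates: list[Gate],
                   approve: Callable[[str, dict], bool]) -> Optional[dict]:
    """Advance a detection through each gate; the human only answers yes/no."""
    state = detection
    for gate in gates:
        state = gate.prepare(state)        # automation: fuse data, draft COA, rank assets...
        if not approve(gate.name, state):  # human judgment at the gate
            return None                    # rejected: the chain stops here
    return state                           # approved end-to-end: ready for execution and BDA

# Gates mirroring the demo's flow (names and state keys are made up for illustration).
gates = [
    Gate("Nominate to board", lambda s: {**s, "nominated": True}),
    Gate("Approve asset match", lambda s: {**s, "asset": "top-ranked platform"}),
    Gate("Approve course of action", lambda s: {**s, "coa": "recommended COA"}),
]
result = run_kill_chain({"source": "FMV detection"}, gates, approve=lambda name, s: True)
```

The point of the shape is the one the talk makes: the operator's cognitive load becomes a handful of approvals, not search-and-fill across nine screens.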

Visual Highlights

The Maven Smart System kill chain — from detection to battle damage assessment

CDAO 2.0 Decision Centric Approach slide listing 9 questions
Decision-Centric Approach — 9 Questions Before Any AI Deployment
Stanley's framework: start with the decision, not the technology. Nine questions force workflow-first thinking before writing a single line of code.
Approach to Enterprise AI Application dual flywheel diagram
The Dual Flywheel — Technology + Process Must Turn Together
Technology development and process improvement as two coupled flywheels. Operational feedback improves technology; improved technology enables better operations.
Palantir Gotham Woody Island satellite imagery with threat zones
MSS Step 1: Geospatial Target Identification (Woody Island)
Palantir Gotham overlays threat zones on satellite imagery. Range rings and ballistic analysis tools integrate into a single canvas.
FMV feed with AI vehicle detection bounding boxes
MSS Step 2: AI Vehicle Detection via FMV
Full Motion Video feed with real-time AI computer vision: vehicles detected and bounded with red tracking boxes. The system flags targets human analysts might miss after hours of footage.
Context menu with Nominate to board option on detected vehicle
MSS Step 3: One Click to Nominate a Target
A right-click context menu on the detected vehicle reveals "Nominate to board" — a single action that initiates the formal targeting workflow without leaving the FMV view.
Palantir Target Board kanban with DELIBERATE DYNAMIC COMPLETE columns
MSS Step 4: Target Board (Kanban Workflow)
Targets flow as cards across columns: DELIBERATE → DYNAMIC → PENDING PAIRING → PAIRED → IN EXECUTION → PENDING BDA → COMPLETE. The entire chain of command sees the same picture simultaneously.
AI Asset Tasking Recommender with battlefield map
MSS Step 5: AI Asset Tasking Recommender
The system cross-references target requirements against available assets across the battlespace. AI proposes best-fit platforms — the operator approves, not searches.
AI optimization criteria dialog with AGM Match Time to Target Distance sliders
MSS Step 6: Configurable AI Optimization Criteria
Commanders tune the AI recommender by weighting AGM Match, Time to Target, Distance, Fuel, and Munitions. "Continuous Optimization On" keeps recommendations live as the tactical picture changes (a weighted-scoring sketch follows this list).
STRYKER1 selected as top match asset for Computer Vision Detection target
MSS Step 7: STRYKER1 Selected as Top Match
Top Match card: STRYKER1 (M2 .50 cal 750x) assigned to Computer Vision Detection target. Time to Target: 4m 23s. The entire selection took seconds.
Battle Damage Assessment panel showing Destroyed greater than 95 percent
MSS Step 8: Battle Damage Assessment — Destroyed (>95%)
The BDA panel closes the kill chain: CA/PDA Destroyed >95%, FDA Functionally destroyed >95%, No collateral damage >95%. What previously required hours across 8–9 systems now completes in one platform.
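
Steps 5–7 describe a recommender that scores candidate assets against operator-weighted criteria and surfaces a top match. Below is a minimal weighted-scoring sketch of that idea. The criterion names follow the sliders in the screenshot (AGM Match, Time to Target, Distance, Fuel, Munitions), but the scoring function, normalization, weights, and candidate data are assumptions made for illustration — nothing here reflects how MSS actually ranks assets.

```python
# Hypothetical weighted scoring for asset-to-target pairing; all math and data are illustrative.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    agm_match: float          # 0..1 fit between munition/sensor and the target
    time_to_target_min: float
    distance_km: float
    fuel_frac: float          # 0..1 remaining
    munitions_frac: float     # 0..1 remaining

def score(c: Candidate, w: dict[str, float]) -> float:
    """Higher is better: reward fit and readiness, penalize time and distance."""
    return (w["agm_match"] * c.agm_match
            + w["time_to_target"] / (1.0 + c.time_to_target_min)
            + w["distance"] / (1.0 + c.distance_km)
            + w["fuel"] * c.fuel_frac
            + w["munitions"] * c.munitions_frac)

def top_match(candidates: list[Candidate], w: dict[str, float]) -> Candidate:
    """Re-running this as the picture changes is the 'continuous optimization' idea."""
    return max(candidates, key=lambda c: score(c, w))

weights = {"agm_match": 0.4, "time_to_target": 0.3, "distance": 0.1, "fuel": 0.1, "munitions": 0.1}
candidates = [
    Candidate("STRYKER1", 0.9, 4.4, 3.2, 0.8, 0.9),   # numbers invented for the example
    Candidate("ASSET2",   0.7, 11.0, 18.0, 0.6, 0.7),
]
print(top_match(candidates, weights).name)  # STRYKER1 under these made-up weights and numbers
```

In this sketch the Step 6 sliders simply correspond to the weight dictionary: adjusting a slider changes the weights, and the ranking updates without retraining anything.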

Full Transcript


0:00 Good morning. I want to take you back to the year 2016 because that's really where the story of Project Maven began, even before the project stood up formally as the algorithm warfare cross-functional team — because it began with a hypothesis. What does the third offset look like?

0:17 If anybody knows about the offset strategies of the late 20th century, the first offset was basically how do we overcome differences in mass and scale that our adversaries have on top of us in order to wield a more effective military enterprise.

0:44 What we saw from the first offset, which was how do you use nuclear weapons to overcome that? That was the first technology offset. Second one was stealth and precision-guided munitions. Secretary Carter, when he was trying to come up with a third offset, said the real advantage that we're going to have in the 21st century is decision advantage. How do you get better decisions faster than your adversary? That's what wins wars.

1:13 The problem was in 2016, there were a lot of problems that we could look at for different types of technology, and AI was the one he wanted to focus on, particularly for Project Maven. The algorithm warfare cross-functional team was established to say, get AI in the hands of the warfighter, focusing on UAV PED.

1:33 What do I mean by that? What's happening for our airborne surveillance assets so that humans don't have to stare at screens. They get tired, they get distracted, they miss things. What we ran into was, let's use AI to try and tip and cue the right systems so that humans didn't just have to stare at a screen 16 hours a day and get tired.

1:57 With that, this is where we started. The hypothesis was, if we got AI in the hands of the warfighter, it would work. We developed some of the best computer vision models possible. They were built on our data, deployed on our systems. Everywhere in the world that we could possibly integrate it, we were integrated. You could detect cars and people.

2:22 We fielded these systems quite robustly across the entire landscape. The problem wasn't ISR PED per se. That was one problem. The bigger problem was the fact that our processes were not set up and our technology was not set up to use data-centric techniques in order to make better decisions faster. That just solved the problem of one individual in the entire chain.

2:44 The real problem we were trying to solve was this. This is an image from a theater operations center in Scarlet Dragon. Scarlet Dragon is an exercise that Project Maven ran with the 18th Airborne Corps. What you can see there is a bunch of static pictures. Whiteboards. Don't be distracted by the screen — that's just PowerPoint slides.

3:09 We literally were trying to push AI detections into workflows where humans were limiting the outputs and outcomes we were trying to achieve. We rejected the hypothesis that getting the AI in the hands of the warfighter was the right answer. What we really needed to do was take three steps back and say the real issue isn't AI. The real issue is workflow. How are we making decisions? That's when we came up with the decision-centric approach.

3:40 The decision-centric approach — I came up with nine questions. There are other approaches out there. This is my easy one. Mainly because DARPA has their nine questions, I felt like I should have my nine. When you're thinking about trying to improve decision-making processes, you always have to start with the decision. You also have to look at how the decision is made today. What part of the process are you accelerating?

4:07 What data is required to make the decision? How is the data going to arrive? How will the user interact with the data? What's the reduction in human input? How are you measuring success? What's your iteration plan? By looking at these, it's pretty straightforward. Those of us in the data world, we do this intuitively every single day.

4:21 The challenge is getting senior leaders, especially those who are very competent, very professional, to understand that your job as a data professional is not to try and replace them. Your job is to make them better. And you make them better by giving them the data that they need, in the time, format, and capacity that they require in order to make decisions.

4:53 But there's another problem. That's one decision. And in complex workflows, as we know, there are dozens of decisions. The challenge isn't coming up with the right approach — we've got that. It's how do we get the user community to buy into that approach in order to have them completely digitally transform their entire workflows so that they can see how data-centric techniques allow them to solve all of those decisions simultaneously and get to real advantage on the battlefield.

5:33 So, what we did was — everybody in technology knows the left flywheel quite well. Technology development, that's pretty straightforward, right? Standard spiral development process, no one should be surprised by that. What we fail to recognize usually in the technology development space, at least in the department, is that there's a process improvement flywheel that's happening at the same time, ideally.

5:53 The question isn't how are you improving the process with the technology. It's how do you interlink and couple those two flywheels together so that you're delivering technology at the right phase in their process, and they're giving you the operational feedback in the right phase of your process so that you can have mutual synergistic improvements. It's not just the technology, it's also the process.

6:23 And what we found when we looked at getting AI into different types of workflows is that the system was wrong. This is Maven Smart System — Palantir's software as a service product that we are deploying across the entire department. It's not just one data feed, it's multiple. And instead of having eight or nine systems for those decision makers to look at every single day, you fuse it into a single visualization tool.

6:35 The single visualization tool allows you to select, deselect different types of data, look at different approaches to data, but more importantly, action from the same system that you're trying to develop your workflows around. Once you have a detection that you want to move into a targeting workflow — left click, right click, left click — magically, it becomes a detection in the workflow.

7:20 That detection then gets moved into a workflow. Every single column produces a different type of decision-making process. Once you have that decision and you're trying to action that process, we move into COA generation — course of action generation — where we automatically identify what the best asset to prosecute a target looks like.

7:50 Once we've got the different approaches and we select one, we can move directly into how do we action that target. We've gone from identifying the target to coming up with a course of action to actioning that target, all from one system. This is revolutionary. We were having this done in about eight or nine systems, where humans were literally moving detections left and right to get to our desired instance — closing a kill chain.

8:20 When we started this, it literally took hours to do what you just saw there. Through a number of different deployments, we've been able to reduce that time significantly. All because of two things: our ability to integrate with our customer base directly and have them work with us in order to understand their process — and us developing the right technology.

8:50 Using an abstraction layer called Maven Smart System that connects all of those things with the right data approach, the right data ontology, and the right data formatting. This is not something that happened overnight. This took seven years to get here — not only from a data connections perspective, but also to connect each of those systems together.

9:10 So, where are we going from here? This is the thing that gets me up in the morning as the CDAO of the department. Every single tool that I get from my vendors, Palantir especially, updates with time. It's the first time in my career that I have a system that gets better day after day because we are integrated with our customers, listening to their feedback and giving them the tools they ask for — not what some requirements manager asked them to build five years ago.

9:40 I do this every single day because I care about one thing and one thing only. That 18, 19, or 20-year-old kid who had no choice in where he went or what threat he's facing — because I want him to win and come home. That's why we do it. Maven Smart System is an incredible system. No fair fights. If I can avoid it, let's not have fair fights. Our guys win and we come home. Thank you.