What is Data Fusion?
- John Moors

- Nov 18
- 3 min read
First and foremost, this is a big topic. You’ll often see Sensor Fusion discussed alongside, or even interchangeably with, Data Fusion. Demystifying it starts with separating the terms.
In a broad sense, Data Fusion is the combination and optimization of signals, formulas, and decision-making. It doesn’t just provide a more accurate sense of the environment; it helps determine what to do about it.
Think of the autopilot on a modern vehicle: it fuses raw sensor data with positional equations and behavioral rules to operate safely in a complex world.
Under the umbrella of Data Fusion are several subtypes. Once that box is opened, it's easy to get a bit overwhelmed.
For the sake of this article, we’ll focus on three of them below.

Sensor Fusion
There are a ton of sensors out there, but for now let's focus on those common in flight testing: an accelerometer, a strain gage, a thermometer, and maybe an angular rate sensor, all feeding data into the system as a whole. As the vehicle accelerates, strain rises and falls across its wings, engines heat up, and attitude sensors register a downward pitch toward the target.
This stage, Sensor Fusion, focuses on synchronizing and aligning those raw measurements so they can tell a consistent story. And that's a key note here: the story. Whether it's told in 90 minutes like a movie at your local cinema, or in microseconds between sensors and computers, the layers need to be cohesive and understandable.
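To make the synchronizing-and-aligning idea concrete, here is a minimal sketch in Python. It resamples two sensor streams, recorded at different rates, onto one shared timebase using linear interpolation. The sensor names, sample rates, and values are all hypothetical, and real flight-test pipelines use far more robust alignment than this.

```python
# Minimal sensor-fusion alignment sketch: two unsynchronized streams
# are interpolated onto a common timeline so downstream stages see
# one consistent "story". All numbers here are illustrative.

def interpolate(samples, t):
    """Linearly interpolate a sorted list of (time, value) pairs at time t."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    raise ValueError(f"time {t} is outside the sampled range")

# Two streams sampled at different, unsynchronized rates.
accel_g   = [(0.00, 1.0), (0.10, 1.4), (0.20, 2.1)]   # accelerometer, 10 Hz
strain_ue = [(0.00, 120), (0.25, 180)]                 # strain gage, 4 Hz

# Fuse: align both streams onto a single 20 Hz timeline.
timeline = [i * 0.05 for i in range(5)]  # 0.00 .. 0.20 s
fused = [(t, interpolate(accel_g, t), interpolate(strain_ue, t))
         for t in timeline]
```

Each row of `fused` now holds a timestamp plus one value from every sensor at that instant, which is exactly the cohesive layering the "story" metaphor describes.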
Feature Fusion
Strain gage data is one thing, but what many gages together are telling us is likely more important. Onboard computers may be calculating or reporting metrics on whether we're pushing toward the tolerance of what the vehicle can handle. Maybe they're also reporting important ratios for fuel, EMI, or other exposure. These are more advanced answers, fed upstream for decision-making.
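A short sketch of that feature-fusion step: several gage readings are reduced to a single engineering metric, the fraction of a rated structural limit, plus a flag when it gets too close. The limit, the readings, and the 80% threshold are all made-up illustrative numbers, not real aircraft tolerances.

```python
# Feature-fusion sketch: many raw strain readings become one
# upstream-ready metric. All values here are hypothetical.

WING_LIMIT_MICROSTRAIN = 2400  # assumed structural tolerance for the wing

def strain_margin(gage_readings_ue):
    """Reduce a set of gage readings to (fraction_of_limit, warning_flag)."""
    peak = max(gage_readings_ue)
    fraction = peak / WING_LIMIT_MICROSTRAIN
    return fraction, fraction >= 0.80  # flag when above 80% of the limit

# Three gages along the wing report at once; only the peak matters here.
frac, warn = strain_margin([1510, 1980, 2050])
```

Instead of forwarding dozens of raw channels, the system forwards one answer: how close are we to the limit, and should anyone worry yet.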
For me, it also helps to think of this from a human intel ("HUMINT") perspective as well.
If you had a recon element low-crawling through the underbrush to a ridge, then using binoculars to scan an enemy position, you'd hope to hear more than just "looks like they have defenses." That observer, out there in the thick of it, would also know how to answer the "so what" of the situation. Just as a strain gage reading is combined with the known tolerances of a wing, what do the enemy forces seen through the binoculars indicate about our tactical problem?
This gets us close to what you're likely guessing comes next: decisions.
Decision Fusion
This too can be a broad topic, and multiple decision-makers can be involved throughout any test or operation. There could be warnings that appear when certain tolerances are pushed, or perhaps Artificial Intelligence is providing some of the executive decisions or suggestions. A human-in-the-loop operator could be overseeing it all, making battle-level decisions or passing the data on to U.S. and coalition commanders.
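One way to picture that layering is a small Python sketch in which independent "voters", a rule-based limit warning, a model risk score, and a human-in-the-loop override, are fused into a single action. The thresholds, names, and priority order (human always wins) are assumptions for illustration, not a real command-and-control policy.

```python
# Decision-fusion sketch: combine advisory inputs into one action.
# Thresholds and the precedence rules are hypothetical.

def fuse_decision(limit_warning, model_risk, human_override=None):
    """Fuse rule, model, and human inputs into continue/caution/abort."""
    if human_override is not None:          # human-in-the-loop always wins
        return human_override
    if limit_warning and model_risk > 0.7:  # both sources agree: stop
        return "abort"
    if limit_warning or model_risk > 0.7:   # one source is worried
        return "caution"
    return "continue"

print(fuse_decision(limit_warning=True, model_risk=0.9))  # prints "abort"
```

The point is less the thresholds than the structure: each layer can disagree, and the fusion logic decides whose voice carries, with the human operator able to override everything below.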