Flight Tests - Overwhelming Data
- John Moors

It seems like only a handful of years ago that, when I spoke with flight test engineers about their ideal sampling rate (how many data points to capture per second), the answer was “as few as possible”. Made sense. Run 10,000 samples a second for 10 minutes and that’s not a tiny file. Run 300,000 SPS for 8 hours, and you’ve got yourself a pile. And if you’re using USB? Connecting a laptop out near the tarmac, trying to download that data? You’re now the engineer holding the team up.
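For a rough sense of scale, here is a back-of-the-envelope sketch of those two cases. It assumes a single channel at 16 bits per sample, which is my own simplification; real systems record many channels plus timestamps and packet overhead, so actual files are considerably larger.

```python
# Back-of-the-envelope raw data volume for the rates mentioned above.
# Assumes 2 bytes (16 bits) per sample and one channel -- both assumptions,
# not figures from any specific system.

def raw_volume_bytes(samples_per_second: int, duration_seconds: float,
                     channels: int = 1, bytes_per_sample: int = 2) -> float:
    """Raw payload size, ignoring file-format and packet overhead."""
    return samples_per_second * duration_seconds * channels * bytes_per_sample

GIB = 1024 ** 3

# 10,000 SPS for 10 minutes -> roughly 12 MB
print(f"{raw_volume_bytes(10_000, 10 * 60) / GIB:.3f} GiB")

# 300,000 SPS for 8 hours -> roughly 16 GiB, per channel
print(f"{raw_volume_bytes(300_000, 8 * 3600) / GIB:.2f} GiB")
```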
Throughout my own career, I’ve almost always pushed for more data, meaning more samples per second. I’ve had to argue my stance more than once, on more than one team. If someone on my team said, “we might have collected too much on that one,” I’d never complain.

But I’ve been hearing from industry veterans, and am now seeing it more myself: a shift from minimalist data capture to borderline hoarding.
When the question comes up on a call, “so how many samples a second would your team need to collect for this?”, more and more often the answer is “as much as you can give us.” Or, as one Skunkworks lead told me, “You ask the engineer how many samples he wants, and he just says: ‘Yes.’”
To be fair, this isn't just capturing data for data's sake. There may be future questions we want to answer,¹ and that is at least one reason for casting such a wide net. Still, the ramifications are worth considering.
Choosing sample rates based on today's questions is a bit simpler. There are best practices that can be learned and followed: ratios between your bandwidth of interest and your samples per second that achieve the data fidelity you need. Those recommendations are already generous, and pushing past them can rapidly multiply your data volume.
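To make that concrete, here is a minimal sketch comparing a few common rules of thumb against the “as much as you can give us” approach. The 2,000 Hz bandwidth, the 2x/5x/10x ratios, and the 16-bit sample size are all my own illustrative assumptions, not requirements from any standard.

```python
# How the bandwidth-to-sample-rate ratio drives storage, for a single channel.
# 2x is the Nyquist minimum; 5x and 10x are common rules of thumb for
# time-domain fidelity. All figures below are illustrative assumptions.

bandwidth_hz = 2_000          # assumed bandwidth of interest
test_hours = 8
bytes_per_sample = 2          # assumed 16-bit samples

scenarios = {
    "Nyquist minimum (2x)":        2 * bandwidth_hz,
    "Typical practice (5x)":       5 * bandwidth_hz,
    "Conservative (10x)":         10 * bandwidth_hz,
    "'As much as you can give'":  300_000,
}

for name, sps in scenarios.items():
    gib = sps * test_hours * 3600 * bytes_per_sample / 1024 ** 3
    print(f"{name:28s} {sps:>9,} SPS  ->  {gib:6.2f} GiB per channel")
```

The jump from the rule-of-thumb rates to “give us everything” is the difference between a file you email and a dataset you need dedicated tooling to move.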
This isn’t just a passing phase, either. I’ve talked with multiple vendors who are moving beyond their data acquisition and display solutions to offer separate products for dealing with years of high-rate, deeply complex data. Data that normal computers and software simply cannot handle. Many of the PCs we use for testing cannot handle the data we've collected in those very tests.
We used to scramble to fill the gaps in our data. Now we’re buried under terabytes, and we need shovels to dig ourselves out.

How do we tackle this moving forward? It seems there are several tools being developed, which I plan to explore further, potentially in future articles.
- Big Data Software
- Machine Learning
- Artificial Intelligence
- Data Compression
¹ Klug, Christopher. Lessons Learned from Applying Data Analytics to Flight Test Data. International Test and Evaluation Association, May 2018.
Shout-Outs:
Much of my news digest comes from the resources below, and I would highly recommend them to anyone wanting to keep apprised of Aerospace & Defense news:
All statements in this article are the sole opinions of the original author and do not represent the stance or perspective of any associated companies or groups.
Any mention of specific products is not an endorsement but rather an illustrative example. Please do your own research and conduct due diligence before pursuing any solution mentioned on this site.



