Smart Data:
How AI Makes Every Bit Count
When your satellite beams back its payload, it’s not always delivering pure gold. Sometimes it’s delivering pure clutter. Cloud-covered images, redundant frames, or noisy sensor readings can swamp ground stations and waste precious bandwidth. But recent advances in onboard AI are changing that. Instead of sending everything back and filtering it later, tomorrow’s spacecraft will sort it in orbit, elevating the signal and discarding the noise before it ever leaves the sky.
Engineers and scientists determine what’s valuable, while AI takes on the repetitive work of sorting and filtering. By clearing away the noise, it gives people more time to explore, analyze, and uncover new discoveries within the data.
The Data Deluge & Why It Matters
Modern Earth observation and planetary missions generate massive volumes of data. Hyperspectral, multispectral, thermal, and radar instruments each produce streams of imagery and sensor readings that can easily saturate a satellite’s downlink capacity. In many optical Earth-observation missions, a large percentage of the imagery is cloud-obscured or otherwise of limited value. For small satellites, transmitting all of that data is neither viable nor an efficient use of limited and expensive resources.
Historically, ground teams would download as much as possible and then sort manually in a time-consuming, bandwidth-inefficient process. The result: delayed insights, higher operational cost, and lost opportunity for rapidly evolving events.
Filtering Upfront: Onboard AI in Action
A pioneering example of AI use in practice is ESA’s Φ-Sat-1 (PhiSat-1) mission. Φ-Sat-1 carries a hyperspectral and thermal payload (HyperScout-2) and an onboard Intel Movidius Myriad 2 chip. Its in-orbit AI discriminates between images that are cloud-obscured and those that are usable. Only the latter get flagged for downlink, reducing the volume of data sent to Earth and improving efficiency.
In one project associated with Φ-Sat-1, researchers developed a Convolutional Neural Network (CNN) model called CloudScout that runs on low-power embedded hardware and filters out images with more than 70 percent cloud cover.
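The thresholding decision at the heart of this kind of filter is simple to sketch. The snippet below is an illustrative example, not CloudScout’s actual flight code: it assumes a binary per-pixel cloud mask (which would normally come from the CNN) and applies the 70 percent cut described above. The names and structure are assumptions for illustration.

```python
# Illustrative onboard triage step: keep a frame only if its estimated
# cloud fraction stays under a mission-defined threshold (70% here,
# mirroring the CloudScout criterion). The binary cloud mask would
# normally be produced by an onboard CNN.

CLOUD_THRESHOLD = 0.70  # mission-defined limit on cloud cover

def cloud_fraction(cloud_mask):
    """Fraction of pixels flagged as cloud in a binary 2-D mask."""
    total = sum(len(row) for row in cloud_mask)
    cloudy = sum(sum(row) for row in cloud_mask)
    return cloudy / total

def keep_for_downlink(cloud_mask, threshold=CLOUD_THRESHOLD):
    """True if the frame is clear enough to be worth transmitting."""
    return cloud_fraction(cloud_mask) <= threshold
```

For example, a frame whose mask is three-quarters cloud pixels would be culled, while a quarter-cloudy frame would be queued for downlink.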
On the NASA side, the Autonomous Sciencecraft Experiment (ASE) aboard the Earth Observing-1 (EO-1) mission is an early but powerful example of onboard data triage. ASE software detects transient science events (such as volcanic activity or flooding), prioritizes them, and instructs the spacecraft to retarget or collect follow-up data onboard before downlink. In doing so, it also rejects or deprioritizes lower-value imagery, reducing the burden on ground systems.
These examples are real systems that have already flown or been tested, and they show the power of AI to clean data onboard the vehicle before downlink, so human analysts receive more substance and less noise.
How the Human-AI Partnership Works
The intention here is not to let machines decide everything but to let them handle tedious triage so humans can concentrate on the meaningful parts. Here’s how the partnership functions:
- Humans Set the Rules
Mission designers determine thresholds (e.g., cloud cover limits, change thresholds, anomaly tolerances) and train models accordingly. AI is configured to follow human priorities.
- AI Performs Filtering in Orbit
The onboard model evaluates each image or sensor frame against those rules. Images or data that don’t meet the standards are culled before transmission.
- Flagging & Escalation
If uncertain or borderline data appear, the AI can tag them for human review. It can also request a re-observation or alternate framing.
- Ground Oversight & Upgrades
New model weights or parameters can be uplinked as the mission evolves, giving humans final control over classification criteria.
This approach ensures that human judgment remains central. AI reduces the load, not the authority.
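The workflow above can be sketched as a small decision function. This is a hypothetical illustration, assuming human-set thresholds and a model that scores each frame’s usefulness; the names and numbers are not from any flight software.

```python
from dataclasses import dataclass

@dataclass
class TriageRules:
    """Thresholds set by mission designers on the ground (illustrative)."""
    keep_above: float = 0.8  # confidently useful -> downlink
    drop_below: float = 0.3  # confidently clutter -> discard

def triage(frame_score: float, rules: TriageRules) -> str:
    """Map a model's usefulness score (0..1) to a downlink decision.

    Borderline frames are not decided autonomously; they are flagged
    so that humans retain final authority, as described above.
    """
    if frame_score >= rules.keep_above:
        return "downlink"
    if frame_score <= rules.drop_below:
        return "discard"
    return "flag_for_review"
```

In this sketch, a frame scored 0.9 would be downlinked, 0.1 discarded, and 0.5 escalated for human review; updated rules could be uplinked simply by replacing the `TriageRules` values.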
Commercial & Mission Benefits
- Reduced Bandwidth Costs & Faster Delivery
Less data to downlink means lower transmission costs and quicker delivery of mission-critical insights to users on Earth.
- Operational Efficiency & Scalability
A single AI-equipped satellite can replace multiple ground-sorting workflows and scale more easily across constellations.
- New Business Models
Operators can offer “insights-as-a-service” rather than raw images. Clients prefer clean, meaningful data to wading through petabytes themselves.
- More Relevant & Timely Data
By adapting to real-time conditions, AI enables satellites to capture the most useful information exactly when it’s needed.
- Extended Mission Life & Resilience
By reducing the data load and giving a spacecraft the intelligence to manage its own pipeline, missions can last longer and adapt to anomalies with less human intervention.
Looking Ahead: Insight-Ready Missions
The next evolution is spacecraft that not only filter but interpret and summarize data. Instead of sending raw images, they’ll send “insight packets”: compact, prioritized summaries ready for human or machine consumption.
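What an “insight packet” might contain can be sketched as a compact summary structure. The field names below are purely illustrative assumptions, not an existing standard or format.

```python
import json

def make_insight_packet(event_type, confidence, lat, lon, utc, thumb_ids):
    """Bundle a detection into a small, downlink-ready JSON summary.

    Hypothetical sketch: instead of raw frames, the spacecraft sends
    what was seen, how confident the model is, where and when, plus
    references to small thumbnail chips for human verification.
    """
    packet = {
        "event": event_type,        # e.g. "flood", "volcanic_activity"
        "confidence": confidence,   # model confidence, 0..1
        "location": {"lat": lat, "lon": lon},
        "time_utc": utc,
        "thumbnails": thumb_ids,    # small image chips, not full frames
    }
    return json.dumps(packet)
```

A packet like this might run to a few hundred bytes, versus hundreds of megabytes for the raw hyperspectral frames behind it.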
Constellations may collaborate, refine results, and deliver only what matters. In doing so, AI strengthens human decision-making and turns raw data into insight.
About Second Stage:
SpaceCom’s Second Stage is a national initiative designed to accelerate emerging sectors within the commercial space industry. Built to spotlight high-growth areas and amplify innovation, Second Stage offers a multi-platform experience connecting industry professionals, startups, and decision-makers through curated content, events, and community-building.
From Sector Spotlights to exclusive publications, webinars, and regional activations, Second Stage creates new entry points into the space economy. Each feature focuses on real-world solutions, forward-looking technologies, and the people behind the momentum, offering fresh insights and practical pathways for growth.