Data Concentrator for Space Instruments

A scalable architecture for astroparticle and space-weather missions

About the Project

Modern space missions for astroparticle physics and space weather monitoring require robust and flexible data concentrator systems. Designed to interface with heterogeneous subdetectors, this architecture enables real-time event handling, synchronized acquisition, and reliable communication with onboard computers.

  • Date

    May 2025

Nuclear Instruments

GSSI - Gran Sasso Science Institute

INFN - Istituto Nazionale di Fisica Nucleare

Département de Physique Nucléaire et Corpusculaire - Université de Genève

Dipartimento di Fisica - Università di Trento

Gran Sasso Tech, L’Aquila

Thales Alenia Space

Introduction: Data Acquisition in Space Missions

Modern space-based astroparticle experiments rely on tightly integrated data acquisition systems capable of handling diverse detector types, high event rates, and stringent reliability requirements. Central to these systems is the Data Concentrator Board—a key element responsible for merging detector data, applying first-level trigger logic, and coordinating with the satellite’s On-Board Computer (OBC).

This architecture is designed to support missions targeting cosmic rays, gamma rays, neutrinos, or space weather phenomena, offering a modular and reconfigurable platform applicable across a variety of orbital observatories, including missions such as NUSES and CrystalEye.

Our Contribution: Radiation-Hardened Hardware, Firmware and Protocol Stack

Nuclear Instruments led the design, validation, and full-stack development of the Data Concentrator used in the NUSES satellite, ensuring it meets the stringent requirements of orbital missions. Our key contributions include:

  • Design of the hardware platform, with emphasis on radiation tolerance and anti-latchup protection mechanisms for critical components
  • Development of firmware modules for event building, triggering, buffering, and telemetry—including all scientific and housekeeping data paths
  • Implementation of the full acquisition software stack, managing detector interfaces, command parsing, status polling, and data integrity
  • Integration of a CAN Bus interface toward the spacecraft’s On-Board Computer (OBC) for slow control, configuration, and remote diagnostics
  • Development of the HSSL communication protocol, an evolution of SpaceWire, supporting high-throughput transmission of scientific data to the satellite’s mass memory
  • Collaboration with mission partners to ensure full electrical, mechanical, and protocol compatibility with the Zirè and Terzina payloads

Data Concentrator Board

The Data Concentrator acts as the digital backbone of the instrument’s electronics, collecting data from all detector subsystems, managing synchronization, applying selection logic, and ensuring safe and efficient communication with the spacecraft.

PCB layout of the concentrator board

Photo of the prototype of the concentrator board

Hardware Architecture

The board is based on a Xilinx Zynq UltraScale+ FPGA, chosen for its high I/O density, embedded ARM processors, and robust transceiver capabilities. It supports multiple QSPI interfaces to simultaneously manage communications with several DAQ boards—whether tracking detectors, calorimeters, or scintillators.

Key features include:

  • SerDes-based HSSL (High-Speed Serial Links) for science data uplink to the satellite.
  • Dual CAN Bus transceivers for slow control, redundancy management, and firmware updates.
  • Hot-swap compatibility and dual-slot support for main and backup concentrator units with interlock selection logic.
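
As a purely illustrative sketch of the QSPI links mentioned above, the fragment below shows how a housekeeping register of one DAQ board might be polled from the embedded Linux running on the Zynq processing system, assuming the link is exposed as a standard spidev device; the device path, opcode, and register map are hypothetical and are not part of the actual design.

    # Hypothetical sketch: poll a housekeeping register of one DAQ board over
    # a QSPI link exposed to embedded Linux as /dev/spidev1.0.
    # Device path, opcode and register addresses are illustrative only.
    import spidev

    HK_READ_OPCODE = 0x03        # assumed "read register" opcode
    REG_BOARD_TEMP = 0x10        # assumed housekeeping register address

    spi = spidev.SpiDev()
    spi.open(1, 0)               # bus 1, chip-select 0 (hypothetical mapping)
    spi.max_speed_hz = 10_000_000
    spi.mode = 0

    def read_hk_register(reg_addr: int) -> int:
        """Issue a read command and return a 16-bit register value."""
        # Opcode + address, then two dummy bytes clocked out while the
        # DAQ board shifts the register contents back.
        resp = spi.xfer2([HK_READ_OPCODE, reg_addr, 0x00, 0x00])
        return (resp[2] << 8) | resp[3]

    temperature_raw = read_hk_register(REG_BOARD_TEMP)
    print(f"board temperature (raw ADC counts): {temperature_raw}")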

Firmware Structure

The FPGA hosts a modular and upgradeable firmware stack, which includes:

  • Level-1 Trigger Logic: Aggregates and filters trigger primitives from multiple subdetectors based on user-defined rules and coincidences.
  • Event Builder: Combines data fragments into structured, timestamped events with global headers and error checking (a simplified framing sketch follows this list).
  • Housekeeping Polling Engine: Periodically reads sensor and status data from DAQ boards and power systems for telemetry.
  • Buffer Manager: Handles FIFO queues and arbitration to maintain throughput during periods of high event activity.
  • Data Transmission Engine: Implements custom protocols with framing, CRC, and synchronization for reliable HSSL and CAN communication.
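
To make the Event Builder and Data Transmission Engine items above more concrete, the following is a minimal sketch of how fragments from several subdetectors could be packed into one timestamped event frame with a global header and a CRC trailer; the field layout, sync word, and use of CRC-32 are assumptions for illustration and do not reflect the actual on-wire format.

    # Hypothetical event frame: sync word, global header (event id, timestamp,
    # fragment count), concatenated per-source fragments, CRC-32 trailer.
    # Field widths and the sync word value are illustrative only.
    import struct
    import zlib

    SYNC_WORD = 0xCAFEF00D

    def build_event(event_id: int, timestamp_ns: int, fragments: list) -> bytes:
        """Pack subdetector fragments into one framed, CRC-protected event."""
        payload = b""
        for source_id, frag in enumerate(fragments):
            # Per-fragment sub-header: source id and fragment length.
            payload += struct.pack(">HH", source_id, len(frag)) + frag

        header = struct.pack(">IIQH", SYNC_WORD, event_id, timestamp_ns, len(fragments))
        crc = zlib.crc32(header + payload) & 0xFFFFFFFF
        return header + payload + struct.pack(">I", crc)

    # Example: fragments from tracker, calorimeter and scintillator DAQ boards.
    event = build_event(42, 1_234_567_890, [b"\x01\x02", b"\x03\x04\x05", b"\x06"])
    print(len(event), "bytes")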

Logical architecture of the concentrator system

Communication with the Satellite

The concentrator handles all interactions with the spacecraft’s central control system:

  • Science Data Uplink: High-throughput streaming via HSSL/SerDes with support for priority management and burst handling during transient events.
  • Slow Control Interface: Bi-directional communication over CAN Bus for configuration, health monitoring, and system commands (an illustrative exchange is sketched after this list).
  • Redundancy Management: Hardware-level failover system with cold-spare logic and remote activation via spacecraft command lines.
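
The sketch below shows what such a configuration command and a housekeeping request could look like on the CAN bus when exercised from a Linux host with python-can; the arbitration IDs, payload layout, and channel name are hypothetical and only indicate the style of exchange, not the mission ICD.

    # Hypothetical slow-control exchange over CAN (SocketCAN via python-can).
    # Arbitration IDs and payload layout are illustrative, not the real protocol.
    import struct
    import can

    CMD_SET_THRESHOLD = 0x101    # assumed "set trigger threshold" command ID
    REQ_HOUSEKEEPING = 0x180     # assumed housekeeping request ID

    bus = can.Bus(channel="can0", interface="socketcan")

    # Command: set the level-1 trigger threshold of subdetector 2 to 350 counts.
    payload = struct.pack(">BH", 2, 350)
    bus.send(can.Message(arbitration_id=CMD_SET_THRESHOLD, data=payload,
                         is_extended_id=False))

    # Request a housekeeping frame and wait up to one second for the reply.
    bus.send(can.Message(arbitration_id=REQ_HOUSEKEEPING, data=b"",
                         is_extended_id=False))
    reply = bus.recv(timeout=1.0)
    if reply is not None:
        print("housekeeping frame:", reply.data.hex())

    bus.shutdown()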

The design also enables remote firmware reconfiguration, allowing mission operators to push logic updates, threshold adjustments, or trigger table changes after launch—an essential feature for long-duration missions in low or high Earth orbit.

Onboard Machine Learning with Vitis AI

To enhance the instrument’s onboard intelligence and reduce downlink bandwidth requirements, the Data Concentrator architecture integrates support for neural network inference using the Xilinx Deep Learning Processing Unit (DPU) and the Vitis AI framework. This allows real-time execution of optimized deep learning models directly on the Zynq UltraScale+ device, without the need for additional external processors.
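
As a rough sketch of how such inference could be driven from the processing system, the snippet below follows the typical Vitis AI Runtime (VART) Python flow for a compiled .xmodel executing on the DPU; the model file name, tensor shapes, and int8 quantization are assumptions rather than the mission's actual deployment.

    # Sketch of DPU inference through the Vitis AI Runtime (VART) Python API.
    # "grb_cnn.xmodel" and the int8 input/output types are assumptions.
    import numpy as np
    import vart
    import xir

    graph = xir.Graph.deserialize("grb_cnn.xmodel")
    subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
    dpu_subgraph = next(s for s in subgraphs
                        if s.has_attr("device") and s.get_attr("device").upper() == "DPU")

    runner = vart.Runner.create_runner(dpu_subgraph, "run")
    input_tensor = runner.get_input_tensors()[0]
    output_tensor = runner.get_output_tensors()[0]

    # One quantized input (e.g. a calorimeter hit map) and its output buffer.
    input_data = np.zeros(tuple(input_tensor.dims), dtype=np.int8)
    output_data = np.zeros(tuple(output_tensor.dims), dtype=np.int8)

    # ... fill input_data with the detector hit pattern of one event ...

    job_id = runner.execute_async([input_data], [output_data])
    runner.wait(job_id)
    print("raw DPU output:", output_data.flatten()[:8])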

Gamma-Ray Burst Direction Reconstruction

For missions targeting high-energy transients, such as gamma-ray bursts (GRBs), convolutional neural networks (CNNs) are deployed to estimate the direction of arrival of incoming events based on hit patterns and energy deposition in the calorimeter and tracking subsystems. These models, trained offline and quantized using Vitis AI tools, are executed onboard to trigger follow-up actions, such as region-of-interest (ROI) tagging or prioritized data storage.
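
One common way to turn a CNN output into an arrival direction, assumed here purely for illustration, is to let the network score a grid of sky bins and take a probability-weighted average of the bin centres; the sketch below shows that post-processing step, with the binning scheme chosen arbitrarily.

    # Hypothetical post-processing: convert a CNN score map over sky bins
    # into an estimated arrival direction (binning scheme is illustrative).
    import numpy as np

    N_THETA, N_PHI = 18, 36                      # 10-degree sky bins (assumed)

    def reconstruct_direction(scores):
        """Return (theta, phi) in degrees from an (N_THETA, N_PHI) score map."""
        prob = np.exp(scores - scores.max())
        prob /= prob.sum()                       # softmax over all sky bins
        theta_centres = (np.arange(N_THETA) + 0.5) * 180.0 / N_THETA
        phi_centres = (np.arange(N_PHI) + 0.5) * 360.0 / N_PHI
        # Probability-weighted mean per axis (phi wrap-around at 360 degrees
        # is ignored here for brevity).
        theta = float((prob.sum(axis=1) * theta_centres).sum())
        phi = float((prob.sum(axis=0) * phi_centres).sum())
        return theta, phi

    scores = np.random.randn(N_THETA, N_PHI)     # stand-in for the DPU output
    print("estimated direction (deg):", reconstruct_direction(scores))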

Signal Classification for Neutrino Detection (Terzina)

In the Terzina payload, dedicated neural networks are employed to identify and classify neutrino-induced signals within large volumes of background noise. Using inference pipelines optimized for low-latency execution, the DPU enables real-time background suppression and event tagging, increasing the scientific yield while respecting telemetry constraints.
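
A minimal sketch of the decision logic that could sit behind such background suppression is shown below; the score threshold, priority levels, and the choice to downgrade rather than discard low-score events are assumptions made only to illustrate the idea of classifier-driven event tagging.

    # Hypothetical event tagging after onboard classification: keep high-score
    # neutrino candidates at full priority, downgrade likely background.
    SIGNAL_THRESHOLD = 0.9        # assumed classifier working point

    def tag_event(event_id, signal_score):
        """Attach a storage/telemetry priority tag to a classified event."""
        if signal_score >= SIGNAL_THRESHOLD:
            priority = "high"     # full waveform kept, prompt downlink
        elif signal_score >= 0.5:
            priority = "normal"   # stored, downlinked when bandwidth allows
        else:
            priority = "low"      # compressed summary only
        return {"event_id": event_id, "score": signal_score, "priority": priority}

    print(tag_event(1234, 0.97))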

The integration of AI-based data filtering and event selection represents a significant step toward the use of edge intelligence in space instrumentation, allowing more autonomous, selective, and adaptive acquisition strategies—especially important for missions targeting rare or weak signatures in noisy environments.

CrystalEye and Zirè will both run an onboard AI engine to reconstruct the arrival direction of gamma-ray bursts.

Pictorial view of extensive air shower (EAS) detection by space-based instruments such as Terzina onboard NUSES.