Conference paper, Year: 2026

Bringing NLP Explainability to Critical Sectors: A Case Study on NOTAMs in Aviation

Abstract

The aviation industry operates within a highly critical and regulated context, where safety and reliability are paramount. As Natural Language Processing (NLP) systems become increasingly integrated into such domains, ensuring their trustworthiness and transparency is essential. This paper addresses the importance of explainability (XAI) in critical sectors like aviation by studying NOTAMs (Notice to Airmen), a core component of aviation communication. We provide a comprehensive overview of XAI methods applied to NLP classification tasks, proposing a categorization framework tailored to practical needs in critical applications. We also propose a new method for creating aggregated explanations from local attributions. Using real-world examples, we demonstrate how XAI can uncover biases in models and datasets, leading to actionable insights for improving both. This work highlights the role of XAI in building safer and more robust NLP systems for critical sectors, and shows that continued academic effort is needed to establish trust in both the models and XAI itself.

Main file

ERTS2026_paper_40.pdf (791.98 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-05513839 , version 1 (16-02-2026)


Cite

Vincent Mussot, François Hoofd, Fanny Jourdan, Antonin Poché. Bringing NLP Explainability to Critical Sectors: A Case Study on NOTAMs in Aviation. 13th European Congress of Embedded Real Time Systems (ERTS), Feb 2026, Toulouse, France. ⟨10.82331/ERTS.2026.40⟩. ⟨hal-05513839⟩