Inspiration

Access to accurate medical diagnosis remains one of the biggest challenges in global healthcare.

This concern is not just observational—it has been formally highlighted in research by NITI Aayog, where the shortage of radiology expertise and the growing diagnostic burden were identified as critical gaps in India’s healthcare system.

At the same time, leading global institutions like Harvard University and AI healthcare companies such as Qure.ai have proposed AI-driven solutions for chest X-ray analysis.

However, these systems come with significant limitations:

- They operate as black-box models
- They rely heavily on cloud-based infrastructure
- They require continuous internet connectivity
- They are often expensive and inaccessible in low-resource settings

This creates a paradox:

AI exists to solve the problem—yet remains out of reach for those who need it the most.

XRAYNET+ was inspired by this gap. We set out to build a system that is not only intelligent, but also transparent, affordable, and accessible, ensuring that AI truly serves its purpose in healthcare.

What it does

XRAYNET+ is an Explainable AI-powered chest X-ray analysis system designed to provide both diagnostic assistance and interpretability.

It enables:

- Detection of Tuberculosis, Pneumonia, COVID-19, and Normal cases
- Probability-based classification
- Visual explanation through Grad-CAM++ heatmaps
- Human-like explanation using an integrated LLM layer

Unlike traditional AI systems, XRAYNET+ delivers:

Prediction + Visualization + Explanation

It not only answers “What is the condition?” but also “Where is the issue?” and “Why does it matter?”
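The probability-based classification mentioned above can be illustrated with a plain softmax over the four class logits. The logit values below are made up for illustration; real values come from the trained model.

```python
import math

CLASSES = ["Tuberculosis", "Pneumonia", "COVID-19", "Normal"]

def softmax(logits):
    """Convert raw model logits into class probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one X-ray (illustrative only).
logits = [0.3, 2.1, -0.5, 0.9]
probs = softmax(logits)
prediction = CLASSES[probs.index(max(probs))]
print(prediction)  # Pneumonia, the highest-probability class in this example
```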

How we built it

XRAYNET+ follows a hybrid AI architecture, combining computer vision with language intelligence.

1. Data Processing
   - Curated and cleaned chest X-ray datasets
   - Applied normalization and augmentation
   - Ensured class balance for robust training

2. Deep Learning Model
   - Framework: PyTorch
   - Architecture: EfficientNet-B0
   - Task: Multi-class classification (4 categories)

   This provided an optimal balance between accuracy and efficiency, enabling real-world usability.

3. Explainable AI Layer
   - Implemented Grad-CAM++
   - Generated heatmaps highlighting decision regions

   This ensures:

“The model doesn’t just predict—it justifies.”
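The heatmap idea can be sketched with plain Grad-CAM on a toy CNN using forward/backward hooks. Grad-CAM++ (used in the actual system, on EfficientNet-B0's last convolutional block) refines the channel-weighting scheme shown here; everything below is an illustrative stand-in, not the project's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy CNN standing in for the real backbone.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 4),
)

activations, gradients = {}, {}
target_layer = model[0]
target_layer.register_forward_hook(lambda m, i, o: activations.update(a=o))
target_layer.register_full_backward_hook(lambda m, gi, go: gradients.update(g=go[0]))

x = torch.randn(1, 1, 64, 64)  # stand-in for a preprocessed X-ray
logits = model(x)
logits[0, logits.argmax()].backward()  # gradient of the predicted class score

# Weight each feature map by its average gradient, ReLU, then normalize to [0, 1].
weights = gradients["g"].mean(dim=(2, 3), keepdim=True)
cam = F.relu((weights * activations["a"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=(64, 64), mode="bilinear", align_corners=False)
heatmap = ((cam - cam.min()) / (cam.max() - cam.min() + 1e-8)).detach()

print(heatmap.shape)  # torch.Size([1, 1, 64, 64])
```

The resulting heatmap is overlaid on the input image so the highlighted regions show where the model looked when making its decision.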

4. LLM Integration
   - Converts outputs into human-readable explanations
   - Provides contextual and clinical reasoning
   - Enables interactive understanding

5. Deployment Strategy
   Unlike many existing solutions:

   - XRAYNET+ is lightweight and locally deployable
   - Does not require constant internet connectivity
   - Reduces dependency on expensive cloud infrastructure
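The hand-off from the vision model to the language layer can be pictured as a small structured payload. The sketch below uses a plain template in place of the LLM so it stays offline and dependency-free; the function name and wording are hypothetical, not the project's actual prompt.

```python
# Hypothetical sketch: turn the model's structured output into a readable
# explanation. The real system routes this structure through an LLM layer;
# a template keeps this illustration self-contained.
REGION_NOTE = "the regions highlighted by the Grad-CAM++ heatmap"

def explain(prediction: str, probability: float, region: str = REGION_NOTE) -> str:
    return (
        f"The model suggests {prediction} with {probability:.0%} confidence, "
        f"based primarily on {region}. This output is assistive only and is "
        f"not a medical diagnosis."
    )

message = explain("Pneumonia", 0.87)
print(message)
```

Keeping the explanation step behind a single function like this also makes the non-diagnostic disclaimer a fixed part of every output rather than something the language layer can omit.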

Challenges we ran into

- Handling medical data variability and imbalance
- Ensuring meaningful and accurate explainability
- Translating model outputs into clinically relevant language
- Balancing performance with deployability
- Designing a system that remains ethical and non-diagnostic

Accomplishments that we're proud of

- Built a hybrid CNN + LLM system
- Eliminated reliance on black-box decision-making
- Designed a cost-effective and offline-capable solution
- Addressed a problem recognized at both policy and global research levels
- Created a system that aligns with real-world healthcare constraints

Most importantly: we made AI not just intelligent, but understandable and accessible.

What we learned

Through XRAYNET+, we learned:

- Explainability is critical for adoption in healthcare
- Technology must align with ground realities, not just innovation
- AI solutions must be inclusive and scalable
- Trust is built not by predictions, but by clarity and transparency

What's next for XRAYNET+

We aim to expand XRAYNET+ into a comprehensive healthcare support system:

- Extend to more diseases and datasets
- Integrate with hospital workflows
- Enable multi-modal imaging support
- Introduce real-time and voice-assisted interaction

“In a world where AI is becoming increasingly powerful, the real challenge is not intelligence— but trust.”

XRAYNET+ addresses this challenge by transforming AI from a black box into a transparent partner in healthcare.

Because the future of medicine is not just AI-driven— it is human-centered, explainable, and accessible.
