
XGeM

Introduction

XGeM is an innovative framework for multimodal medical data generation.
This repository contains the source code, pre-trained models, and usage instructions for XGeM. The goal is to provide an accessible platform for the scientific and clinical community, facilitating the integration of AI models into the diagnostic process.

Demo
HuggingFace
arXiv

Abstract

Artificial Intelligence is revolutionizing medical practice, enhancing diagnostic accuracy and healthcare delivery. However, its adoption in medical settings still faces significant challenges related to data availability and privacy constraints. Synthetic data has emerged as a promising solution to mitigate these issues, addressing data scarcity while preserving privacy. Recently, Latent Diffusion Models have proven to be a powerful tool for generating high-quality synthetic data. Meanwhile, the integration of different modalities has gained interest, emphasizing the need for models capable of handling multimodal medical data. Existing approaches struggle to integrate complementary information and lack the ability to generate modalities simultaneously. To address this challenge, we present XGeM, a 6.77-billion-parameter model designed for multimodal medical data generation that, following the Foundation Model paradigm, exploits contrastive learning and a large quantity of data to build a shared latent space capturing the relationships between different data modalities. Further, we introduce the Multi-Prompt training technique, which significantly boosts XGeM's generation under different settings. We extensively validate XGeM: first, we benchmark it against five competitors on MIMIC-CXR, a state-of-the-art dataset for Chest X-ray and radiological report generation. Second, we perform a Visual Turing Test with expert radiologists to assess the realism and clinical relevance of the generated data, ensuring alignment with real-world scenarios. Finally, we assess the utility of XGeM in addressing key challenges in the medical field, such as anonymization, data scarcity and imbalanced learning. The results are promising, demonstrating the applicability of XGeM in medical contexts.


Installation

To install and set up XGeM, follow these steps:

git clone https://github.com/cosbidev/XGeM.git
cd XGeM
pip install -r requirements.txt
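
Optionally, a minimal sketch for installing inside an isolated environment (this assumes Python 3 with the built-in venv module; the exact Python version required by XGeM is not stated in this README):

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt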

Download the Pretrained Weights

Download the pretrained weights from here and place them in the Weights folder.
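
After downloading, the repository layout should look roughly as follows (the contents of the Weights folder depend on the downloaded checkpoint files; the names below are only illustrative):

XGeM/
├── Weights/            # pretrained checkpoints go here
├── Examples/           # Frontal.tiff and Lateral.tiff (plus generated outputs)
├── demo_model.py
└── requirements.txt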

Demo Instructions

To run the demo, execute the demo_model.py script.
Due to data protection restrictions, real data cannot be shared. Instead, two synthetic images (Frontal.tiff and Lateral.tiff) are provided in the Examples folder.
The script runs inference on all possible input combinations and saves the generated images in the Examples folder.
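
For example, from the repository root (with the pretrained weights already placed in the Weights folder):

python demo_model.py

Once the script has finished, a minimal sketch for inspecting the inputs and generated images is shown below (it assumes Pillow is available, which reads .tiff files; Pillow is not necessarily listed in requirements.txt):

# inspect_examples.py -- hypothetical helper script, not part of the repository
from pathlib import Path
from PIL import Image

# List every TIFF in the Examples folder (synthetic inputs and generated outputs)
for tiff_path in sorted(Path("Examples").glob("*.tiff")):
    img = Image.open(tiff_path)
    print(f"{tiff_path.name}: size={img.size}, mode={img.mode}")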
