PRISM: Privacy-preserving Inter-Site MRI Harmonization via Disentangled Representation Learning (Accepted for Oral Presentation at ISBI 2025!)


  • Paper: arXiv preprint (arXiv:2411.06513, accepted to ISBI 2025)
  • Cite:
    @misc{galada2024prismprivacypreservingintersitemri,
        title={PRISM: Privacy-preserving Inter-Site MRI Harmonization via Disentangled Representation Learning}, 
        author={Sarang Galada and Tanurima Halder and Kunal Deo and Ram P Krish and Kshitij Jadhav},
        year={2024},
        eprint={2411.06513},
        archivePrefix={arXiv},
        primaryClass={eess.IV},
        url={https://arxiv.org/abs/2411.06513},
    }

Usage:

  1. Clone or download this repository, navigate to the code folder, and run `pip install -r requirements.txt`.
  2. Skull-strip your MRI volumes (a.k.a. brain extraction) using FreeSurfer or FSL. [If you don't have MRI data, the openly available IXI Dataset is the easiest option.]
  3. Run MRI-Slicer.py on the skull-stripped volumes. [Note: if your MRI volumes are stored in different orientations, you may need to reorient them manually.]
  4. Run folder2dataset.py to generate a separate custom MRI dataset (including augmentations) for each site.
  5. Train the PRISM model by following the PRISM-training.ipynb notebook, or alternatively by running train.py. Repeat the training procedure at each participating site.
  6. With the models trained, follow the PRISM-inference.ipynb notebook (or run harmonize.py) to harmonize the MRI data without any data exchange, as per the PRISM framework.
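The steps above can be sketched as a shell workflow. This is a rough sketch, not an official script: the FSL `bet` invocation, the example file names, and running the scripts without arguments are assumptions — adjust paths and options to your own data and check each script for its expected inputs.

```shell
# 1. Clone the repository and install dependencies
git clone https://github.com/saranggalada/PRISM.git
cd PRISM/code
pip install -r requirements.txt

# 2. Skull-strip each MRI volume (FSL's `bet` shown here; FreeSurfer also works)
#    Example file name is hypothetical.
bet sub-01_T1w.nii.gz sub-01_T1w_brain.nii.gz

# 3. Slice the skull-stripped volumes into 2D images
python MRI-Slicer.py

# 4. Build the per-site dataset (including augmentations)
python folder2dataset.py

# 5. Train PRISM at this site (repeat at every participating site)
python train.py

# 6. Harmonize across sites without exchanging any data
python harmonize.py
```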

(More coming soon)


Architecture:


Results:
