
Uncertainty-Aware Adapter: Adapting Segment Anything Model (SAM) for Ambiguous Medical Image Segmentation

UA-SAM is a project that fine-tunes SAM with an Uncertainty-Aware Adapter for ambiguous medical image segmentation. We use probabilistic sampling to help SAM produce reliable and diverse segmentations. It is a multi-rater method, elaborated in the paper Uncertainty-Aware Adapter: Adapting Segment Anything Model (SAM) for Ambiguous Medical Image Segmentation.

A Quick Overview

[Figure: UA-SAM framework overview]

Datasets

1. REFUGE2 benchmark: you can access and download it here or via Hugging Face.

2. LIDC-IDRI benchmark: the original dataset can be found at the link. We use a preprocessed version, which you can download here.

3. QUBIQ benchmark: the official web link is here.

Code

Requirements

pip install -r requirement.txt

Run the UA-SAM code

It is simple to run UA-SAM with the default settings. You need to download a SAM checkpoint into ./pretrain (a sketch for fetching one is shown below). Training uses a point prompt, and you can change the prompt type when testing.
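As a minimal sketch, assuming the ViT-B checkpoint from the official segment-anything release is the one you want (the README does not specify which SAM variant or filename main.py expects, so adjust accordingly):

mkdir -p ./pretrain
# Official SAM ViT-B checkpoint from the segment-anything release (assumed variant)
wget -P ./pretrain https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth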

train:

python main.py --mode train --dataset refuge --dataset_path ./dataset/your_dataset_path --prompt_type point

test:

python main.py --mode val --dataset refuge --dataset_path ./dataset/your_dataset_path --prompt_type box
python main.py --mode val --dataset refuge --dataset_path ./dataset/your_dataset_path --prompt_type point

Cite

If you find this code useful in your research, please consider citing our paper:

@article{jiang2024uncertainty,
  title={Uncertainty-Aware Adapter: Adapting Segment Anything Model (SAM) for Ambiguous Medical Image Segmentation},
  author={Jiang, Mingzhou and Zhou, Jiaying and Wu, Junde and Wang, Tianyang and Jin, Yueming and Xu, Min},
  journal={arXiv preprint arXiv:2403.10931},
  year={2024}
}
