Questions about *checkpoints* and *pretrain weights* #82
Comments
check #9
Hello, have you solved this problem? I had the same problem.
Dear Authors,
Many thanks for your efforts on this great work and releasing the code!
I have some questions (which I guess others may also be interested in):
Q1. How do I load the pretrain weights correctly?
(1) I got the pretrain weight from: https://huggingface.co/jiayuanz3/MedSAM2_pretrain/tree/main .
(2) I modified your train_2d.py script for evaluation. Specifically, I retained only the validation part and removed the training sections.
(3) I loaded the pretrain weights in code and got the following evaluation results:
Total score: 1.5096757411956787, IOU: 0.0159607185616769, DICE: 0.026624170053027436
(4) I also tried using `args.pretrain` without `net.load_state_dict` in the above. I guess my results are incorrect. May I get any guidance from you?
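For reference, a minimal sketch of how I loaded the checkpoint; the helper name `load_pretrain_weights` and the `"model"` wrapper key are my own assumptions for illustration, not the repository's actual API:

```python
import torch

def load_pretrain_weights(net, ckpt_path):
    """Load a pretrained checkpoint into `net`, reporting key mismatches.

    Both this helper and the "model" wrapper key are assumptions for
    illustration; adjust them to the actual checkpoint layout.
    """
    # Load onto CPU first so this works regardless of GPU availability.
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    # Some checkpoints wrap the weights, e.g. {"model": state_dict};
    # unwrap if that key is present, otherwise assume a bare state dict.
    if isinstance(checkpoint, dict) and "model" in checkpoint:
        checkpoint = checkpoint["model"]
    # strict=False returns the mismatched keys instead of raising, so you
    # can check whether anything important was silently skipped.
    missing, unexpected = net.load_state_dict(checkpoint, strict=False)
    if missing or unexpected:
        print(f"missing keys: {missing}\nunexpected keys: {unexpected}")
    return missing, unexpected
```

If `missing` or `unexpected` comes back large, the weights were effectively not loaded, which could explain near-zero IOU/DICE scores like mine.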
Q2. Different SAM2 foundation size
Are the released pretrained weights `MedSAM2_pretrain.pth` suitable for different SAM2 foundation sizes? I noticed that you only included tiny and small in the code. Would it be feasible to directly replace them with the base or large variants? Actually, I tried to use `load_state_dict` for the `small` size but got incompatible parameter dimensions.