Florence-2 Phrase Grounding #101

Open
Hasanmog opened this issue Jan 3, 2025 · 2 comments

Comments

Hasanmog commented Jan 3, 2025

Is there any Florence-2 phrase grounding training script?

SkalskiP (Collaborator) commented Jan 6, 2025

Hi @Hasanmog 👋🏻 Thank you so much for your interest in maestro. Not yet, unfortunately; I don't have a good dataset that I could use as an example. Do you have any ideas about what data you would like to use, or which use case you would like to fine-tune for?

Hasanmog (Author) commented Jan 6, 2025

Hello @SkalskiP 👋🏻.
I would suggest using datasets like RefCOCO, RefCOCO+, and RefCOCOg to adapt the model for general phrase grounding tasks. Once the model is adapted, parameter-efficient fine-tuning methods like LoRA can be employed for domain-specific tasks, which is a computationally feasible alternative to full fine-tuning.
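For illustration only, here is a minimal, untested sketch of what such a LoRA setup could look like with Hugging Face transformers and peft (not a maestro script). The `<CAPTION_TO_PHRASE_GROUNDING>` prompt is the grounding task token from the Florence-2 model card; the `target_modules` list, the hyperparameters, and the `grounded_caption` field are assumptions that would need to be adapted to a real RefCOCO/RefCOCO+/RefCOCOg data loader.

```python
# Hypothetical sketch: LoRA fine-tuning of Florence-2 for phrase grounding.
# Assumes Hugging Face `transformers` and `peft`; dataset loading for the
# RefCOCO family is stubbed out as a list of dicts with "image", "caption",
# and "grounded_caption" keys (assumed field names).
import torch
from transformers import AutoModelForCausalLM, AutoProcessor
from peft import LoraConfig, get_peft_model

CHECKPOINT = "microsoft/Florence-2-base-ft"
DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

processor = AutoProcessor.from_pretrained(CHECKPOINT, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    CHECKPOINT, trust_remote_code=True
).to(DEVICE)

# LoRA freezes the base weights and trains small low-rank adapters, which is
# what makes domain-specific adaptation computationally feasible.
# The target_modules list is an assumption and may need adjusting per checkpoint.
lora_config = LoraConfig(
    r=8,
    lora_alpha=8,
    lora_dropout=0.05,
    bias="none",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj", "linear", "Conv2d", "lm_head", "fc2"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

def collate_fn(batch):
    # Phrase grounding prompt: task token followed by the caption; the target
    # is the caption with Florence-2 region tokens (<loc_...>) added per phrase.
    prompts = ["<CAPTION_TO_PHRASE_GROUNDING>" + sample["caption"] for sample in batch]
    images = [sample["image"] for sample in batch]
    targets = [sample["grounded_caption"] for sample in batch]
    inputs = processor(text=prompts, images=images, return_tensors="pt", padding=True)
    labels = processor.tokenizer(
        targets, return_tensors="pt", padding=True, return_token_type_ids=False
    ).input_ids
    return inputs.to(DEVICE), labels.to(DEVICE)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

def train_step(batch):
    inputs, labels = collate_fn(batch)
    outputs = model(
        input_ids=inputs["input_ids"],
        pixel_values=inputs["pixel_values"],
        labels=labels,
    )
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return outputs.loss.item()
```

Once an adapter is trained this way on the RefCOCO family for general phrase grounding, the same loop could be reused with a much smaller domain-specific dataset, keeping the general grounding behaviour in the frozen base weights.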
