
V1.5.0 dev #44

Merged: 7 commits merged into main on Aug 6, 2024
Conversation

@jsilter (Collaborator) commented Jul 30, 2024

Add custom calibrator.

The calibrator mimics the predict_proba function of the scikit-learn calibrator exactly, which is all we need for inference. This way I can get rid of the scikit-learn dependency. Also add regression test for the calibrator(s).
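
A minimal sketch of what such a drop-in calibrator can look like, assuming Platt-style logistic calibration; the class name, constructor parameters, and functional form are illustrative assumptions, not the PR's actual implementation:

```python
import numpy as np

class SimpleCalibrator:
    """Hypothetical stand-in for a fitted scikit-learn calibrator.

    Only predict_proba is exposed, since that is all inference
    needs, so scikit-learn itself never has to be imported.
    """

    def __init__(self, coef: float, intercept: float):
        # Parameters recovered once from the original fitted calibrator.
        self.coef = coef
        self.intercept = intercept

    def predict_proba(self, X) -> np.ndarray:
        # Platt-style sigmoid: p = 1 / (1 + exp(-(coef * x + intercept)))
        x = np.asarray(X, dtype=float).reshape(-1)
        pos = 1.0 / (1.0 + np.exp(-(self.coef * x + self.intercept)))
        # Match sklearn's (n_samples, 2) [P(y=0), P(y=1)] output layout.
        return np.column_stack([1.0 - pos, pos])
```

Because only predict_proba is exercised at inference time, the fitted parameters can be exported once (e.g. to JSON) and reloaded later without scikit-learn installed, which is what makes dropping the dependency possible.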

Add "prob" output from Sybil (sigmoid of "logit") which will make downstream processing easier.

Switch Sybil class to using custom calibrator.
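
Conceptually, the switch means the Sybil class constructs the custom calibrator from serialized parameters instead of unpickling a scikit-learn object. A hedged sketch reusing the SimpleCalibrator example above (the file format and field names are assumptions):

```python
import json

def load_calibrator(path: str) -> SimpleCalibrator:
    # Parameters exported from the fitted scikit-learn calibrator;
    # loading them requires no scikit-learn import.
    with open(path) as f:
        params = json.load(f)
    return SimpleCalibrator(params["coef"], params["intercept"])
```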

jsilter and others added 6 commits July 26, 2024 10:30:

- Avoid repeat processing.
- Properly set --return-attentions when --write-attention-images is set (see the sketch after this list).
- The calibrator mimics the predict_proba function of the scikit-learn calibrator exactly, which is all we need for inference. This way I can get rid of the scikit-learn dependency.
- Add regression test for the calibrator(s).
- Add "prob" output from Sybil (sigmoid of "logit") which will make downstream processing easier.
- Switch Sybil class to using custom calibrator.
- Set default URL to a dropbox location with v1.5.0 calibrators.
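
On the --return-attentions fix in the commit list above: writing attention images is only possible if the model returns attentions, so one flag should imply the other. A hedged argparse reconstruction (only the flag names come from the commit message; the coupling logic is assumed):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--return-attentions", action="store_true")
parser.add_argument("--write-attention-images", action="store_true")
args = parser.parse_args()

# Attention images cannot be written without the attentions,
# so --write-attention-images forces --return-attentions on.
if args.write_attention_images:
    args.return_attentions = True
```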
@jsilter requested a review from @pgmikhael on July 30, 2024 at 15:48.

@pgmikhael merged commit 2b9c5fc into main on Aug 6, 2024. 6 checks passed.