Forecast Submission Evaluation #184

Open

AFg6K7h4fhy2 opened this issue Dec 17, 2024 · 1 comment · May be fixed by #189
Labels: enhancement (New feature or request)

@AFg6K7h4fhy2 (Contributor)

This issue covers evaluating and scoring forecast submissions by submitting teams, akin to the following vignette.
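To make the scope concrete, here is a minimal sketch of the kind of validity check such an evaluation step might run, assuming submissions arrive as quantile forecasts in a long-format table. The column names, quantile levels, and toy values below are hypothetical, not taken from any hub's actual schema:

```python
import pandas as pd

# Hypothetical long-format quantile submission: one row per
# (location, quantile level) pair. Column names are illustrative.
submission = pd.DataFrame({
    "location": ["A", "A", "A", "B", "B", "B"],
    "quantile_level": [0.1, 0.5, 0.9, 0.1, 0.5, 0.9],
    "value": [4.0, 10.0, 15.0, 2.0, 5.0, 9.0],
})

def quantiles_are_monotone(df: pd.DataFrame) -> bool:
    # Within each location, predicted values must be non-decreasing
    # in the quantile level; otherwise the submission is malformed.
    ordered = df.sort_values(["location", "quantile_level"])
    return bool(
        ordered.groupby("location")["value"]
        .apply(lambda v: v.is_monotonic_increasing)
        .all()
    )

print(quantiles_are_monotone(submission))  # True for this toy table
```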

AFg6K7h4fhy2 added the enhancement (New feature or request) label Dec 17, 2024
AFg6K7h4fhy2 self-assigned this Dec 17, 2024
AFg6K7h4fhy2 linked a pull request Dec 18, 2024 that will close this issue
@AFg6K7h4fhy2 (Contributor, Author) commented Dec 19, 2024

This task can likely reuse code or patterns from cfa-flu-eval, which is not public. Additionally, the pyrenew-hew repository likely has scoring and evaluation components that can be reused here.
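Pending access to those repositories, here is a minimal sketch of one standard scoring metric for quantile submissions, the weighted interval score (WIS) of Bracher et al. (2021). Everything below (function names, quantile levels, toy data) is illustrative and not taken from cfa-flu-eval or pyrenew-hew:

```python
import numpy as np

def interval_score(y, lower, upper, alpha):
    # Interval score for a central (1 - alpha) prediction interval:
    # interval width plus penalties for observations falling outside it.
    width = upper - lower
    penalty_below = (2.0 / alpha) * np.maximum(lower - y, 0.0)
    penalty_above = (2.0 / alpha) * np.maximum(y - upper, 0.0)
    return width + penalty_below + penalty_above

def weighted_interval_score(y, median, lowers, uppers, alphas):
    # WIS (Bracher et al. 2021): weighted sum of the absolute error of
    # the median and K interval scores, normalized by K + 1/2.
    total = 0.5 * np.abs(y - median)
    for lower, upper, alpha in zip(lowers, uppers, alphas):
        total += (alpha / 2.0) * interval_score(y, lower, upper, alpha)
    return total / (len(alphas) + 0.5)

# Toy example: one observation scored against a forecast reporting a
# median plus 50% and 80% central intervals.
alphas = [0.5, 0.2]      # alpha = 0.5 -> 50% interval, 0.2 -> 80%
y, median = 12.0, 10.0
lowers = [8.0, 6.0]      # lower bounds for the 50% and 80% intervals
uppers = [12.5, 15.0]    # upper bounds
print(weighted_interval_score(y, median, lowers, uppers, alphas))
```

The 1/(K + 1/2) normalization makes scores comparable across submissions that report different numbers of intervals, which matters when evaluating multiple teams against a common target.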
