Precision and recall metrics don't work with scikit-learn==1.6.0 #655
Comments
The same issue exists with `f1_score`, and the same workaround fixes it.
@Cozmo25 could you tell me how one can patch this dependency locally, until a new version of the library is released?
@rbelew you should be able to …
I've submitted a PR (#656) to fix this issue. The problem occurs because scikit-learn 1.6.0 sometimes returns float values instead of numpy arrays for single-value results. The PR modifies the return statement to handle both cases safely. Temporary solutions until the PR is merged: downgrade `scikit-learn` to `1.5.2`, or patch the return statement locally.
The PR should fix this issue for the precision, recall, and f1 metrics while maintaining compatibility with both scikit-learn 1.6.0 and earlier versions.
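Roughly, the "handle both cases" pattern looks like the following (a minimal sketch, not the actual diff in #656; `_to_array` is a hypothetical helper name):

```python
import numpy as np

def _to_array(score):
    # scikit-learn 1.6.0 may hand back a plain Python float for
    # single-value results where earlier versions produced a numpy
    # array; np.atleast_1d normalizes both to a 1-d numpy array, so
    # downstream indexing keeps working on either version.
    return np.atleast_1d(np.asarray(score))
```

`np.atleast_1d` leaves real arrays untouched, so a return statement wrapped this way stays compatible with scikit-learn <= 1.5.2 as well.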
Description
The metrics `precision` and `recall` don't work with the latest `scikit-learn==1.6.0`, which may have API changes. At the time of writing this issue, the scikit-learn API changelog hasn't been updated, but it appears that `precision_score()` from `sklearn.metrics` has changed to returning a float instead of its original type. Other metrics may have similar issues; these two are the ones I've encountered.
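As a quick illustration of that type change (the toy data here is arbitrary; the version-specific types are as reported in this issue, not independently verified):

```python
from sklearn.metrics import precision_score

y_true = [0, 1, 1, 0]
y_pred = [0, 1, 0, 0]

score = precision_score(y_true, y_pred)
# Reported behaviour: a numpy type on scikit-learn <= 1.5.2,
# a built-in float on 1.6.0.
print(type(score))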
Temporary Fix
Downgrade `scikit-learn` to `1.5.2` (e.g. `pip install scikit-learn==1.5.2`).
Steps to Reproduce
1. Install `scikit-learn==1.6.0`.
2. Compute the `precision` or `recall` metric; the call fails (see the sketch after these steps for the shape of the failure).
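The downstream failure mode is of this shape (a hypothetical pattern for illustration, since the issue doesn't show this library's exact call site): code written against a numpy-array result uses array-only attributes, which a plain float doesn't have.

```python
import numpy as np

def report(scores):
    # Works when `scores` is a numpy array...
    return scores.mean()

print(report(np.array([0.5, 1.0])))  # fine: 0.75
print(report(0.5))  # AttributeError: 'float' object has no attribute 'mean'
```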
Error Message