
Fix size attribute error for precision/recall/f1 #656

Open
wants to merge 1 commit into main
Conversation

Maxwell-Jia

What does this PR do?

Fixes #655.
Fixes an AttributeError in the precision/recall/f1 metrics when handling scalar outputs from scikit-learn 1.6.0.

Description

With the release of scikit-learn 1.6.0, some metric functions (e.g., precision_score, recall_score, f1_score) may return plain Python floats instead of numpy values for single-value results. The current implementation in evaluate assumes every output has a size attribute, which causes an AttributeError when a scalar is returned.
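
For reference, a minimal sketch of how the error shows up (the inputs are illustrative, not taken from the affected code):

from sklearn.metrics import precision_score

# On scikit-learn 1.6.0, an averaged score is a plain Python float, which
# (unlike the numpy values returned by earlier versions) has no `.size` attribute.
score = precision_score([0, 1, 1, 0], [0, 1, 0, 0], average="binary")
print(type(score))   # <class 'float'> on 1.6.0
# score.size         # raises AttributeError: 'float' object has no attribute 'size'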

This PR modifies the return statement to safely handle both numpy arrays and scalar outputs using getattr(score, 'size', 1), making the metrics compatible with both scikit-learn 1.6.0 and earlier versions.

Changes

Modified return statements in three metrics:

  • metrics/precision/precision.py
  • metrics/recall/recall.py
  • metrics/f1/f1.py

Changed from:

return {"metric_name": float(score) if score.size == 1 else score}

to:

return {"metric_name": score if getattr(score, 'size', 1) > 1 else float(score)}

@Maxwell-Jia
Author

Hi @albertvillanova! I noticed you've recently reviewed some PRs in this area. I hope it's okay to bring this to your attention - this PR addresses the scikit-learn 1.6.0 compatibility issue (from #655) that's affecting several users. Would really appreciate your insights when you have a moment. Thank you for your time!

Development

Successfully merging this pull request may close these issues:

  • Precision and recall metrics doesn't work with scikit-learn==1.6.0