
normalize to float in NanoBEIREvaluator, InformationRetrievalEvaluator, MSEEvaluator #3096

Merged
merged 1 commit into UKPLab:master on Nov 28, 2024

Conversation

JINO-ROHIT
Contributor

I found that NanoBEIREvaluator, InformationRetrievalEvaluator, and MSEEvaluator return a mix of np.float64 and native Python floats.

I zeroed in on the issue and found that 'prefix_name_to_metrics' was causing the problem. This simple fix converts each value to a float while rebuilding the dictionary.
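
For context, a minimal sketch of the kind of conversion this fix performs; the function name follows the PR description, and the exact signature in sentence-transformers may differ:

```python
import numpy as np


def prefix_name_to_metrics(metrics: dict, name: str) -> dict:
    # Prefix each metric key with the evaluator name and cast every value
    # to a native Python float so np.float64 never leaks into the results.
    if not name:
        return {key: float(value) for key, value in metrics.items()}
    return {f"{name}_{key}": float(value) for key, value in metrics.items()}


# Mixed numpy / native values all come out as plain floats:
raw = {"cosine_accuracy@1": np.float64(0.83), "cosine_ndcg@10": 0.7512}
print(prefix_name_to_metrics(raw, "NanoBEIR_mean"))
# {'NanoBEIR_mean_cosine_accuracy@1': 0.83, 'NanoBEIR_mean_cosine_ndcg@10': 0.7512}
```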

Fixes #3093

@tomaarsen

@tomaarsen
Collaborator

Hello!

Very nice work - I love small but impactful fixes like these.
Thanks a bunch.

  • Tom Aarsen

tomaarsen merged commit 9970e3d into UKPLab:master on Nov 28, 2024
9 checks passed
@JINO-ROHIT
Contributor Author

Hiya @tomaarsen
Just remembered that the PR I'd raised for BinaryClassificationEvaluator.py (PR #3076), where I call float() on the results, isn't needed anymore with this fix.

Should we revert those casts, since they look kind of ugly?
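
Roughly, the redundancy being discussed, as an illustrative sketch only (the actual BinaryClassificationEvaluator code differs):

```python
import numpy as np

# Scores as a downstream metric computation might produce them.
scores = {"accuracy": np.float64(0.91), "f1": np.float64(0.88)}

# Before (PR #3076): defensive float() casts at the call site.
metrics_with_casts = {key: float(value) for key, value in scores.items()}

# After this fix: the shared prefixing step already converts values to
# native floats, so the extra casts above are redundant and can be reverted.
metrics_without_casts = dict(scores)
```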

@tomaarsen
Collaborator

That sounds like a good idea, indeed. Good thinking.

  • Tom Aarsen


Successfully merging this pull request may close these issues.

NanoBeirEvaluator returns a mix of floats and numpy.float64 in the results.