Add a "these results don't look right" button #125
Labels
backend
Requires changes on the backend
frontend
Requires changes on the frontend
not directly solved by unibox
Issue is unrelated to or not superseded by unibox
The goal is to have users help us identify cases where the NER isn't working well.
I would start by deploying this only on the "analyze abstracts" app. If the user clicks the button after analyzing an abstract, we store the abstract text in a new collection in our database ("problematic_abstracts"). Then someone can label those abstracts and we can add them to our training set.
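As a rough sketch of the backend side, the flag handler could look something like this. The collection and field names here are hypothetical (only "problematic_abstracts" comes from the issue), and a plain list stands in for the real database collection so the example is self-contained:

```python
from datetime import datetime, timezone

# Stand-in for the "problematic_abstracts" collection; in production
# this would be a handle to the real database collection.
problematic_abstracts = []

def flag_abstract(text, ner_results=None):
    """Store an abstract the user flagged as badly analyzed.

    `ner_results` (the entities we extracted for it) is optional but
    would help whoever labels the abstract later. Both parameter
    names are illustrative, not an existing API.
    """
    doc = {
        "abstract": text,
        "ner_results": ner_results or [],
        "flagged_at": datetime.now(timezone.utc).isoformat(),
        "labeled": False,  # flipped once a human has labeled it
    }
    problematic_abstracts.append(doc)
    return doc
```

The frontend would just POST the abstract (and optionally the displayed entities) to an endpoint that calls something like this when the button is clicked.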