Confidence score jump #2
Comments
Hi, thank you for your interest in our research and for the good suggestion.
More details for sequence MOT20-01
Thank you for your quick and detailed analysis! Maybe I didn't express myself clearly. What I mean is: does such a clear cutoff in confidence scores occur naturally in every frame? In other words, could there be a frame where the detection confidence scores are distributed so smoothly that no clear, convincing threshold separates high scores from low ones? This may be a limitation of the ByteTrack algorithm itself. I'd like to know whether you consider it strong prior information that a cliff-like interval always exists in the per-frame distribution of confidence scores. Thanks!
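One way to make this question concrete is to look, within a single frame, at the largest drop between consecutive sorted detection scores: a "cliff-like" frame has one dominant gap, while a "smooth" frame does not. A minimal sketch of that check (the function name and synthetic score lists are my own illustrations, not from the paper):

```python
import numpy as np

def largest_score_gap(scores):
    """Return (gap, cut): the biggest drop between consecutive
    sorted scores and the midpoint threshold at that drop."""
    s = np.sort(np.asarray(scores, dtype=float))[::-1]  # descending order
    diffs = s[:-1] - s[1:]                              # drops between neighbours
    i = int(np.argmax(diffs))
    return float(diffs[i]), float((s[i] + s[i + 1]) / 2)

# Synthetic examples: one cliff-like frame, one smoothly decaying frame.
cliffy = [0.95, 0.91, 0.88, 0.30, 0.22, 0.15]
smooth = [0.80, 0.70, 0.60, 0.50, 0.40, 0.30]

print(largest_score_gap(cliffy))  # dominant gap of ~0.58, cut near 0.59
print(largest_score_gap(smooth))  # all gaps equal (~0.10), no clear cliff
```

If the largest gap is much bigger than the typical gap, a natural high/low threshold exists for that frame; when the gaps are uniform, as in `smooth`, any threshold is arbitrary, which is exactly the concern raised here.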
Hi, the answer is "it depends on the detector used to detect the objects". For YOLOX, there is almost always such a cliff-like interval; see the analysis in our paper. I hope this clears up your concerns.
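For context, BYTE (the association step in ByteTrack) does not search for a cliff per frame; it splits detections into high- and low-score sets with two fixed thresholds, matching the high set first and then recovering tracks from the low set. A minimal sketch of that split (the threshold values and detection array here are assumed for illustration, not taken from this thread):

```python
import numpy as np

# Hypothetical detections: each row is (x1, y1, x2, y2, score).
dets = np.array([
    [10.0, 10.0, 50.0, 80.0, 0.92],
    [60.0, 20.0, 90.0, 90.0, 0.85],
    [15.0, 40.0, 40.0, 70.0, 0.31],
    [70.0, 50.0, 95.0, 95.0, 0.08],
])

HIGH_THRESH = 0.6  # assumed value for the high-score set
LOW_THRESH = 0.1   # assumed value; scores below this are discarded

scores = dets[:, 4]
high_dets = dets[scores >= HIGH_THRESH]                            # first association
low_dets = dets[(scores >= LOW_THRESH) & (scores < HIGH_THRESH)]   # second association

print(len(high_dets), len(low_dets))  # → 2 1
```

Because the thresholds are fixed, the method benefits when the detector's score distribution has a gap near `HIGH_THRESH`; when scores are smooth, the split still works but the boundary between the two sets is less meaningful.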
OK, I understand now. Thank you!!
Dear author,
Based on your work, I still have a small doubt.
In your experiments, is there a jump in the confidence scores in every frame of the MOT dataset? That is, is the polarization of confidence scores a common phenomenon?
Looking forward to your reply.
Thanks!!