
Accuracy Metrics #10

Open
ssk-2610 opened this issue Jun 13, 2021 · 1 comment

Comments

@ssk-2610

How do you find the accuracy metrics with the help of score_map for a dataset?

@denguir
Owner

denguir commented Jun 17, 2022

Hi @ssk-2610, I suppose you are wondering how we can obtain the ROC curve.

To generate a ROC curve for your datasets you will need a ground truth for your anomalies (note that this is not used for training the model, only for evaluation purposes). The ground truth (referred to as gt in the code) corresponds to a mask whose pixels are white in anomalous regions (and black everywhere else).

In the following piece of code, you obtain an anomaly score for each pixel of your input image, then vectorize the score map as well as the ground truth mask.

import numpy as np
from einops import rearrange

# y_score and y_true start as empty arrays and accumulate over all batches
score_map = get_score_map(inputs, teacher, students, params).cpu()
y_score = np.concatenate((y_score, rearrange(score_map, 'b h w -> (b h w)').numpy()))
y_true = np.concatenate((y_true, rearrange(gt, 'b c h w -> (b c h w)').numpy()))

Then you can simply use sklearn's roc_curve function:

from sklearn.metrics import roc_curve

fpr, tpr, thresholds = roc_curve(y_true.astype(int), y_score)
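For reference, here is a minimal self-contained sketch of the whole pipeline on random stand-in data. The names score_map and gt mirror the snippet above, but here they are just dummy arrays (in practice score_map comes from the model and gt from your annotation masks); the anomalous pixels are nudged to score higher so the curve is informative.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)

# Dummy stand-ins: a batch of 4 score maps (higher = more anomalous)
# and matching binary ground-truth masks of the same spatial size.
score_map = rng.random((4, 32, 32))
gt = (rng.random((4, 32, 32)) > 0.9).astype(int)

# Shift anomalous pixels upward so scores correlate with the ground truth.
score_map = score_map + 0.5 * gt

# Flatten both to 1-D vectors, as in the snippet above.
y_score = score_map.reshape(-1)
y_true = gt.reshape(-1)

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"pixel-level AUROC: {auc(fpr, tpr):.3f}")
```

From fpr and tpr you can plot the ROC curve directly, or summarize it with the area under the curve as shown.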
