The code

```python
weights[inds] = tot / num_in_bin
loss = F.binary_cross_entropy_with_logits(
    input, target, weights, reduction='sum') / tot
```

is equivalent to setting `weights[inds] = 1 / num_in_bin`. Combined with `weights = weights / n`, the effective weight of a sample can be one percent or even one thousandth of its original weight when many samples fall into a single bin.

If there is something wrong with my understanding, please tell me.
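For reference, here is a minimal, self-contained sketch of the equivalence claimed above. The tensors, sizes, and the bin assignment are made up for illustration; only the two weighting formulas come from the snippet in question:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
input = torch.randn(8)                      # logits
target = torch.randint(0, 2, (8,)).float()  # binary labels

tot = float(input.numel())                  # total number of samples
# Hypothetical case: all 8 samples fall into one gradient-density bin.
inds = torch.ones(8, dtype=torch.bool)
num_in_bin = inds.sum().float()

# Formulation 1: weights scaled by tot, summed loss divided by tot.
w1 = torch.zeros(8)
w1[inds] = tot / num_in_bin
loss1 = F.binary_cross_entropy_with_logits(
    input, target, w1, reduction='sum') / tot

# Formulation 2: weights set to 1 / num_in_bin directly.
w2 = torch.zeros(8)
w2[inds] = 1.0 / num_in_bin
loss2 = F.binary_cross_entropy_with_logits(
    input, target, w2, reduction='sum')

print(torch.allclose(loss1, loss2))  # True: the two forms match
```

With `num_in_bin = 1000`, each sample in that bin would contribute with weight `1/1000`, which is the down-weighting described above.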
Your understanding is right, and our goal is exactly to make the weights of these samples small. The motivation and details can be found in our paper: https://arxiv.org/abs/1811.05181