According to the implementation in classification_tasks/utils_eval.py, the feature matrix phi is centered as follows:
Line 502: phi = phi - torch.mean(phi, axis=1, keepdims=True)
However, assuming phi has shape (D, f), where D is the dataset size and f is the feature size, the mean along axis 1 has shape (D, 1). In my opinion, we want the mean feature to have shape (1, f), so the mean should be taken across the dataset, i.e. along axis=0.
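For illustration, here is a minimal sketch of the two options (hypothetical sizes, not the repository's code; written with PyTorch's canonical dim/keepdim argument names):

```python
# Minimal sketch contrasting the two centering options for a feature
# matrix phi of shape (D, f): D samples, f features (illustrative values).
import torch

D, f = 8, 5
phi = torch.randn(D, f)

# Current behaviour: the mean over dim=1 has shape (D, 1), i.e. a per-sample
# mean, so every row is shifted by its own average.
per_sample_mean = torch.mean(phi, dim=1, keepdim=True)   # shape (D, 1)

# Suggested behaviour: the mean over dim=0 has shape (1, f), i.e. the mean
# feature vector over the dataset, so each feature dimension is centered.
mean_feature = torch.mean(phi, dim=0, keepdim=True)      # shape (1, f)
phi_centered = phi - mean_feature

# Each feature column now has (approximately) zero mean across the dataset.
assert torch.allclose(phi_centered.mean(dim=0), torch.zeros(f), atol=1e-6)
```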
Kindly look into it.
Good point. Centering along axis=0 would be more appropriate. Since BatchNorm is used throughout the convolutional networks, I would assume this doesn't have much influence, but I will double-check.