Thanks for your interest.
The code "nn.Sequential(*[TransformerBlock ...])" will produce a fixed encoder layer and "HyLevel(...)" produces a dynamical layer. The word "fixed" in the paper means that the weights of this module is fixed during inference.
For the 5-degradation setting, you can refer to the code of IDR. We may upload this in a new version.
Interesting work!
How did you freeze the encoders? I can't tell which part of your implementation handles this operation.