Description of the bug:
When adding a torch.nn.ReLU6 activation after a Conv2D, the converter doesn't fuse it with the Conv2D.
Only by tweaking ReLU6 slightly is it fused at all, and even then an additional Relu is appended after it.
Actual vs expected behavior:
The conversion produces the sequence Conv2D -> Min -> Relu.
The expected output is a single Conv2D with the ReLU6 activation fused into it.
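For reference, ReLU6 is min(max(x, 0), 6), i.e. a Relu followed by a Min with the constant 6. A minimal sketch showing that this decomposition matches the unfused Min/Relu pair the converter emits:

```python
import torch

x = torch.linspace(-2.0, 8.0, 6)  # spans negative, in-range, and >6 values

# ReLU6 as a single clamp: min(max(x, 0), 6)
relu6 = torch.clamp(x, 0.0, 6.0)

# The same function decomposed into Relu followed by Min,
# mirroring the separate ops seen in the converted graph
decomposed = torch.minimum(torch.relu(x), torch.tensor(6.0))

assert torch.equal(relu6, decomposed)
```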
Any other information you'd like to share?
Sample code to reproduce:
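A minimal PyTorch sketch of the two variants being compared (the exact model and converter call from the original report are not shown, so the names here are illustrative): a standard nn.ReLU6 after Conv2D, and the "tweaked" custom ReLU6 expressed as a Relu followed by a clamp at 6.

```python
import torch
import torch.nn as nn

class ConvReLU6(nn.Module):
    """Standard pattern: Conv2D followed by nn.ReLU6 (expected to fuse)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.act = nn.ReLU6()

    def forward(self, x):
        return self.act(self.conv(x))

class ConvCustomReLU6(nn.Module):
    """Hypothetical 'tweaked' variant: Relu then min with 6, written explicitly."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, x):
        return torch.clamp(torch.relu(self.conv(x)), max=6.0)

x = torch.randn(1, 3, 16, 16)
m1, m2 = ConvReLU6().eval(), ConvCustomReLU6().eval()
m2.load_state_dict(m1.state_dict())  # share weights so outputs match

with torch.no_grad():
    assert torch.allclose(m1(x), m2(x))

# Each model would then be converted with the converter under test to produce
# relu6.tflite and custom_relu6.tflite respectively (converter call omitted).
```

Both models are numerically identical, so any difference in the converted graphs comes from the fusion pass, not the math.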
LEFT: relu6.tflite
RIGHT: custom_relu6.tflite