platform: Jetson AGX Orin 64GB
OS: 5.1.2
DLA: 3.12.1
Sigmoid layers are used as the output of the model, and the input & output shape of the sigmoid is (8, 3, 88, 160). I found that the accuracy of the fp16 DLA model drops a lot when I use sigmoid as the output layer. However, the outputs are consistent with the torch outputs if the sigmoid is removed, with the cosine similarity close to 1.
I want to know what the limitations on the use of sigmoid layers are.
Why does this loss of precision occur?
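For reference, here is a minimal NumPy sketch of the kind of comparison described above: a cosine similarity between an fp32 sigmoid output and the same sigmoid evaluated in fp16, on a tensor shaped like the one in this issue. The shapes and the random logits are illustrative assumptions; this only emulates fp16 rounding on the host and does not reproduce whatever internal approximation the DLA uses for sigmoid.

```python
import numpy as np

def sigmoid(x):
    # Elementwise logistic function; the dtype of x controls the precision
    # the computation is carried out in.
    return 1.0 / (1.0 + np.exp(-x))

def cosine_similarity(a, b):
    # Flatten and accumulate in float64 so the metric itself adds no noise.
    a = a.ravel().astype(np.float64)
    b = b.ravel().astype(np.float64)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical logits with the shape from the issue: (8, 3, 88, 160).
rng = np.random.default_rng(0)
logits = rng.normal(0.0, 4.0, size=(8, 3, 88, 160)).astype(np.float32)

ref_fp32 = sigmoid(logits)                     # fp32 reference path
out_fp16 = sigmoid(logits.astype(np.float16))  # fp16 path (rounded inputs + math)

sim = cosine_similarity(ref_fp32, out_fp16.astype(np.float32))
max_err = float(np.max(np.abs(ref_fp32 - out_fp16.astype(np.float32))))
print("cosine similarity:", sim)
print("max abs error:", max_err)
```

Plain fp16 rounding alone keeps the cosine similarity very close to 1 here, which is consistent with the observation that the large accuracy drop appears only when the sigmoid runs on the DLA, not from fp16 storage by itself.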
@Railcalibur Is this issue still occurring with the latest JetPack 6.0? Also, does it occur with batch size 1, such that the shape is (1, 3, 88, 160)? Thanks!