Plan to support ONNX #83
Comments
Thanks for raising this question; we plan to support ONNX export for a subset of DETR-like models in the future, because some custom operators like |
Excuse me, when will you launch ONNX export? |
We will start to explore detrex ONNX export these days! Please stay tuned : ) |
@rentainhe Any updates on this? I've successfully exported MaskDINO to ONNX, but something is broken: the exported model is way worse and smaller. Still analyzing... |
@powermano has provided a detailed usage in issue #192; maybe you can refer to that for more details, @ichitaka. We did not have time to export MaskDINO to ONNX these days, but we will try to figure it out in the future, very sorry. |
I'm also very interested in MaskDINO to ONNX, have you used @powermano's script? @ichitaka |
@alrightkami hi, do you have a script to export MaskDINO to ONNX? Thanks. |
I have tried converting DETA from PyTorch to ONNX for TensorRT inference. There is a workaround solution for MSMHDA. |
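For context on that workaround: the multi-scale deformable attention CUDA kernel is what blocks a direct export, and the usual fix is to swap in a pure-PyTorch path built on `F.grid_sample`, which ONNX supports as `GridSample` from opset 16. A simplified single-scale sketch (illustrative only; the real op additionally aggregates multiple feature levels and attention weights):

```python
import torch
import torch.nn.functional as F

def deform_sample(value, sampling_locations):
    """Single-scale deformable sampling via grid_sample (illustrative).

    value: (N, C, H, W) feature map
    sampling_locations: (N, Lq, P, 2) normalized to [0, 1]
    returns: (N, C, Lq, P) sampled features
    """
    # grid_sample expects coordinates in [-1, 1]
    grid = 2 * sampling_locations - 1
    return F.grid_sample(value, grid, mode="bilinear",
                         padding_mode="zeros", align_corners=False)

value = torch.randn(2, 8, 16, 16)       # N=2, C=8, 16x16 feature map
locs = torch.rand(2, 100, 4, 2)         # 100 queries, 4 sampling points each
out = deform_sample(value, locs)
print(out.shape)  # torch.Size([2, 8, 100, 4])
```

Since every op here is a built-in, the resulting module traces and exports without custom symbolics; the trade-off is that it is slower than the fused CUDA kernel, which is why people then move to TensorRT plugins for deployment.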
Anyone had success with exporting Mask Dino to ONNX or any other format for deployment? |
@SergiyShebotnov I successfully deployed it using TorchTensorRT. Had to add the custom CUDA kernels by rewriting them a bit. It was a bit of work. Sadly can't share it, since my old customer owns that. Look at the custom modules sections in the docs. If you know C++, you should be fine. |
Can you share a little bit of detail about which part should be converted to C++? (MSMHDA perhaps?) |
Yeah, it's the only module needed.
|
I'm trying to export MaskDINO to TorchScript (using trace) following Detectron2's deployment guide, but the traced model does not generate the same results, starting from the backbone (Swin) inference step. Does anyone know whether the Swin backbone can be correctly traced? Thanks. |
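One quick way to localize where a trace diverges is to compare traced versus eager outputs module by module. A minimal sketch of the check, with a toy module standing in for the Swin backbone; note that `torch.jit.trace` bakes in data-dependent control flow, and Swin's window padding depends on the input size, so a trace is generally only valid for shapes that take the same branches as the example input:

```python
import torch

# Toy stand-in for the backbone; substitute the real Swin module in practice.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, 3, padding=1),
    torch.nn.ReLU(),
).eval()

x = torch.randn(1, 3, 32, 32)
traced = torch.jit.trace(model, x)

with torch.no_grad():
    eager_out = model(x)
    traced_out = traced(x)

# If this fails for the real backbone, bisect by tracing submodules one at a time.
torch.testing.assert_close(eager_out, traced_out)
```

Running the comparison at the same input resolution used for tracing rules out shape-dependent branching as the cause; if outputs still differ, the mismatch is likely in a specific submodule and bisection will find it.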
Hello,
Thanks for your release of DETR-like models! As far as I know, DETR can be converted to ONNX now; do you have plans to support ONNX export for these DETR-like models as well?
I am looking forward to your reply, thanks!