
ONNX session conversion taking long time if model has more estimators #1128

Open

hanzigs opened this issue Oct 2, 2024 · 9 comments

hanzigs commented Oct 2, 2024

Can I have some help on this, please?
I have two pyod Isolation Forest models, one with 100 estimators and another with 750 estimators.
Both ModelProto files are attached as compressed text files:
IsolationForestestimators.zip

versions

Windows
python 3.10.13
onnx==1.13.1
onnxconverter-common==1.13.0
onnxmltools==1.11.2
onnxruntime==1.14.1
skl2onnx==1.14.0
tf2onnx==1.13.0

Code

import time
import numpy as np
import onnx
import onnxruntime

testData = np.array([[7.290587738061329e-05, 0.3673094582185491, 0.0535714285714286, 0.2941176470588235]], dtype=np.float32)

ONNXModel = onnx.load("model.onnx")  # placeholder path; the ModelProto comes from the attached text file

start = time.time()

content = ONNXModel.SerializeToString()
sess = onnxruntime.InferenceSession(content)  # This part is taking a long time

input_name = sess.get_inputs()[0].name
label_name = sess.get_outputs()[0].name

pred_onnx = sess.run([label_name], {input_name: testData})[0]
print(pred_onnx)

end = time.time()
print('Time :', end - start, 'Seconds')  # in seconds

I'm getting the times below:
[screenshot: timing results for both models]

The session creation is taking a long time; is this anything to do with the library?
I'm using this for the pyod model to ONNX conversion: https://github.com/onnx/sklearn-onnx/blob/main/docs/tutorial/plot_wext_pyod_forest.py
Any help is much appreciated.


xadupre commented Oct 2, 2024

Is it possible to know which part of your script is taking most of the time? This PR microsoft/onnxruntime#22043 should reduce the loading time in onnxruntime.


hanzigs commented Oct 2, 2024

Thank you for the reply @xadupre.
The session creation is taking the time:

sess = onnxruntime.InferenceSession(content)


xadupre commented Oct 2, 2024

Ok, then I think the recent code change I made in onnxruntime should solve it.


hanzigs commented Oct 2, 2024

Thank you. May I please know which pip version has the change? I'm using onnxruntime==1.14.1 and can see 1.19.2 is available on pip, but which version is compatible with my version set above without dependency issues affecting the environment?
My Visual Studio environment has the package set below:
[screenshot: installed package versions]


xadupre commented Oct 3, 2024

Release 1.20 is scheduled before December. I'm not aware of any dependency issue.


hanzigs commented Oct 3, 2024

May I please know which version after 1.14.1 has the fix for that issue, so I can run through it?


xadupre commented Oct 3, 2024

It will be available in the next release 1.20.


hanzigs commented Oct 7, 2024

I'm waiting for that release; some of the models are taking a very long time for prediction.


hanzigs commented Nov 18, 2024

Hi @xadupre,
I tried onnxruntime==1.20.0:

Windows
python 3.10.13
onnx==1.13.1
onnxconverter-common==1.13.0
onnxmltools==1.11.2
onnxruntime==1.20.0
skl2onnx==1.14.0
tf2onnx==1.13.0

The timing didn't change much.
This is with 1.20.0, for the models with 100 estimators and 750 estimators:
[screenshot: timing results with 1.20.0]
This is with 1.14.1, from the first comment:
[screenshot: timing results with 1.14.1]
