# 1. User Guide && Common Issues
Zero Zeng edited this page Mar 16, 2022
Take a look at test/test.cpp or test/test.py for complete examples.

```cpp
// For detailed usage, refer to Trt.h; it is well commented.
#include "Trt.h"

Trt trt;
// Create the engine and execution context. Note that an engine file is
// device specific, so don't copy an engine file to a new device: it may crash.
trt.BuildEngine(onnx_model, engineFile); // for an ONNX model
// You might need to do some pre-processing on the input, such as
// normalization; it depends on your model.
trt.CopyFromHostToDevice(input, 0); // 0 is the input binding index; you can get it from the CreateEngine phase log output.
// Run the model: it reads your input, runs inference, and generates output.
trt.Forward();
// Get the output.
trt.CopyFromDeviceToHost(output, outputIndex); // you can get outputIndex from the CreateEngine phase log
// Then you can do post-processing on the output.
```
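The pre-processing mentioned above usually means converting your input into the layout and value range the model expects. A minimal NumPy sketch, assuming a model that takes CHW float32 input with per-channel mean/std normalization (the mean and std values below are the common ImageNet ones, chosen here as an illustration, not something this library prescribes):

```python
import numpy as np

def preprocess(image_hwc_uint8):
    """Normalize an HWC uint8 image to CHW float32 (assumed model layout)."""
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)  # ImageNet mean (assumption)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)   # ImageNet std (assumption)
    x = image_hwc_uint8.astype(np.float32) / 255.0   # scale to [0, 1]
    x = (x - mean) / std                             # per-channel normalization
    return np.ascontiguousarray(x.transpose(2, 0, 1))  # HWC -> CHW, contiguous for the device copy

# Example: a dummy 4x4 RGB image
img = np.zeros((4, 4, 3), dtype=np.uint8)
blob = preprocess(img)
print(blob.shape)  # (3, 4, 4)
```

The resulting contiguous float32 array is what you would pass to `CopyFromHostToDevice`.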
```python
import sys
sys.path.append("path/to/where_pytrt.so_located/")
import pytrt
# For detailed usage, try uncommenting the next line:
# help(pytrt)

trt = pytrt.Trt()
trt.BuildEngine(onnx_model, engineFile)
trt.CopyFromHostToDevice(input_numpy_array, 0)   # 0 is the input binding index
trt.Forward()
output_numpy_array = trt.CopyFromDeviceToHost(1)  # 1 is the output binding index
# post-processing
```
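The post-processing step depends entirely on your model; for a classifier it is often just a softmax plus argmax over the raw logits the engine returns. A minimal sketch, assuming the output array holds a flat vector of class logits (the function name and the sample logits are made up for illustration):

```python
import numpy as np

def postprocess_classification(logits):
    """Turn raw logits into (class_index, probability) via a numerically stable softmax."""
    z = logits - np.max(logits)            # subtract the max for numerical stability
    probs = np.exp(z) / np.sum(np.exp(z))  # softmax
    top = int(np.argmax(probs))            # most likely class
    return top, float(probs[top])

# Example with made-up logits
cls, prob = postprocess_classification(np.array([0.1, 2.5, 0.3], dtype=np.float32))
print(cls)  # 1
```

For detection or segmentation models the equivalent step would instead decode boxes or masks from the output tensors.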