When running deserialize_cuda_engine, the engine returns None.
I used a Keras model that I converted into an ONNX file and then converted again into a TRT file.
I used this GitHub page as a reference: keras_imagenet/README_tensorrt.md at master · jkjung-avt/keras_imagenet · GitHub
and used these models:
eyemodeleasier.h5 (12.7 MB)
eyemodeleasier.onnx (4.2 MB)
eyemodeleasier1.trt (4.3 MB)
Thanks in advance.
NVES (January 20, 2022, 8:08am):
Hi,
Request you to share the ONNX model and the script, if not shared already, so that we can assist you better.
Alongside, you can try a few things:
1) Validate your model with the below snippet:
check_model.py
import onnx

model = onnx.load("your_model.onnx")  # replace with the path to your ONNX model
onnx.checker.check_model(model)
2) Try running your model with the trtexec command.
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
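trtexec prints its --verbose log to the console (stdout/stderr) rather than writing a log file, so redirect the output to capture it. A sketch, with the ONNX and engine filenames assumed from the post above:

```shell
# Build the engine from the ONNX model with verbose logging,
# capturing the console output (stdout and stderr) into a log file.
trtexec --onnx=eyemodeleasier.onnx --saveEngine=eyemodeleasier1.trt --verbose > trtexec_verbose.log 2>&1
```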
In case you are still facing the issue, request you to share the trtexec --verbose log for further debugging.
Thanks!
Hello,
I have already uploaded all the files in the post above.
I ran check_model.py with print(onnx.checker.check_model(model)) and it prints None.
For trtexec --verbose: where is the log file located after running the ONNX model with trtexec --verbose? Thanks.
Hi,
This looks like a CUDA context related issue while using the model for inference. We recommend you share repro scripts and complete error logs so that we can try from our end for better debugging.
Also, please refer to the following samples and make sure your inference script is correct.
This Samples Support Guide provides an overview of all the supported NVIDIA TensorRT 8.4.3 samples included on GitHub and in the product package. The TensorRT samples specifically help in areas such as recommenders, machine comprehension, character...
Thank you.