Error while running Triton server

I was trying out Triton Inference Server by following this documentation: https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/user_guide/performance_tuning.html. I pulled the Docker image directly by appending the XX.YY-py3-sdk-igpu tag. I followed the steps line by line and got this error.
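For reference, the pull command was roughly the following (XX.YY is the release placeholder from the documentation; I am assuming the standard NGC registry path applies here):

```
# Pull the Triton container with the tag appended
# (XX.YY is the release placeholder from the docs, left as-is)
docker pull nvcr.io/nvidia/tritonserver:XX.YY-py3-sdk-igpu
```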

Additionally, I also followed the Concurrent inference and dynamic batching — NVIDIA Triton Inference Server tutorial. I ran into a problem at this step: “chmod 777 tao-converter”.
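That step just makes the downloaded binary executable; as a minimal sketch, assuming tao-converter was already downloaded into the current working directory as in the tutorial:

```
# Make the downloaded tao-converter binary executable
# (assumes the binary is in the current working directory)
chmod 777 tao-converter

# A less permissive alternative that achieves the same result:
chmod +x tao-converter
```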

Please help me get Triton Inference Server working.
For additional reference, I am attaching a jtop screenshot.

Thank you

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

This issue would be outside of DeepStream.
The container you are using is a Triton client container. Please refer to the explanation in the link; there is no tritonserver binary in the client container.
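As a rough sketch of the difference, assuming the standard NGC tag scheme (XX.YY is the release placeholder, and the -igpu variants are my assumption of the Jetson builds relevant here):

```
# Server container: includes the tritonserver binary
docker pull nvcr.io/nvidia/tritonserver:XX.YY-py3-igpu

# Client (SDK) container: client tools such as perf_analyzer only,
# no tritonserver binary inside
docker pull nvcr.io/nvidia/tritonserver:XX.YY-py3-igpu-sdk
```

Starting the server requires the server image; the SDK image is only intended for client-side tools.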

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.