Use of LangChain and LangGraph on Jetson Orin AGX

Hi everyone,

I’m exploring the possibility of using LangChain and LangGraph to implement a reasoning system powered by a LLaMA-based LLM, with plans to integrate additional agents for tasks like computer vision, robotics, and more. However, I’m relatively new to both Jetson Orin AGX and these frameworks, so I could use some guidance.

Would LangChain or LangGraph be a good fit for the Jetson Orin AGX platform? My main concerns are:

  1. Performance – Can they take full advantage of the Jetson Orin’s GPU acceleration?
  2. Suitability – Are these frameworks a good match for deploying multi-agent systems on this hardware, or are there better alternatives?

I’m also interested in hearing about any libraries or tools that might be a better fit for running LLMs and multi-agent setups directly on the Jetson Orin AGX, especially ones that can integrate well with Nvidia’s SDKs or leverage GPU acceleration effectively.

Any insights, advice, or pointers to resources would be greatly appreciated!

Thanks in advance!

Hi,
Here are some suggestions for common issues:

1. Performance

Please run the commands below before benchmarking a deep-learning use case. They set the board to its maximum power mode and lock the clocks at their highest frequencies (you can check the current power mode with `sudo nvpmodel -q`):

$ sudo nvpmodel -m 0
$ sudo jetson_clocks

2. Installation

Installation guides for deep-learning frameworks on Jetson:

3. Tutorial

Getting-started deep-learning tutorial:

4. Report issue

If these suggestions don’t help and you want to report an issue to us, please share the model, the commands/steps, and any customized app so we can reproduce the issue locally.

Thanks!

Would you be able to provide more details about LangChain’s compatibility with the NVIDIA ecosystem? Also, thank you so much for the resources!

Hi,

On Jetson, we recommend using MLC or TensorRT natively.
You can find some tutorials and benchmark results from our lab below:

Thanks.
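
To make that concrete: serving stacks like MLC-LLM (`mlc_llm serve`) and llama.cpp (`llama-server`) can expose an OpenAI-compatible HTTP endpoint on the Jetson, and LangChain can then point at it through its OpenAI client by overriding the base URL. The sketch below shows the kind of request such a server expects, using only the Python standard library; the host, port, and model name are placeholders, not a fixed part of any of these tools.

```python
import json
from urllib import request

# Placeholder endpoint: adjust host/port to match your local server
# (e.g. one started with `mlc_llm serve` or llama.cpp's `llama-server`).
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str, model: str = "local-llama") -> request.Request:
    """Build an OpenAI-style /chat/completions request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello from Jetson")
print(req.full_url)

# To actually send it, a server must be listening on BASE_URL:
# with request.urlopen(req) as resp:
#     reply = json.load(resp)
```

Since the endpoint speaks the OpenAI protocol, frameworks such as LangChain or LangGraph only need the base URL swapped; the heavy lifting (GPU-accelerated inference) stays inside the native serving stack, which is why we recommend MLC or TensorRT for the model itself.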


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.