@qiangqiangsir the PyTorch 1.11 wheel was the last one to be built with USE_DISTRIBUTED enabled:
PyTorch v1.11.0
- JetPack 5.0 (L4T R34.1) / JetPack 5.0.2 (L4T R35.1) / JetPack 5.1 (L4T R35.2.1) / JetPack 5.1.1 (L4T R35.3.1)
  - Python 3.8 - torch-1.11.0-cp38-cp38-linux_aarch64.whl
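You can confirm at runtime whether the wheel you have installed was built with distributed support by checking PyTorch's public `torch.distributed.is_available()` API:

```python
import torch

# Returns True only if this build of PyTorch was compiled with
# USE_DISTRIBUTED=1; it does not require a process group to be set up.
print(torch.distributed.is_available())
```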
If you require a newer version of PyTorch with distributed enabled, please see this thread for instructions on building PyTorch from source:
Alternatively, you may be able to disable distributed mode in the mmpretrain library you are using - a rough sketch of that is below.
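For example, here is a minimal sketch of launching training in a single process, assuming your mmpretrain version uses MMEngine's `Runner` and honors the `launcher` config setting (the config path below is just a placeholder - substitute your own):

```python
from mmengine.config import Config
from mmengine.runner import Runner

# Placeholder config path - use the config you are actually training with
cfg = Config.fromfile('configs/resnet/resnet18_8xb32_in1k.py')
cfg.launcher = 'none'  # single-process mode, skips torch.distributed init

runner = Runner.from_cfg(cfg)
runner.train()
```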