PeopleNet Inference Not Working on Jetson Orin Nano
Issue Overview
A user recently purchased a Jetson Orin Nano and flashed it with JetPack 5.1.3. Despite their efforts, they are unable to get PeopleNet, a pre-trained object detection model, to function properly. The issue appears to be related to the setup and execution of the PeopleNet model from NVIDIA’s TAO (Train, Adapt, Optimize) Toolkit on the Jetson Orin Nano platform.
Possible Causes
- Incomplete or incorrect installation of the jetson-inference library
- Missing dependencies for running TAO models
- Incompatibility between the installed JetPack version and the PeopleNet model
- Incorrect configuration or usage of the PeopleNet model
- Hardware-specific issues with the Jetson Orin Nano
- Corrupted model files or incorrect file paths
Troubleshooting Steps, Solutions & Fixes
- Verify jetson-inference installation:
  - Ensure that the jetson-inference library is properly installed on your system. Follow these steps to build the project from source:
    sudo apt-get update
    sudo apt-get install git cmake libpython3-dev python3-numpy
    git clone --recursive --depth=1 https://github.com/dusty-nv/jetson-inference
    cd jetson-inference
    mkdir build
    cd build
    cmake ../
    make -j$(nproc)
    sudo make install
    sudo ldconfig
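  - As a quick sanity check after the build, you can run one of the bundled examples with the default network; this assumes the sample images that ship with the repo (such as images/peds_0.jpg) and that sudo make install put the binaries on your PATH:
    cd jetson-inference/build/aarch64/bin
    ./detectnet images/peds_0.jpg peds_output.jpg
  - If this default SSD-Mobilenet-v2 test succeeds, the library itself is working and the problem is more likely specific to the PeopleNet model.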
- Check PeopleNet model availability:
  - Confirm that the PeopleNet model is correctly downloaded and accessible. The model should be available as part of the jetson-inference package.
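  - If the model files appear to be missing, the repo’s Model Downloader tool can be re-run to fetch them (this assumes you built from source as in the previous step):
    cd jetson-inference/tools
    ./download-models.sh
  - Depending on the version of jetson-inference, the TAO models (including PeopleNet) may instead be downloaded automatically the first time you request them with --model=peoplenet.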
- Verify model usage:
  - Ensure you’re using the correct command-line argument for PeopleNet: use peoplenet or peoplenet-pruned, depending on which version you intend to use.
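  - For example, the pruned variant (assuming your jetson-inference build exposes these model names, as in the sample command in the next step) would be selected with:
    detectnet --model=peoplenet-pruned /path/to/your/image.jpg output.jpg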
- Test with a sample image:
  - Try running the PeopleNet model on a sample image to isolate any issues:
    detectnet --model=peoplenet --input_blob=input_0 --output_cvg=scores --output_bbox=boxes /path/to/your/image.jpg output.jpg
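  - Once the still-image test passes, you can also point detectnet at a live camera to confirm real-time inference; the stream URIs below are the standard jetson-inference ones and assume a CSI or V4L2 USB camera is attached:
    detectnet --model=peoplenet csi://0
    detectnet --model=peoplenet /dev/video0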
- Check system logs:
  - Examine the system logs for any error messages or warnings related to the PeopleNet model or jetson-inference library:
    journalctl -xe
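  - Because journalctl -xe is verbose, it can help to filter for GPU- and inference-related messages (the keywords below are only a starting point):
    journalctl -xe | grep -iE "cuda|tensorrt|nvinfer|detectnet"
  - Also check the console output of detectnet itself, since TensorRT engine-building errors are usually printed there rather than in the system journal.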
- Verify CUDA and cuDNN installation:
  - Ensure that CUDA and cuDNN are correctly installed and configured on your Jetson Orin Nano. You can check the versions with:
    nvcc --version
    dpkg -l | grep -i cudnn
  - Note that on JetPack the cuDNN headers are not installed under /usr/local/cuda/include, and since cuDNN 8 the version macros live in cudnn_version.h rather than cudnn.h, so grepping CUDNN_MAJOR out of /usr/local/cuda/include/cudnn.h will typically come up empty on a Jetson; the package query above is the more reliable check.
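  - Since jetson-inference runs PeopleNet through TensorRT, it is also worth confirming that the TensorRT packages and Python bindings that ship with JetPack are present (a quick check under that assumption):
    dpkg -l | grep -i tensorrt
    python3 -c "import tensorrt; print(tensorrt.__version__)"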
- Update JetPack:
  - Consider updating to the latest JetPack release if you’re not already using it; this can resolve compatibility issues:
    sudo apt update
    sudo apt upgrade
  - Keep in mind that apt can only move you to newer point releases within your current JetPack major version; moving between major versions (for example from JetPack 5.x to 6.x) requires re-flashing the board with SDK Manager.
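  - To confirm which JetPack/L4T release you are actually running, you can query the standard JetPack meta-package and the L4T release file:
    apt show nvidia-jetpack
    cat /etc/nv_tegra_release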
- Check for model-specific issues:
  - Visit the TAO PeopleNet model page to check for any known issues or specific requirements for the Jetson Orin Nano.
- Verify Python environment:
  - Ensure you’re using the correct Python environment if you’re running PeopleNet through a Python script. Check your Python path and installed packages:
    which python3
    pip list
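  - If the command-line detectnet works but your own script does not, first confirm that the Python bindings import cleanly:
    python3 -c "import jetson_inference, jetson_utils"
  - A minimal test script along the following lines can then help narrow things down. This is only a sketch: it assumes the Python bindings were built along with the library and that they accept peoplenet as a network name, mirroring the CLI argument above; the image paths are placeholders.
    from jetson_inference import detectNet
    from jetson_utils import loadImage, saveImage

    net = detectNet("peoplenet", threshold=0.5)   # same model name as the CLI argument
    img = loadImage("/path/to/your/image.jpg")    # placeholder input path
    detections = net.Detect(img)                  # overlays boxes and labels on img by default
    print("detected {:d} objects".format(len(detections)))
    saveImage("output.jpg", img)                  # placeholder output path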
- Test with other TAO models:
  - Try running other TAO models, such as DashCamNet or TrafficCamNet, to determine whether the issue is specific to PeopleNet or affects all TAO models.
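  - For example, assuming your build exposes the TAO models under the names dashcamnet and trafficcamnet (mirroring the peoplenet usage above):
    detectnet --model=dashcamnet /path/to/your/image.jpg output_dashcam.jpg
    detectnet --model=trafficcamnet /path/to/your/image.jpg output_trafficcam.jpg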
If these steps do not resolve the issue, consider posting a more detailed description of the problem, including any error messages you encounter, in the NVIDIA Developer Forums for further assistance.