NXP NNStreamer examples

The purpose of this repository is to provide, for demonstration purposes, functional examples of GStreamer/NNStreamer-based pipelines optimized and validated for designated NXP i.MX application processors.

How to run examples

Models and metadata download

Models and metadata files used by the examples are not archived in this repository. They therefore have to be downloaded over the network before running the examples on the target. Download these files from the host PC by running the Jupyter Notebook download.ipynb. Refer to the download instructions for details.

Execution on target

Python

Once the models have been fetched locally on the host PC, the repository contains both the examples and the downloaded artifacts, and can be uploaded to the target board to run individual examples. The complete repository can be uploaded from the host PC to the target with a regular scp command, or only the necessary directories with the upload.sh script provided for the host:

# replace <target ip address> with the relevant value
cd /path/to/nxp-nnstreamer-examples
./tools/upload.sh root@<target ip address>
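As mentioned above, the complete repository can also be uploaded with a plain scp command; a minimal sketch (the trailing ':' targets the remote user's home directory on the board, an assumption to adjust as needed):

```shell
# replace <target ip address> with the relevant value
# -r copies the repository recursively, including downloaded artifacts
scp -r /path/to/nxp-nnstreamer-examples root@<target ip address>:
```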

C++ with cross-compilation

1- Fetch the models locally on the host PC.
2- Build a Yocto BSP SDK for the dedicated i.MX platform. Refer to the imx-manifest to set up the correct build environment. The SDK needs to be built with bitbake, using the imx-image-full image and the populate_sdk task, as follows:

bitbake imx-image-full -c populate_sdk

3- Once successfully generated, the Yocto BSP SDK installer script located in /path/to/yocto/bld-xwayland/tmp/deploy/sdk/ must be executed:

chmod a+x fsl-imx-<backend>-glibc-x86_64-imx-image-full-armv8-<machine>-toolchain-<release>.sh
./fsl-imx-<backend>-glibc-x86_64-imx-image-full-armv8-<machine>-toolchain-<release>.sh

4- Source the SDK environment:
NOTE: the SDK is installed by default in /opt/fsl-imx-xwayland/<LF_version>/

. /path/to/sdk/environment-setup-armv8-poky-linux
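Sourcing the script exports the cross-compilation variables (CC, CXX, SDKTARGETSYSROOT, among others; this is standard Yocto SDK behavior). A quick sanity check, as a sketch:

```shell
# After sourcing the SDK environment, CC should point to the AArch64
# cross-compiler; an empty CC means the script was not sourced.
if [ -n "${CC:-}" ]; then
    echo "cross compiler: $CC"
else
    echo "CC is not set, source the SDK environment script first"
fi
```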

5- Compile C++ examples with CMake:

cd /path/to/nxp-nnstreamer-examples
mkdir build && cd $_
cmake ..
make
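To confirm the build actually cross-compiled (rather than picking up the host toolchain), the produced binary can be inspected with file; the classification example path below matches the scp step that follows:

```shell
# A cross-compiled example should be an AArch64 ELF, not x86-64.
BIN=./classification/example_classification_mobilenet_v1_tflite
if [ -f "$BIN" ]; then
    file "$BIN"
else
    echo "binary not found, run the CMake build first"
fi
```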

6- Push the required artifacts to their expected folder on the board (the scp command can be used for this purpose):
NOTE: the path to the folder containing the data can be changed in the CMakeLists.txt file, as can the model and label names in the C++ example source file.

# Send the classification example to the target, replacing <target ip address> with the relevant value
scp ./classification/example_classification_mobilenet_v1_tflite root@<target ip address>:

The C++ examples use a set of high-level classes to build pipelines optimized for NXP boards, taking advantage of NXP hardware acceleration. A description of how to use these classes can be found here.

Compile models on target for i.MX 93

Quantized TFLite models must be compiled with vela for the i.MX 93 Ethos-U NPU. This must be done directly on the target:

cd /path/to/nxp-nnstreamer-examples
./downloads/compile_models.sh
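For reference, compile_models.sh presumably invokes the vela compiler on each quantized model; compiling a single model by hand typically looks like the sketch below, where the model file name and output directory are placeholders:

```shell
# Hypothetical manual invocation of vela for one model
vela /path/to/model_quant.tflite --output-dir ./downloads/models
# vela writes model_quant_vela.tflite into the output directory
```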

The examples can then be run directly on the target. More information on running individual examples is available in the relevant sections.

Tasks

Note that examples may not run on all platforms; check the table below for platform compatibility.

Available examples

| Name | Platforms | Implementations | Models | ML engine | Features |
|---|---|---|---|---|---|
| Object Detection | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | Bash, C++ | MobileNet SSD, YOLOv4 Tiny | TFLite | v4l2/libcamera, gst-launch, custom python tensor_filter |
| Classification | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | Bash, C++ | MobileNet | TFLite | v4l2/libcamera, video file decoding, gst-launch |
| Semantic Segmentation | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | Bash, C++ | DeepLabV3 | TFLite | jpeg files slideshow, gst-launch |
| Pose Estimation | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | C++, Python | MoveNet | TFLite | v4l2/libcamera, video file decoding, gst-launch, custom model decoding |
| Face Processing | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | C++, Python | UltraFace-slim, FaceNet512, Deepface-emotion | TFLite | v4l2/libcamera, video file decoding, gst-launch, custom model decoding |
| Monocular Depth Estimation | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | C++ | MiDaS v2 | TFLite | v4l2/libcamera, video file decoding, gst-launch, custom model decoding |
| Mixed Demos | i.MX 8M Plus, i.MX 93, i.MX 95, i.MX 952 | C++ | MobileNet SSD, MobileNet, MoveNet, UltraFace-slim, Deepface-emotion | TFLite | v4l2/libcamera, video file decoding, gst-launch, custom model decoding, video file encoding |

Images and videos used have been released under the Creative Commons CC0 1.0 license or belong to the Public Domain. Individual attribution and license information for each image can be found in MEDIA_LICENSES.txt.

Use libcamera backend instead of v4l2 on i.MX 95

Run the "cam -l" command to find the camera path (/base/soc/<camera_path>), then export the following environment variables:

export LIBCAMERA_PIPELINES_MATCH_LIST='nxp/neo,imx8-isi,uvcvideo'
export CAMERA_BACKEND='libcamera'
export CAMERA_DEVICE='/base/soc/<camera_path>'

NOTE:

  • The default camera device path is set to the first camera found by the cam -l command.
  • USB cameras are currently not supported by the libcamera backend.
  • More information is available in the i.MX Linux User's Guide, libcamera section.
