Holoscan Sensor Bridge v2.0.0

Running Holoscan Sensor Bridge examples

Holoscan sensor bridge Python example applications are located under the examples directory.

Below are instructions for running the applications on the IGX and the Jetson AGX platforms.

  • Examples whose filenames start with “linux_” use the unaccelerated Linux Sockets API network receiver operator. These examples work on both IGX and AGX systems.

  • Examples without “linux_” in the filename use the accelerated network receiver operator and require ConnectX SmartNIC controllers, like those on IGX. AGX systems cannot run these examples.

  • All of these examples work in both iGPU and dGPU configurations: as long as the underlying OS and the Holoscan sensor bridge container are built with the appropriate iGPU or dGPU setting, the application code itself does not change.

Most examples are available in both an accelerated version and an unaccelerated Linux Sockets API version.
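The naming convention above can be sketched as a small helper. This is illustrative only; `select_example` is a hypothetical function, not part of the sensor bridge API:

```python
def select_example(base_name: str, accelerated: bool) -> str:
    """Pick the accelerated or Linux Sockets API variant of an example.

    Follows the naming convention above: unaccelerated variants carry a
    "linux_" prefix. `base_name` is the accelerated script's filename.
    """
    return base_name if accelerated else f"linux_{base_name}"

# On IGX with a ConnectX accelerated network controller:
print(select_example("imx274_player.py", accelerated=True))   # imx274_player.py

# On AGX (no accelerated network receiver support):
print(select_example("imx274_player.py", accelerated=False))  # linux_imx274_player.py
```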

To run the high-speed video player with IMX274, in the demo container with a ConnectX accelerated network controller,

$ python3 examples/imx274_player.py

or, for unaccelerated configurations (e.g. AGX),

$ python3 examples/linux_imx274_player.py

The C++ examples need to be built first using these commands; this leaves the resulting executables in /tmp/build/examples.

$ export BUILD_DIR=/tmp/build
$ cmake -S . -B $BUILD_DIR -G Ninja -DHOLOLINK_BUILD_PYTHON=OFF
$ cmake --build $BUILD_DIR -j $(nproc)

After examples are built, you can run the imx274_player:

$ $BUILD_DIR/examples/imx274_player

Note that the C++ example is only supported with the accelerated network receiver.

Documentation breaking down the source code for the IMX274 player application is available here; this example illustrates the basic sensor bridge workflow, which is described in the architecture documentation. Press Ctrl/C to stop the video player.

The tao-peoplenet example demonstrates running inference on a live video feed. TAO PeopleNet provides a model that, given an image, detects persons, bags, and faces. In this example, when those items are detected, bounding boxes are shown as an overlay on the live video.

Prerequisite: Download the PeopleNet ONNX model from the NGC website:

wget --content-disposition 'https://api.ngc.nvidia.com/v2/models/org/nvidia/team/tao/peoplenet/pruned_quantized_decrypted_v2.3.3/files?redirect=true&path=resnet34_peoplenet_int8.onnx' -O examples/resnet34_peoplenet_int8.onnx

For systems with accelerated network interfaces,

$ python3 examples/tao_peoplenet.py

or, for unaccelerated configurations,

$ python3 examples/linux_tao_peoplenet.py

This will bring up the Holoscan visualizer on the GUI showing the live video feed from the IMX274 device, as well as red/green bounding-box overlays when persons, bags, or faces are detected. Press Ctrl/C to exit. More information about this application can be found here.
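A hedged sketch of the kind of postprocessing that sits between inference and those overlays. The detection tuple layout, `CLASS_NAMES`, and `to_overlay_boxes` are assumptions for illustration; the real PeopleNet output is a grid of box deltas that needs decoding and non-maximum suppression first:

```python
# Hypothetical layout: each detection is (class_id, score, x1, y1, x2, y2)
# with corner coordinates normalized to [0, 1]. This shows only the score
# filter and the scaling to pixel coordinates used before drawing overlays.

CLASS_NAMES = {0: "person", 1: "bag", 2: "face"}  # PeopleNet's three classes

def to_overlay_boxes(detections, width, height, score_threshold=0.5):
    """Keep confident detections and convert them to pixel-space boxes."""
    boxes = []
    for class_id, score, x1, y1, x2, y2 in detections:
        if score < score_threshold:
            continue  # too uncertain to draw
        boxes.append((CLASS_NAMES[class_id],
                      round(x1 * width), round(y1 * height),
                      round(x2 * width), round(y2 * height)))
    return boxes

dets = [(0, 0.92, 0.10, 0.20, 0.30, 0.90),   # confident person -> kept
        (2, 0.30, 0.40, 0.10, 0.50, 0.25)]   # low-confidence face -> dropped
print(to_overlay_boxes(dets, 1920, 1080))
```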

Prerequisite: Download the YOLOv8 pose model from the YOLOv8 website and generate the body pose ONNX model. Within the Holoscan sensor bridge demo container, from the repo base directory holoscan-sensor-bridge:

apt-get update && apt-get install -y ffmpeg
pip3 install ultralytics onnx
cd examples
yolo export model=yolov8n-pose.pt format=onnx
trtexec --onnx=yolov8n-pose.onnx --saveEngine=yolov8n-pose.engine.fp32
cd -

Note that this conversion step only needs to be executed once; the yolov8n-pose.engine.fp32 file contains the converted model and is all that’s needed for the demo to run. The installed components will be forgotten when the container is exited; those do not need to be present in future runs of the demo.
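A launcher script could honor that run-once property with a simple existence check; `needs_conversion` here is a hypothetical helper, not part of the examples:

```python
import pathlib

def needs_conversion(engine_path: str) -> bool:
    """True when the TensorRT engine file has not been generated yet."""
    return not pathlib.Path(engine_path).exists()

# Path assumed relative to the repo base directory holoscan-sensor-bridge:
if needs_conversion("examples/yolov8n-pose.engine.fp32"):
    print("engine missing; run the export/trtexec commands above")
else:
    print("engine already present; skip the conversion step")
```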

For systems with accelerated network interfaces, within the sensor bridge demo container, launch the Body Pose estimation:

$ python3 examples/body_pose_estimation.py

For unaccelerated configurations (e.g. AGX), launch the Body Pose estimation example within the demo container this way:

$ python3 examples/linux_body_pose_estimation.py

This will bring up the Holoscan visualizer on the GUI showing the live video feed from the IMX274 device, along with a green overlay showing keypoints found by the body pose net model. For more information about this application, look here.

Press Ctrl/C to exit.
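The keypoint filtering behind that green overlay might look like the following sketch. The `(x, y, confidence)` layout follows YOLOv8-pose's 17-keypoint COCO output; `visible_keypoints` and the truncated name list are illustrative assumptions:

```python
# First 3 of the 17 COCO keypoint names, truncated for brevity.
COCO_KEYPOINTS = ["nose", "left_eye", "right_eye"]

def visible_keypoints(kpts, threshold=0.5):
    """Return (name, x, y) for keypoints confident enough to draw."""
    return [(name, x, y)
            for name, (x, y, conf) in zip(COCO_KEYPOINTS, kpts)
            if conf >= threshold]

person = [(640.0, 300.0, 0.98),   # nose, confident -> drawn
          (620.0, 280.0, 0.95),   # left_eye, confident -> drawn
          (660.0, 281.0, 0.20)]   # right_eye, occluded -> dropped
print(visible_keypoints(person))
```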

For IGX, examples/stereo_imx274_player.py shows an example with two independent pipelines, one for each camera on the dual-camera module. Accelerated networking is used to provide real time access to the pair of 4k image streams. Make sure that both network ports are connected between the IGX and the Holoscan sensor bridge unit.

$ python3 examples/stereo_imx274_player.py

This brings up a visualizer display with two frames, one for the left channel and the other for the right.

For AGX configurations, you can observe both cameras using a single network port:

$ python3 examples/linux_single_network_stereo_imx274_player.py

Applications wishing to map sensors to specific data channels can do so using the use_sensor API, which is demonstrated in these examples. The AGX network interface is limited to 10 Gbps, so stereo video is only supported in 1080p mode.
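A back-of-envelope bandwidth check makes the 1080p restriction plausible. The figures assume 10-bit raw Bayer at 60 frames per second and ignore protocol overhead, which only makes the 4K case worse:

```python
def raw_gbps(width, height, bits_per_pixel=10, fps=60):
    """Approximate raw sensor payload bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

stereo_4k = 2 * raw_gbps(3840, 2160)     # two 4K cameras
stereo_1080p = 2 * raw_gbps(1920, 1080)  # two 1080p cameras

# Stereo 4K nearly saturates a 10 Gbps link even before packet overhead,
# while stereo 1080p fits comfortably in a single 10 Gbps AGX port.
print(f"stereo 4K:    {stereo_4k:.1f} Gbps")
print(f"stereo 1080p: {stereo_1080p:.1f} Gbps")
```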

examples/gpio_example_app.py is a simple example of using the GPIO interface of the sensor bridge to set GPIO directions, read input values from GPIO pins, and write output values to GPIO pins. To run the application:

$ python3 examples/gpio_example_app.py

This brings up a textual display that cycles through different preset pin configurations, pausing between settings so that pin values can be measured or read back. Please refer to the application structure section to read more about the GPIO example application.
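A rough sketch of that cycle, using a stand-in `SensorBridgeGpio` class rather than the real hololink GPIO API (the method names here are illustrative assumptions):

```python
import itertools
import time

class SensorBridgeGpio:
    """Stand-in for the sensor bridge GPIO interface, for illustration only."""
    def __init__(self):
        self.direction = {}
        self.level = {}

    def set_direction(self, pin, direction):
        self.direction[pin] = direction  # "in" or "out"

    def write(self, pin, value):
        assert self.direction.get(pin) == "out", "pin must be an output"
        self.level[pin] = value

    def read(self, pin):
        return self.level.get(pin, 0)

gpio = SensorBridgeGpio()
gpio.set_direction(0, "out")
# Cycle pin 0 through preset values, pausing so levels could be probed.
for value in itertools.islice(itertools.cycle([1, 0]), 4):
    gpio.write(0, value)
    print("pin 0 ->", gpio.read(0))
    time.sleep(0.1)  # time window to measure or read back the pin
```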

examples/linux_hwisp_player.py shows the NVIDIA ISP unit processing Bayer frames captured live from the IMX274. The ISP unit is currently available on Jetson AGX Orin and on IGX Orin in the iGPU configuration.

Before starting the docker run, set up the nvargus-daemon with the flag enableRawReprocessing=1. This allows the ISP to process Bayer frames captured by the Holoscan sensor bridge unit; the change persists even through a restart. In the host system:

sudo su
pkill nvargus-daemon
export enableRawReprocessing=1
nvargus-daemon
exit

To run the example, within the demo container:

$ python3 examples/linux_hwisp_player.py

This will run the application with a visualizer display showing the live capture.

Note: if you wish to undo running the nvargus-daemon with the flag enableRawReprocessing=1, execute the following commands.

sudo su
pkill nvargus-daemon
unset enableRawReprocessing
nvargus-daemon
exit

For IGX systems, examples/imx274_latency.py shows how to use timestamps to profile the hardware and software pipeline. The example records timestamps received from the FPGA when data is acquired, along with timestamps measured on the host at various points in frame reception and pipeline execution. At the end of the run, the application provides a duration and latency report with average, minimum, and maximum values.

Before running the app, make sure PTP synchronization has been enabled on the setup, then use the following command to run the example.

$ python3 examples/imx274_latency.py
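The report boils down to simple statistics over timestamp pairs. A sketch, where the tuple layout and `latency_report` are illustrative assumptions (both clocks in seconds, PTP-synchronized):

```python
def latency_report(frames):
    """Summarize per-frame latency from (fpga_capture_s, host_done_s) pairs."""
    latencies_ms = [(done - captured) * 1e3 for captured, done in frames]
    return {"avg_ms": sum(latencies_ms) / len(latencies_ms),
            "min_ms": min(latencies_ms),
            "max_ms": max(latencies_ms)}

# Illustrative values: three frames captured ~16 ms apart, each finishing
# the pipeline roughly 20 ms after the FPGA acquired it.
frames = [(0.000, 0.021), (0.016, 0.039), (0.033, 0.052)]
print(latency_report(frames))
```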

© Copyright 2022-2024, NVIDIA. Last updated on Feb 4, 2025.