
Video Sources

The SDK supports USB cameras, RTSP streams, video files, and multiple simultaneous inputs. This page covers how to configure each.

Quickstart

source venv/bin/activate

# USB camera
./inference.py yolov5s-v7-coco usb:0

# Video file
./inference.py yolov5s-v7-coco media/traffic1_1080p.mp4

# RTSP stream
./inference.py yolov5s-v7-coco rtsp://192.168.1.100:8554/stream1

# Multiple sources
./inference.py yolov5s-v7-coco usb:0 usb:1 media/traffic1_1080p.mp4

Prerequisites

  • SDK installed and environment activated (see Install the SDK)
  • At least one input source available (camera, video file, or RTSP stream)
Before every session, activate the environment:

source venv/bin/activate

USB cameras

USB cameras are enumerated as /dev/video0, /dev/video1, and so on. Use the shorthand usb:0, usb:1 to select them; usb:N corresponds to /dev/videoN.

./inference.py yolov5s-v7-coco usb:0

Set resolution and framerate

Use the format usb:N:WIDTHxHEIGHT@FPS:

./inference.py yolov5s-v7-coco usb:0:1920x1080@30

Check supported formats

Install v4l-utils to inspect camera capabilities:

sudo apt-get install v4l-utils
v4l2-ctl --device /dev/video0 --list-formats-ext
Multiple sensors

Many USB cameras expose multiple sensors (RGB, infrared, etc.). The RGB sensor is almost always the first listed. If you have two cameras, the second camera's RGB sensor may be at usb:4 rather than usb:1, depending on how many sensors the first camera exposes.
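To see which index corresponds to which sensor, you can read the device names the kernel exposes under /sys/class/video4linux. This is a stdlib-only sketch using a standard Linux interface, not an SDK API; the function name is mine:

```python
import glob
import os

def list_video_devices():
    """Map /dev/videoN indices to the device names the kernel reports."""
    devices = {}
    for node in glob.glob("/sys/class/video4linux/video*"):
        index = int(os.path.basename(node)[len("video"):])
        with open(os.path.join(node, "name")) as f:
            devices[index] = f.read().strip()
    return devices

# On a dual-sensor camera this might show, e.g., an RGB sensor at index 0
# and an IR sensor at index 2 -- the RGB one is what usb:0 selects.
```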

Consistent ordering

Linux may enumerate USB cameras in a different order after each boot. To ensure consistent ordering, plug cameras in sequence after the system has booted.
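If you need an ordering that survives reboots, Linux also creates stable per-device symlinks under /dev/v4l/by-id. One way to recover the usb:N index for a known camera is to resolve those links; this is a stdlib sketch (the function name is mine, not part of the SDK):

```python
import os

def stable_device_map(by_id_dir="/dev/v4l/by-id"):
    """Resolve /dev/v4l/by-id symlinks to their /dev/videoN indices."""
    mapping = {}
    if not os.path.isdir(by_id_dir):
        return mapping  # no V4L devices present on this machine
    for name in sorted(os.listdir(by_id_dir)):
        target = os.path.realpath(os.path.join(by_id_dir, name))
        if target.startswith("/dev/video"):
            mapping[name] = int(target[len("/dev/video"):])
    return mapping
```

The by-id names encode the camera model and serial, so looking up a specific camera in this map gives the same index regardless of plug-in order.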

RTSP cameras

Use the RTSP URL directly:

./inference.py yolov5s-v7-coco rtsp://192.168.1.100:8554/stream1

The general format is rtsp://&lt;user&gt;:&lt;password&gt;@&lt;host&gt;:&lt;port&gt;/&lt;path&gt;. Omit user and password if the stream does not require authentication.

note

You must have an RTSP server running and accessible on your network before using this source type. The SDK connects to the stream as a client — it does not create an RTSP server. If you see a "Could not open resource" or timeout error, check that the server is running and the URL is correct.
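Before launching inference, you can check that the RTSP server is at least reachable with a plain TCP connect. This quick sketch (the function name is mine) only verifies that the port is open, not that the stream path or credentials are valid:

```python
import socket

def rtsp_reachable(host, port=554, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# rtsp_reachable("192.168.1.100", 8554) -> False means the server is down,
# unreachable, or the port is wrong; check those before debugging the URL path.
```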

Video files

Specify the file path:

./inference.py yolov5s-v7-coco media/traffic1_1080p.mp4

Throttle playback framerate

You can append @FPS to a video path to limit the playback rate:

./inference.py yolov5s-v7-coco media/traffic1_1080p.mp4@10

This throttles playback to 10 FPS using a wall-clock sleep per frame. It will not speed up playback beyond what the pipeline can deliver — if the pipeline is already slower than the requested FPS, the sleep is skipped. The timing is approximate (not GStreamer clock-synchronized).
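The throttling behaviour described above can be sketched as a simple wall-clock pacer: sleep only when the pipeline is ahead of the requested rate. This is an illustration of the technique, not the SDK's implementation (the class name is mine):

```python
import time

class FramePacer:
    """Cap a processing loop at a target FPS using wall-clock sleeps."""

    def __init__(self, fps):
        self.interval = 1.0 / fps
        self.next_due = time.monotonic()

    def wait(self):
        now = time.monotonic()
        if now < self.next_due:
            time.sleep(self.next_due - now)  # pipeline is ahead: throttle
        # If the pipeline is already slower than the target FPS, no sleep
        # happens and the schedule resets to "now" rather than accumulating lag.
        self.next_due = max(self.next_due + self.interval, time.monotonic())

# Usage sketch:
# pacer = FramePacer(10)
# for frame in frames:
#     pacer.wait()
#     process(frame)
```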

The SDK ships with sample videos in the media/ directory. If these are missing, download them:

deactivate
./install.sh --media
source venv/bin/activate

Validation dataset (dataset)

dataset is a special source that runs inference against the model's official validation dataset — the same data used to measure the accuracy figures in the Model Zoo. Use it to verify or reproduce published accuracy numbers.

./inference.py yolov5s-v7-coco dataset --no-display

The SDK downloads the validation data automatically if it is not already present. For most models (COCO, ImageNet) the download is automatic. A small number of datasets require manual download due to licensing restrictions — the SDK prints an error with the expected path and download instructions if this applies.

note

dataset loads individual images rather than a video stream, so the reported FPS will be lower than with a camera or video file. This is expected — the bottleneck is image loading, not the AIPU.

See Measure Accuracy for the full workflow.


Multiple sources

List multiple sources to run inference on all of them simultaneously:

./inference.py yolov5s-v7-coco usb:0 usb:1 usb:2
./inference.py yolov5s-v7-coco media/traffic1_1080p.mp4 media/traffic2_720p.mp4

You can mix source types:

./inference.py yolov5s-v7-coco usb:0 media/traffic1_1080p.mp4

Source reference

| Source type         | Syntax          | Example                     |
| ------------------- | --------------- | --------------------------- |
| USB camera          | usb:N           | usb:0                       |
| USB with resolution | usb:N:WxH@FPS   | usb:0:1920x1080@30          |
| RTSP stream         | rtsp://...      | rtsp://192.168.1.100:8554/1 |
| Video file          | path/to/file    | media/traffic1_1080p.mp4    |
| Validation dataset  | dataset         | dataset (see above)         |

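The syntax in the table above can be captured in a small parser. This sketch (the function name and returned keys are mine, not an SDK API) classifies a source string and extracts the optional resolution and FPS suffixes:

```python
import re

def parse_source(src):
    """Classify a source string per the source reference table."""
    if src == "dataset":
        return {"type": "dataset"}
    if src.startswith("rtsp://"):
        return {"type": "rtsp", "url": src}
    m = re.fullmatch(r"usb:(\d+)(?::(\d+)x(\d+)@(\d+))?", src)
    if m:
        out = {"type": "usb", "index": int(m.group(1))}
        if m.group(2):  # optional WxH@FPS suffix
            out.update(width=int(m.group(2)), height=int(m.group(3)),
                       fps=int(m.group(4)))
        return out
    # Anything else is treated as a file path, optionally with @FPS throttling.
    m = re.fullmatch(r"(.+)@(\d+)", src)
    if m:
        return {"type": "file", "path": m.group(1), "fps": int(m.group(2))}
    return {"type": "file", "path": src}
```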
Application integration

These same source strings work in the Python API. Pass them to InferenceStream exactly as you would to inference.py. See the inference.py reference for all available options.


Next steps