nemo_curator.utils.decoder_utils
API
Bases: enum.Enum
Policy for extracting frames from video content.
This enum defines different strategies for selecting frames from a video, including first frame, middle frame, last frame, or a sequence of frames.
Configuration for frame extraction parameters.
This class combines extraction policy and target frame rate into a single signature that can be used to identify and reproduce frame extraction settings.
Convert frame extraction signature to string format.
Returns: str
String representation of extraction policy and target FPS.
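As a concrete illustration, here is a minimal sketch of such a signature. The enum members, field names, and the exact string format (`"<policy>-<fps>"`) are assumptions for illustration, not the module's actual definitions:

```python
from dataclasses import dataclass
from enum import Enum


class FrameExtractionPolicy(Enum):
    # Hypothetical members, based on the policies described above.
    first = 0
    middle = 1
    last = 2
    sequence = 3


@dataclass(frozen=True)
class FrameExtractionSignature:
    extraction_policy: FrameExtractionPolicy
    target_fps: float

    def to_str(self) -> str:
        # Hypothetical string format combining policy and target FPS.
        return f"{self.extraction_policy.name}-{self.target_fps}"
```

A frozen dataclass keeps the signature hashable, so it can serve as a cache key identifying a particular extraction configuration.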
Bases: enum.Enum
Purpose for extracting frames from video content.
This enum defines different purposes for extracting frames from a video, including aesthetics and embeddings.
Bases: NamedTuple
Container for video frame dimensions.
This class stores the height and width of video frames as a named tuple.
Metadata for video content including dimensions, timing, and codec information.
This class stores essential video properties such as resolution, frame rate, duration, and encoding details.
Convert various input types into a binary stream for video processing.
This function handles different input types that could represent video data and converts them into a consistent BinaryIO interface that can be used for video processing operations.
Parameters:
The input video data, which can be one of:
- Path: A path to a video file
- bytes: Raw video data in bytes
- io.BytesIO: An in-memory binary stream
- io.BufferedReader: A buffered binary file reader
- BinaryIO: Any binary stream
Returns: BinaryIO
A binary stream containing the video data
Raises:
ValueError: If the input type is not one of the supported types
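A minimal sketch of this dispatch, assuming a hypothetical function name `to_binary_stream`; the real function's name and exact type checks may differ:

```python
import io
from pathlib import Path
from typing import BinaryIO, Union


def to_binary_stream(
    video: Union[Path, bytes, io.BytesIO, io.BufferedReader, BinaryIO],
) -> BinaryIO:
    """Normalize the supported input types to a single BinaryIO interface."""
    if isinstance(video, Path):
        return open(video, "rb")  # path on disk: open for binary reading
    if isinstance(video, bytes):
        return io.BytesIO(video)  # raw bytes: wrap in an in-memory stream
    if isinstance(video, (io.BytesIO, io.BufferedReader)):
        return video  # already a binary stream: pass through unchanged
    if hasattr(video, "read"):
        return video  # any other object exposing a binary read interface
    raise ValueError(f"Unsupported input type: {type(video)!r}")
```

Passing existing streams through unchanged means the caller retains responsibility for closing them; only `Path` inputs open a new file handle here.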
Decode video frames from a binary stream using PyAV with configurable frame rate sampling.
This function decodes video frames from a binary stream at a specified frame rate, which does not need to match the input video's frame rate; the video can be either supersampled or undersampled.
Parameters:
An open file, io.BytesIO, or bytes object with the video data.
Frame rate for sampling the video.
Optional array of presentation timestamps for each frame in the video. If supplied, this array must be monotonically increasing. If not supplied, timestamps will be extracted from the video stream.
Optional start timestamp for frame extraction. If None, the first frame timestamp is used.
Optional end timestamp for frame extraction. If None, the last frame timestamp is used.
If True, stop may be included as the last sample; if False, stop is excluded. Default is True.
PyAV index of the video stream to decode, usually 0.
Container format of the video stream, e.g. "mp4" or "mkv". If None, the format is auto-detected from the stream, which is usually the best choice.
Number of threads to use for decoding.
Returns: npt.NDArray[np.uint8]
A numpy array of shape (num_frames, height, width, channels) containing the decoded frames.
Raises:
ValueError: If the sampled timestamps differ from source timestamps by more than the specified tolerance
Decode video using PyAV frame ids.
It is not recommended to use this function directly. Instead, use
decode_video_cpu, which is timestamp-based. Timestamps are necessary for
synchronizing sensors, like multiple cameras, or synchronizing video with
GPS and LIDAR.
Parameters:
An open file, io.BytesIO, or bytes object with the video data.
List of frame ids to decode.
List of counts for each frame id. It is possible that a frame id is repeated during supersampling, which can happen in videos with frame drops, or just due to clock drift between sensors.
PyAV index of the video stream to decode, usually 0.
Container format of the video stream, e.g. "mp4" or "mkv". If None, the format is auto-detected from the stream, which is usually the best choice.
Number of threads to use for decoding.
Returns: npt.NDArray[np.uint8]
A numpy array of shape (frame_count, height, width, channels) containing the decoded frames.
Extract frames from a video into a numpy array.
Parameters:
An open file, io.BytesIO, or bytes object with the video data.
The policy for extracting frames.
Frame rate for sampling the video.
The target resolution for the frames.
PyAV index of the video stream to decode, usually 0.
Container format of the video stream, e.g. "mp4" or "mkv". If None, the format is auto-detected from the stream, which is usually the best choice.
Number of threads to use for decoding.
Returns: npt.NDArray[np.uint8]
A numpy array of shape (num_frames, height, width, 3) containing the decoded frames.
Extract metadata from a video file using ffprobe.
Parameters:
Path to video file or video data as bytes.
Returns: VideoMetadata
VideoMetadata object containing video properties.
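A hedged sketch of how such a probe might work: shell out to `ffprobe` with JSON output and parse the first video stream. The `SimpleVideoMetadata` class, function names, and field choices below are stand-ins for illustration, not the module's actual `VideoMetadata`:

```python
import json
import subprocess
from dataclasses import dataclass


@dataclass
class SimpleVideoMetadata:
    # Hypothetical stand-in for the module's VideoMetadata; field names are assumptions.
    width: int
    height: int
    fps: float
    duration_s: float
    codec: str


def parse_ffprobe_json(data: dict) -> SimpleVideoMetadata:
    """Extract basic properties from ffprobe's JSON output."""
    stream = next(s for s in data["streams"] if s["codec_type"] == "video")
    # avg_frame_rate is a rational string such as "30000/1001"
    num, den = stream["avg_frame_rate"].split("/")
    return SimpleVideoMetadata(
        width=int(stream["width"]),
        height=int(stream["height"]),
        fps=int(num) / int(den),
        duration_s=float(data["format"]["duration"]),
        codec=stream["codec_name"],
    )


def probe(path: str) -> SimpleVideoMetadata:
    """Run ffprobe on a file and parse the result (requires ffprobe on PATH)."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-show_format", path],
        capture_output=True, check=True,
    ).stdout
    return parse_ffprobe_json(json.loads(out))
```

Separating the subprocess call from the JSON parsing keeps the parsing logic testable without a video file or an ffmpeg installation.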
Find the closest indices in src to each element in dst.
If an element in dst is equidistant from two elements in src, the left index in src is used.
Parameters:
Monotonically increasing array of numbers to match dst against
Monotonically increasing array of numbers to search for in src
Returns: npt.NDArray[np.int32]
Array of closest indices in src for each element in dst
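The tie-to-the-left nearest-neighbour match can be implemented with `np.searchsorted`. This is a sketch under the assumption (stated in the parameters) that both arrays are monotonically increasing; the function name mirrors the description above but is not guaranteed to match the module's:

```python
import numpy as np


def find_closest_indices(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """For each element of dst, return the index of the closest element in src."""
    # Index of the first src element >= each dst element.
    right = np.clip(np.searchsorted(src, dst, side="left"), 0, len(src) - 1)
    left = np.clip(right - 1, 0, len(src) - 1)
    # Prefer the left neighbour when it is at least as close (ties go left).
    choose_left = (dst - src[left]) <= (src[right] - dst)
    return np.where(choose_left, left, right).astype(np.int32)
```

The `<=` comparison encodes the documented tie-breaking rule: an element equidistant from two `src` neighbours resolves to the left index.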
Get the average frame rate of a video.
Parameters:
An open file, io.BytesIO, or bytes object with the video data.
Index of the video stream to decode, usually 0.
Container format of the video stream, e.g. "mp4" or "mkv". If None, the format is auto-detected from the stream, which is usually the best choice.
Returns: float
The average frame rate of the video.
Get the total number of frames in a video file or stream.
Parameters:
An open file, io.BytesIO, or bytes object with the video data.
Index of the video stream to read from. Defaults to 0, which is typically the main video stream.
Container format of the video stream, e.g. "mp4" or "mkv". If None, the format is auto-detected from the stream, which is usually the best choice.
Returns: int
The total number of frames in the video stream.
Get timestamps for all frames in a video stream.
The file position will be moved as needed to get the timestamps.
Note: the order in which frames appear in a video stream is not necessarily the order in which they are displayed, so presentation timestamps are not necessarily monotonically increasing within the stream. This can happen when B-frames are present.
This function returns presentation timestamps in monotonically increasing order.
Parameters:
An open file, io.BytesIO, or bytes object with the video data.
PyAV index of the video stream to decode, usually 0.
Container format of the video stream, e.g. "mp4" or "mkv". If None, the format is auto-detected from the stream, which is usually the best choice.
Returns: npt.NDArray[np.float32]
A numpy array of monotonically increasing timestamps.
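To illustrate the note above: when B-frames are present, timestamps arrive in decode order, and sorting them recovers presentation order. The timestamp values below are made up for illustration:

```python
import numpy as np

# Presentation timestamps as they might appear in decode order when
# B-frames are present (hypothetical values): out of display order.
decode_order_pts = np.array(
    [0.0, 0.100, 0.033, 0.067, 0.200, 0.133, 0.167], dtype=np.float32
)

# Sorting yields the monotonically increasing presentation order.
presentation_order = np.sort(decode_order_pts)
```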
Sample src at sample_rate rate and return the closest indices.
This function is meant to be used for sampling monotonically increasing numbers, like timestamps. This function can be used for synchronizing sensors, like multiple cameras, or synchronizing video with GPS and LIDAR.
The first element sampled will be either src[0] or the element closest to start.
The last element sampled will be either src[-1] or the element closest to stop; it is only included if it both fits into the sampling rate and endpoint=True.
This function intentionally has no policy about distance from the closest elements in src to the sample elements. It will return the index of the closest element to the sample. It is up to the caller to define policy, which is why sample_elements is returned.
Parameters:
Monotonically increasing array of elements
Sampling rate
Start element (defaults to first element)
End element (defaults to last element)
If True, stop can be the last sample when it fits the sample rate exactly. If False, stop is never included in the output.
Whether to deduplicate indices. Repeated indices will be reflected in the returned counts array.
Returns: npt.NDArray[np.int32]
Tuple of (indices, counts), where counts[i] is the number of times indices[i] was selected as the closest match.
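A minimal numpy sketch of this sampler. The function name, the exact endpoint handling, and the three-element return (including the sampled elements mentioned above) are assumptions for illustration:

```python
import numpy as np


def sample_closest(src, sample_rate, start=None, stop=None,
                   endpoint=True, dedup=True):
    """Sample src at sample_rate and match each sample point to its closest index."""
    start = src[0] if start is None else start
    stop = src[-1] if stop is None else stop
    step = 1.0 / sample_rate
    # Number of full steps that fit between start and stop.
    n = int(np.floor((stop - start) / step + 1e-9))
    if not endpoint and np.isclose(start + n * step, stop):
        n -= 1  # drop the sample that would land exactly on stop
    sample_elements = start + step * np.arange(n + 1)
    # Nearest-neighbour match of each sample point into src (ties go left).
    idx = np.clip(np.searchsorted(src, sample_elements, side="left"),
                  0, len(src) - 1)
    left = np.clip(idx - 1, 0, len(src) - 1)
    idx = np.where((sample_elements - src[left]) <= (src[idx] - sample_elements),
                   left, idx)
    if dedup:
        indices, counts = np.unique(idx, return_counts=True)
    else:
        indices, counts = idx, np.ones_like(idx)
    return indices.astype(np.int32), counts.astype(np.int32), sample_elements
```

Supersampling shows up naturally here: when the sample rate exceeds the spacing of `src`, the same index is matched repeatedly, and `counts` records the multiplicity.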
Context manager that saves and restores stream position.
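A typical shape for such a context manager, using `tell()`/`seek()` to restore the position even if the body raises; the name `preserve_position` is hypothetical:

```python
import io
from contextlib import contextmanager


@contextmanager
def preserve_position(stream):
    """Save the stream position on entry and restore it on exit."""
    pos = stream.tell()
    try:
        yield stream
    finally:
        # Restore even if the body raised an exception.
        stream.seek(pos)
```

This lets helpers such as timestamp or frame-count probes scan through a stream without disturbing the caller's read position.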