nvidia.dali.fn.jitter
- nvidia.dali.fn.jitter(*inputs, **kwargs)
- Performs a random jitter augmentation. The output images are produced by moving each pixel by a random amount, in the x and y dimensions, bounded by half of the nDegree parameter.
- Supported backends:
  - 'gpu'
 
 - Parameters:
- input (TensorList ('HWC')) – Input to the operator. 
- Keyword Arguments:
- bytes_per_sample_hint (int or list of int, optional, default = [0]) – Output size hint, in bytes per sample. If specified, the operator’s outputs residing in GPU or page-locked host memory will be preallocated to accommodate a batch of samples of this size.
- fill_value (float, optional, default = 0.0) – Color value that is used for padding. 
- interp_type (nvidia.dali.types.DALIInterpType, optional, default = DALIInterpType.INTERP_NN) – Type of interpolation used.
- mask (int or TensorList of int, optional, default = 1) – Determines whether to apply this augmentation to the input image. Supported values:
  - 0: Do not apply this transformation.
  - 1: Apply this transformation.
 
- nDegree (int, optional, default = 2) – Each pixel is moved by a random amount in the [-nDegree/2, nDegree/2] range.
- preserve (bool, optional, default = False) – Prevents the operator from being removed from the graph even if its outputs are not used. 
- seed (int, optional, default = -1) – Random seed. If not provided, it will be populated based on the global seed of the pipeline.
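
The per-pixel displacement described above can be sketched with a plain NumPy re-implementation. This is illustrative only: the function name, the handling of out-of-bounds reads via fill_value, and the use of integer offsets are assumptions for the sketch, not DALI's actual GPU kernel.

```python
import numpy as np

def jitter(image, n_degree=2, fill_value=0, rng=None):
    """Illustrative sketch of jitter semantics (not DALI's kernel):
    each output pixel reads from a source location displaced by a
    random offset in [-n_degree/2, n_degree/2] along both x and y;
    reads that fall outside the image produce fill_value."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape[:2]
    half = n_degree // 2
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Independent integer offsets for every pixel, in both dimensions.
    dy = rng.integers(-half, half + 1, size=(h, w))
    dx = rng.integers(-half, half + 1, size=(h, w))
    src_y, src_x = ys + dy, xs + dx
    inside = (src_y >= 0) & (src_y < h) & (src_x >= 0) & (src_x < w)
    # Out-of-bounds sources are padded with fill_value.
    out = np.full_like(image, fill_value)
    out[inside] = image[np.clip(src_y, 0, h - 1),
                        np.clip(src_x, 0, w - 1)][inside]
    return out
```

With n_degree=0 every offset is zero, so the output equals the input; larger values of nDegree produce a stronger "shaky" distortion while keeping the image size unchanged.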