VPI - Vision Programming Interface

0.4.4 Release

Lens Distortion Correction


VPI provides functions that, together with the Remap algorithm, perform image rectification. The input image can have some level of distortion caused by the camera lens. The end result is an undistorted image that can optionally be reprojected into a second camera to allow, for instance, realignment of the input camera's optical axis. This makes it an important stage in certain stereo computer vision applications, such as depth estimation, where the two cameras must have their optical axes level and parallel.

The following types of distortion models are included:

  • Polynomial distortion - encompasses a broad set of common lens distortions, such as barrel, pincushion, a mix of these, etc.
  • Fisheye distortion - commonly found in fisheye lenses, can be seen as an exaggerated form of barrel distortion.

For other distortion models, users can always resort to creating their own output-to-input mapping and passing it directly to the Remap algorithm.

[Figure: distorted input image, correction parameters, and undistorted output. Input image copyright © 2012 by Michel Thoby, with permission from author.]

Correction parameters used:

  • projection: fisheye equidistant
  • focal length: 7.5mm (APS-C sensor)
  • k1: -0.126
  • k2: 0.004
  • k3: 0
  • k4: 0

The Lens Distortion Correction algorithm is implemented by warping the distorted input image into a rectified, undistorted output image. It does so by performing the inverse transformation; i.e., for every pixel \((u,v)\) in the destination image, calculate the corresponding coordinate \((\check{u},\check{v})\) in the input image.

  1. For each pixel \((u,v)\) in the destination image, calculate its corresponding 3D point \(\mathsf{P_{out}}\) in output camera space using the output camera's intrinsics matrix \(\mathsf{K_{out}}\).

    \[ \mathsf{P_{out}} = \mathsf{K_{out}}^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \]

  2. Transform the 3D point \(\mathsf{P_{out}}\) from output camera space to input camera space using the \([\mathsf{R}|\mathsf{t}]^{-1}\) matrix.

    \[ \mathsf{P_{in}} = \mathsf{R}^{-1}(\mathsf{P_{out}}-\mathsf{t}) \]

  3. Apply the lens distortion model \(L\) to the ideal (distortion-free) projected point \((\tilde{x},\tilde{y})\), expressed in focal-length units, resulting in the distorted point \((x_d,y_d)\). \(s\) is just a scale factor.

    \begin{align*} s \begin{bmatrix} \tilde{x} \\ \tilde{y} \\ 1 \end{bmatrix} &= \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \mathsf{P_{in}} \\ (x_d,y_d) &= L(\tilde{x}, \tilde{y}) \end{align*}

  4. Project the distorted point \((x_d,y_d)\) onto the input image space using the input camera's intrinsics matrix \(\mathsf{K_{in}}\), resulting in coordinate \((\check{u},\check{v})\). Again, \(s\) is just another scale factor.

    \[ s \begin{bmatrix} \check{u} \\ \check{v} \\ 1 \end{bmatrix} = \mathsf{K_{in}} \begin{bmatrix} x_d \\ y_d \\ 1 \end{bmatrix} \]

  5. Using the user-provided interpolator, sample the input image and assign the result to the corresponding output pixel.

    \[ \mathit{output}(u,v) \leftarrow S_{\mathsf{interp}}(\check{u},\check{v}) \]
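The five steps above can be sketched in plain C for a single pixel, assuming identity extrinsics and a one-coefficient radial polynomial distortion (all names here are illustrative, not part of the VPI API):

```c
#include <assert.h>
#include <math.h>

/* Illustrative pinhole intrinsics: focal lengths (fx,fy) and principal
   point (cx,cy), all in pixels. Not a VPI type. */
typedef struct { double fx, fy, cx, cy; } Intrinsics;

/* Steps 1-4 for a single output pixel (u,v), assuming identity
   extrinsics ([R|t] = [I|0]) and a one-coefficient radial polynomial
   distortion. Computes the input-image coordinate to sample from. */
static void map_pixel(Intrinsics Kout, Intrinsics Kin, double k1,
                      double u, double v, double *uin, double *vin)
{
    /* Step 1: unproject (u,v) with Kout^-1, taking z = 1. */
    double x = (u - Kout.cx) / Kout.fx;
    double y = (v - Kout.cy) / Kout.fy;

    /* Step 2: identity extrinsics, so Pin = Pout. */

    /* Step 3: apply the distortion model L to the ideal coordinates. */
    double r2 = x * x + y * y;
    double xd = x * (1 + k1 * r2);
    double yd = y * (1 + k1 * r2);

    /* Step 4: project the distorted point with Kin. */
    *uin = Kin.fx * xd + Kin.cx;
    *vin = Kin.fy * yd + Kin.cy;

    /* Step 5, sampling input(*uin,*vin) into output(u,v), is done by
       the remap stage with the chosen interpolator. */
}
```

With \(k_1 = 0\) the mapping is the identity; with a negative \(k_1\) (barrel distortion) the sampled coordinate moves toward the principal point.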

The interpolators supported are the same ones accepted by the Remap algorithm, such as nearest-neighbor, linear and Catmull-Rom cubic.

The equations above assume a Pinhole Camera Model. In the referenced diagram, the input camera is assumed to be aligned with the world coordinate frame, with origin at \(O = (0,0,0)\) and optical axis collinear with the world's \(Z_w\) axis. The output camera's origin is located at \(F_c\), with optical axis along \(Z_c\). Taken together, this makes the matrix \([\mathsf{R}|\mathsf{t}]\) transform points from the input camera's space into the output camera's.
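The change of coordinates in step 2 and its inverse can be sketched in plain C (illustrative, not VPI API); since \(\mathsf{R}\) is a rotation matrix, \(\mathsf{R}^{-1} = \mathsf{R}^T\):

```c
#include <assert.h>
#include <math.h>

/* Forward extrinsic transform: Pout = R*Pin + t. */
static void xform(const double R[3][3], const double t[3],
                  const double pin[3], double pout[3])
{
    for (int i = 0; i < 3; ++i)
        pout[i] = R[i][0] * pin[0] + R[i][1] * pin[1]
                + R[i][2] * pin[2] + t[i];
}

/* Inverse transform used in step 2: Pin = R^-1 * (Pout - t).
   For a rotation matrix, R^-1 is simply the transpose. */
static void xform_inv(const double R[3][3], const double t[3],
                      const double pout[3], double pin[3])
{
    for (int i = 0; i < 3; ++i)
        pin[i] = R[0][i] * (pout[0] - t[0])
               + R[1][i] * (pout[1] - t[1])
               + R[2][i] * (pout[2] - t[2]);
}
```

Applying the forward transform followed by the inverse recovers the original point, which is a quick way to validate an extrinsics setup.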

Lens Distortion Models

The equations above assume that projection is a linear operation. In reality, this is hardly the case: lens distortion makes straight lines in the real world appear bent in the captured image. To take this into account, the distortion model is applied to the ideal, distortion-free coordinates in input camera space that correspond to the output image pixel being rendered. The resulting coordinates are the actual position on the input image where the rendered output pixel is projected.

VPI comes with functions that handle both polynomial and fisheye distortion models. These models are characterized by distortion coefficients and, in the case of fisheye lenses, the mapping type. The coefficients are unique for each lens and can either be supplied by the manufacturer or estimated by a lens calibration process.

Polynomial Distortion Model

The polynomial distortion model, also known as the Brown-Conrady model, can represent a broad range of lens distortions, such as barrel, pincushion and mustache distortion.

VPI uses the structure VPIPolynomialLensDistortionModel to store the distortion parameters, which is eventually used by vpiWarpMapGenerateFromPolynomialLensDistortionModel to create a VPIWarpMap that undistorts the input image.

This distortion model is composed of radial and tangential distortion components:

\begin{align*} L(\tilde{x},\tilde{y}) &= L_r(\tilde{x},\tilde{y}) + L_t(\tilde{x},\tilde{y}) \end{align*}

Radial distortion is defined by parameters \(k_1,k_2,k_3,k_4,k_5\) and \(k_6\):

\begin{align*} L_r(\tilde{x},\tilde{y}) &= \frac{1+k_1r^2+k_2r^4+k_3r^6}{1+k_4r^2+k_5r^4+k_6r^6} \begin{bmatrix} \tilde{x} \\ \tilde{y} \end{bmatrix}\\ r^2 &= \tilde{x}^2 + \tilde{y}^2 \end{align*}

Tangential distortion is defined by parameters \(p_1\) and \(p_2\) and is due to imperfect centering of the lens components and other manufacturing defects.

\begin{align*} L_t(\tilde{x},\tilde{y}) &= \begin{bmatrix} 2p_1\tilde{x}\tilde{y} + p_2(r^2+2\tilde{x}^2) \\ p_1(r^2+2\tilde{y}^2) + 2p_2\tilde{x}\tilde{y} \end{bmatrix} \\ r^2 &= \tilde{x}^2+\tilde{y}^2 \end{align*}
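Putting the radial and tangential components together, the full model can be sketched in plain C (an illustrative sketch only; VPI stores these coefficients in VPIPolynomialLensDistortionModel):

```c
#include <assert.h>
#include <math.h>

/* Brown-Conrady distortion: applies L = Lr + Lt to the ideal point
   (x,y) in focal-length units. k[0..5] are k1..k6 in the text,
   p1 and p2 the tangential coefficients. */
static void polynomial_distort(double x, double y,
                               const double k[6], double p1, double p2,
                               double *xd, double *yd)
{
    double r2 = x * x + y * y;
    double r4 = r2 * r2, r6 = r4 * r2;

    /* Radial term: rational polynomial in r^2. */
    double radial = (1 + k[0] * r2 + k[1] * r4 + k[2] * r6) /
                    (1 + k[3] * r2 + k[4] * r4 + k[5] * r6);

    /* Tangential term: models decentering of the lens elements. */
    double tx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x);
    double ty = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y;

    *xd = x * radial + tx;
    *yd = y * radial + ty;
}
```

With all coefficients zero the model is the identity; a negative \(k_1\) yields barrel distortion, a positive one pincushion.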

Fisheye Distortion Model

A fisheye lens is an extremely wide-angle lens that produces strong barrel distortion. One of its uses is to create wide panoramas.

VPI uses the structure VPIFisheyeLensDistortionModel to store the distortion parameters, which is eventually used by vpiWarpMapGenerateFromFisheyeLensDistortionModel to create a VPIWarpMap that undistorts the input image.

The distortion model is defined by a mapping function \(M_f(\theta)\) that depends on fisheye lens type, and coefficients \(k_1,k_2,k_3\) and \(k_4\) as follows:

\begin{align*} L(\tilde{x},\tilde{y}) &= \frac{r_d}{r} \begin{bmatrix} \tilde{x} \\ \tilde{y} \end{bmatrix} \\ r_d &= M_f(\theta_d) \\ \theta_d &= \theta(1+ k_1\theta^2 + k_2\theta^4 + k_3\theta^6 + k_4\theta^8) \\ \theta &= \arctan(r) \\ r &= \sqrt{\tilde{x}^2 + \tilde{y}^2} \end{align*}


  • \(\theta\) is the incident light angle with respect to camera's optical axis.
  • \(\theta_d\) is the distorted incident light angle, usually due to lens manufacturing defects.
  • \(r_d\) is the distance from principal point where the incident light is recorded on the image.

Fisheye lenses can be classified depending on the relationship between the angle of incident light and where it is recorded on the image, established by the mapping function \(M_f(\theta)\).

In these formulas, \(f=1\), since the projected \((\tilde{x},\tilde{y})\) coordinates are already expressed in focal-length units.

VPI supports the following mapping functions, each one with some desirable characteristics:

  • Equidistant: \(r_d = f\theta\); maintains angular distances.
  • Equisolid: \(r_d = 2f\sin(\theta/2)\); maintains surface relations (solid angles).
  • Orthographic: \(r_d = f\sin\theta\); maintains planar illuminance.
  • Stereographic: \(r_d = 2f\tan(\theta/2)\); maintains angles.


Usage

  1. Initialization phase
    1. Include the headers that define the lens distortion models and the Remap algorithm.
      #include <vpi/LensDistortionModels.h>
      #include <vpi/algo/Remap.h>
    2. Define the input image object.
      VPIImage input = /*...*/;
    3. Create the output image, which in this case has the same dimensions and format as the input.
      uint32_t width, height;
      vpiImageGetSize(input, &width, &height);

      VPIImageFormat type;
      vpiImageGetType(input, &type);

      VPIImage output;
      vpiImageCreate(width, height, type, 0, &output);
    4. Create a dense warp map for warping the distorted image into the corrected output.
      VPIWarpMap map;
      memset(&map, 0, sizeof(map));
      map.grid.numHorizRegions  = 1;
      map.grid.numVertRegions   = 1;
      map.grid.regionWidth[0]   = width;
      map.grid.regionHeight[0]  = height;
      map.grid.horizInterval[0] = 1;
      map.grid.vertInterval[0]  = 1;

      vpiWarpMapAllocData(&map);
    5. Define the fisheye lens distortion model with mapping type and distortion coefficients. The latter come from a lens calibration process.
      VPIFisheyeLensDistortionModel fisheye;
      memset(&fisheye, 0, sizeof(fisheye));
      fisheye.mapping = VPI_FISHEYE_EQUIDISTANT;
      fisheye.k1 = -0.126;
      fisheye.k2 = 0.004;
      fisheye.k3 = 0;
      fisheye.k4 = 0;
    6. Define the intrinsic and extrinsic camera parameters. The input image was captured by an APS-C sensor through a lens with 7.5mm focal length. The principal point is right at the image center. Finally, since this is a monocular setup, the extrinsic matrix is identity, meaning that the input and output cameras are at the same position with aligned optical axes.
      float sensorWidth = 22.2; // APS-C sensor width in mm
      float focalLength = 7.5;  // focal length in mm
      float f = focalLength * width / sensorWidth;

      VPICameraIntrinsic K =
      {
          { f, 0, width / 2.0 },
          { 0, f, height / 2.0 }
      };

      VPICameraExtrinsic X =
      {
          { 1, 0, 0, 0 },
          { 0, 1, 0, 0 },
          { 0, 0, 1, 0 }
      };
    7. Bake into the warp map the correction implied by the lens distortion model defined above. Being a monocular setup, the same intrinsics are passed for the input and output cameras.
      vpiWarpMapGenerateFromFisheyeLensDistortionModel(K, X, K, &fisheye, &map);
    8. Create a payload for the Remap algorithm that will perform the correction. The payload is created on the CUDA backend, which will eventually execute the algorithm.
      VPIPayload warp;
      vpiCreateRemap(VPI_BACKEND_CUDA, &map, &warp);
    9. Create the stream where the algorithm will be submitted for execution.
      VPIStream stream;
      vpiStreamCreate(0, &stream);
  2. Processing phase
    1. Submit the algorithm to the stream, along with all parameters. A Catmull-Rom cubic interpolator is used for maximum quality, and mapped pixels that fall outside the source image boundaries are considered black.
      vpiSubmitRemap(stream, warp, input, output, VPI_INTERP_CATMULL_ROM, VPI_BOUNDARY_COND_ZERO);
    2. Optionally, wait until the processing is done.
      vpiStreamSync(stream);
  3. Cleanup phase
    1. Free resources held by the stream, the payload, the warp map and the input and output images.
      vpiStreamDestroy(stream);
      vpiPayloadDestroy(warp);
      vpiWarpMapFreeData(&map);
      vpiImageDestroy(input);
      vpiImageDestroy(output);
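The millimeter-to-pixel conversion used in step 6 can be checked numerically. This is a sketch, assuming for illustration a 1920-pixel-wide image:

```c
#include <assert.h>
#include <math.h>

/* Converts a focal length in millimeters to pixels by scaling with the
   ratio between image width (pixels) and sensor width (millimeters). */
static double focal_length_px(double focal_mm, double sensor_mm,
                              double width_px)
{
    return focal_mm * width_px / sensor_mm;
}
```

For the 7.5mm lens on a 22.2mm-wide APS-C sensor and a 1920-pixel-wide image, this gives \(f \approx 648.6\) pixels.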

For a complete example, consult the sample application Fisheye Distortion Correction. It implements the whole process of rectifying images captured by a fisheye lens, including the calibration process.

For more details, consult the Lens Distortion Correction API reference.

Limitations and Constraints

Since processing is ultimately done by the Remap algorithm, lens distortion correction inherits its limitations and constraints.


Performance

The main loop of Lens Distortion Correction uses Remap, so performance is dominated by it. Refer to Remap's performance tables.
