VPI - Vision Programming Interface

0.4.4 Release

Fisheye Distortion Correction

Overview

This sample application performs a fisheye lens calibration using input images taken with the same camera/lens. Then it uses Remap and the calibration data to correct fisheye lens distortion of these images and save the result to disk. The mapping used for distortion correction is VPI_FISHEYE_EQUIDISTANT, which maps straight lines in the scene to straight lines in the corrected image.

This sample shows the following:

  • Creating and destroying a VPI stream.
  • Using OpenCV to estimate the intrinsic and distortion parameters of a fisheye lens given a set of calibration images.
  • Creating a VPIWarpMap and initializing it with vpiWarpMapGenerateFromFisheyeLensDistortionModel to correct the distortion caused by a fisheye lens.
  • Creating a pipeline that uses Convert Image Format to convert to/from NV12 and Remap to perform the lens distortion correction (see the condensed sketch after this list).
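
For orientation, here is a condensed sketch of the VPI calls the full sample below performs. It assumes the calibration step has already produced the camera intrinsic matrix K and the fisheye coefficients k1..k4; imgIn, imgOut, width and height are placeholder names used only for illustration, and error checking with CHECK_STATUS is omitted.

// Dense warp map covering the whole image, one control point per pixel.
VPIWarpMap map = {};
map.grid.numHorizRegions  = 1;
map.grid.numVertRegions   = 1;
map.grid.regionWidth[0]   = width;
map.grid.regionHeight[0]  = height;
map.grid.horizInterval[0] = 1;
map.grid.vertInterval[0]  = 1;
vpiWarpMapAllocData(&map);

// Fisheye model estimated by calibration; the equidistant mapping keeps straight lines straight.
VPIFisheyeLensDistortionModel fisheye = {};
fisheye.mapping = VPI_FISHEYE_EQUIDISTANT;
fisheye.k1 = k1; fisheye.k2 = k2; fisheye.k3 = k3; fisheye.k4 = k4;

VPICameraExtrinsic X = {}; // identity extrinsics
X[0][0] = X[1][1] = X[2][2] = 1;
vpiWarpMapGenerateFromFisheyeLensDistortionModel(K, X, K, &fisheye, &map);

// Run Remap on a CUDA stream to undistort imgIn into imgOut.
VPIStream stream;
vpiStreamCreate(VPI_BACKEND_CUDA, &stream);
VPIPayload remap;
vpiCreateRemap(VPI_BACKEND_CUDA, &map, &remap);
vpiSubmitRemap(stream, remap, imgIn, imgOut, VPI_INTERP_CATMULL_ROM, VPI_BOUNDARY_COND_ZERO);
vpiStreamSync(stream);

In the full sample, vpiSubmitConvertImageFormat converts the wrapped BGR input to NV12 before Remap and converts the result back to BGR afterwards.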

Lens Calibration

Lens calibration uses a set of images taken by the same camera/lens, each one showing a checkerboard pattern in a different position, so that, taken collectively, the checkerboard covers almost the entire field of view. The more images, the more accurate the calibration, but typically 10 to 15 images suffice.
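
In code, this calibration step boils down to OpenCV's findChessboardCorners followed by cv::fisheye::calibrate, as in the condensed sketch below. Here imageNames and vtxCount (the number of interior vertices, i.e. (W-1)x(H-1)) are assumed to be provided by the caller, and the optional cornerSubPix refinement and error handling are left out.

// Detect checkerboard interior vertices in each calibration image.
std::vector<std::vector<cv::Point2f>> corners2D;
cv::Size imgSize;
for (const std::string &name : imageNames)
{
    cv::Mat img = cv::imread(name);
    imgSize = img.size();
    std::vector<cv::Point2f> vtx;
    if (cv::findChessboardCorners(img, vtxCount, vtx,
                                  cv::CALIB_CB_ADAPTIVE_THRESH + cv::CALIB_CB_NORMALIZE_IMAGE))
        corners2D.push_back(vtx);
}

// Ideal 3D board points (unit squares, Z == 0), one copy per successfully detected image.
std::vector<cv::Point3f> board;
for (int i = 0; i < vtxCount.height; ++i)
    for (int j = 0; j < vtxCount.width; ++j)
        board.emplace_back(j, i, 0);
std::vector<std::vector<cv::Point3f>> corners3D(corners2D.size(), board);

// Estimate the intrinsics and the four fisheye coefficients.
cv::Matx33d camMatrix = cv::Matx33d::eye();
std::vector<double> coeffs(4);
cv::Mat rvecs, tvecs; // per-image extrinsics, not used afterwards
double rms = cv::fisheye::calibrate(corners3D, corners2D, imgSize, camMatrix, coeffs,
                                    rvecs, tvecs, cv::fisheye::CALIB_FIX_SKEW);

The returned rms value is the reprojection error in pixels; the closer to zero, the better the fit.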

Note
On Ubuntu 16.04, the sample code requires OpenCV >= 2.4.10, which isn't available using apt.

VPI samples include a set of input images that can be used. They are found in the /opt/nvidia/vpi-0.4/samples/assets/fisheye directory.

To create a set of calibration images for a given lens, do the following:

  1. Print a checkerboard pattern on a piece of paper. VPI provides a 10x7 checkerboard file, checkerboard_10x7.pdf, in the samples' assets directory.
  2. Mount the fisheye lens on a camera.
  3. With the camera in a fixed position, take several pictures showing the checkerboard in different positions, covering a good part of the field of view.

Instructions

The usage is:

./vpi_sample_11_fisheye -c W,H [-s win] <image1> [image2] [image3] ...

where

  • -c W,H: specifies the number of squares the checkerboard pattern has horizontally (W) and vertically (H).
  • -s win: (optional) the width of a window around each internal vertex of the checkerboard (the point where 4 squares meet), used in a vertex position refinement stage; the actual vertex position is searched for within this window (see the fragment after this list). If this parameter is omitted, the refinement stage is skipped.
  • imageN: the set of calibration images.
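
For reference, when -s win is given the sample passes cv::Size(win/2, win/2) to OpenCV's cornerSubPix on a grayscale copy of the image, essentially as in this fragment from the source below, where img is the calibration image, cbVertices holds the vertices found by findChessboardCorners, and searchWinSize is the value of -s:

cv::Mat gray;
cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);
cv::cornerSubPix(gray, cbVertices, cv::Size(searchWinSize / 2, searchWinSize / 2),
                 cv::Size(-1, -1),
                 cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.0001));
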
Note
Since currently only the PVA backend implements Remap, and only on the Jetson Xavier series, this sample can only be run on those devices.

Here's one invocation example:

./vpi_sample_11_fisheye -c 10,7 -s 22 ../assets/fisheye/*.jpg

This corrects the included set of calibration images, all captured using the checkerboard pattern that is also included. A 22x22 window around each internal checkerboard vertex is used to refine the vertex positions.

Results

Here are some input and output images produced by the sample application:

Input / Corrected image pairs

Source code

For convenience, here's the code that is also installed in the samples directory.

/*
 * Copyright (c) 2020, NVIDIA CORPORATION. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * * Redistributions of source code must retain the above copyright
 *   notice, this list of conditions and the following disclaimer.
 * * Redistributions in binary form must reproduce the above copyright
 *   notice, this list of conditions and the following disclaimer in the
 *   documentation and/or other materials provided with the distribution.
 * * Neither the name of NVIDIA CORPORATION nor the names of its
 *   contributors may be used to endorse or promote products derived
 *   from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
 * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
 * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
 * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
 * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
 * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

#include <opencv2/core/version.hpp>

#if CV_MAJOR_VERSION >= 3
#    include <opencv2/imgcodecs.hpp>
#else
#    include <opencv2/highgui/highgui.hpp>
#endif

#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/imgproc/imgproc.hpp>

#include <string.h> // for basename(3) that doesn't modify its argument
#include <unistd.h> // for getopt
#include <vpi/Context.h>
#include <vpi/Image.h>
#include <vpi/LensDistortionModels.h>
#include <vpi/Status.h>
#include <vpi/Stream.h>
#include <vpi/algo/ConvertImageFormat.h>
#include <vpi/algo/Remap.h>

#include <iostream>
#include <sstream>

#define CHECK_STATUS(STMT)                                    \
    do                                                        \
    {                                                         \
        VPIStatus status = (STMT);                            \
        if (status != VPI_SUCCESS)                            \
        {                                                     \
            char buffer[VPI_MAX_STATUS_MESSAGE_LENGTH];       \
            vpiGetLastStatusMessage(buffer, sizeof(buffer));  \
            std::ostringstream ss;                            \
            ss << vpiStatusGetName(status) << ": " << buffer; \
            throw std::runtime_error(ss.str());               \
        }                                                     \
    } while (0);

static void PrintUsage(const char *progname, std::ostream &out)
{
    out << "Usage: " << progname << " <-c W,H> [-s win] <image1> [image2] [image3] ...\n"
        << " where,\n"
        << " W,H\tcheckerboard with WxH squares\n"
        << " win\tsearch window width around checkerboard vertex used\n"
        << "\tin refinement, default is 0 (disable refinement)\n"
        << " imageN\tinput images taken with a fisheye lens camera" << std::endl;
}

struct Params
{
    cv::Size vtxCount;                // Number of internal vertices the checkerboard has
    int searchWinSize;                // Search window size around the checkerboard vertex for refinement.
    std::vector<const char *> images; // Input image names.
};

static Params ParseParameters(int argc, char *argv[])
{
    Params params = {};

    cv::Size cbSize;

    opterr = 0;
    int opt;
    while ((opt = getopt(argc, argv, "hc:s:")) != -1)
    {
        switch (opt)
        {
        case 'h':
            PrintUsage(basename(argv[0]), std::cout);
            return {};

        case 'c':
            if (sscanf(optarg, "%u,%u", &cbSize.width, &cbSize.height) != 2)
            {
                throw std::invalid_argument("Error parsing checkerboard information");
            }

            // OpenCV expects number of interior vertices in the checkerboard,
            // not number of squares. Let's adjust for that.
            params.vtxCount.width  = cbSize.width - 1;
            params.vtxCount.height = cbSize.height - 1;
            break;

        case 's':
            if (sscanf(optarg, "%d", &params.searchWinSize) != 1)
            {
                throw std::invalid_argument("Error parsing search window size");
            }
            if (params.searchWinSize < 0)
            {
                throw std::invalid_argument("Search window size must be >= 0");
            }
            break;
        case '?':
            throw std::invalid_argument(std::string("Option -") + (char)optopt + " not recognized");
        }
    }

    for (int i = optind; i < argc; ++i)
    {
        params.images.push_back(argv[i]);
    }

    if (params.images.empty())
    {
        throw std::invalid_argument("At least one image must be defined");
    }

    if (cbSize.width <= 3 || cbSize.height <= 3)
    {
        throw std::invalid_argument("Checkerboard size must have at least 3x3 squares");
    }

    if (params.searchWinSize == 1)
    {
        throw std::invalid_argument("Search window size must be 0 (default) or >= 2");
    }

    return params;
}

int main(int argc, char *argv[])
{
    // We'll create all VPI objects under this context, so that
    // we don't have to track what objects to destroy. Just destroying
    // the context will destroy all objects.
    VPIContext ctx = 0;

    try
    {
        // First parse command line parameters
        Params params = ParseParameters(argc, argv);
        if (params.images.empty()) // user just wanted the help message?
        {
            return 0;
        }

        // Where to store checkerboard 2D corners of each input image.
        std::vector<std::vector<cv::Point2f>> corners2D;

        // Store image size. All input images must have the same size.
        cv::Size imgSize = {};

        for (unsigned i = 0; i < params.images.size(); ++i)
        {
            // Load input image and do some sanity checks
            cv::Mat img = cv::imread(params.images[i]);
            if (img.empty())
            {
                throw std::runtime_error("Can't read " + std::string(params.images[i]));
            }

            if (imgSize == cv::Size{})
            {
                imgSize = img.size();
            }
            else if (imgSize != img.size())
            {
                throw std::runtime_error("All images must have same size");
            }

            // Find the checkerboard pattern on the image, saving the 2D
            // coordinates of checkerboard vertices in cbVertices.
            // Vertex is the point where 4 squares (2 white and 2 black) meet.
            std::vector<cv::Point2f> cbVertices;

            if (findChessboardCorners(img, params.vtxCount, cbVertices,
                                      cv::CALIB_CB_ADAPTIVE_THRESH + cv::CALIB_CB_NORMALIZE_IMAGE))
            {
                // Need to perform further corner refinement?
                if (params.searchWinSize >= 2)
                {
                    cv::Mat gray;
                    cvtColor(img, gray, cv::COLOR_BGR2GRAY);

                    cornerSubPix(gray, cbVertices, cv::Size(params.searchWinSize / 2, params.searchWinSize / 2),
                                 cv::Size(-1, -1),
                                 cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.0001));
                }

                // Save this image's 2D vertices in the vector
                corners2D.push_back(std::move(cbVertices));
            }
            else
            {
                std::cerr << "Warning: checkerboard pattern not found in image " << params.images[i] << std::endl;
            }
        }

        // Create the vector that stores 3D coordinates for each checkerboard pattern on a space
        // where X and Y are orthogonal and run along the checkerboard sides, and Z==0 in all points on
        // the checkerboard.
        std::vector<cv::Point3f> initialCheckerboard3DVertices;
        for (int i = 0; i < params.vtxCount.height; ++i)
        {
            for (int j = 0; j < params.vtxCount.width; ++j)
            {
                // Since we're not interested in extrinsic camera parameters,
                // we can assume that checkerboard square size is 1x1.
                initialCheckerboard3DVertices.emplace_back(j, i, 0);
            }
        }

        // Initialize a vector with initial checkerboard positions for all images
        std::vector<std::vector<cv::Point3f>> corners3D(corners2D.size(), initialCheckerboard3DVertices);

        // Camera intrinsic parameters, initially identity (will be estimated by the calibration process).
        using Mat3     = cv::Matx<double, 3, 3>;
        Mat3 camMatrix = Mat3::eye();

        // Stores the fisheye model coefficients.
        std::vector<double> coeffs(4);

        // VPI currently doesn't support the skew parameter on the camera matrix, make sure
        // the calibration process fixes it to 0.
        int flags = cv::fisheye::CALIB_FIX_SKEW;

        // Run calibration
        {
            cv::Mat rvecs, tvecs; // stores rotation and translation for each camera, not needed now.
            double rms = cv::fisheye::calibrate(corners3D, corners2D, imgSize, camMatrix, coeffs, rvecs, tvecs, flags);
            printf("rms error: %lf\n", rms);
        }

        // Output calibration result.
        printf("Fisheye coefficients: %lf %lf %lf %lf\n", coeffs[0], coeffs[1], coeffs[2], coeffs[3]);

        printf("Camera matrix:\n");
        printf("[%lf %lf %lf; %lf %lf %lf; %lf %lf %lf]\n", camMatrix(0, 0), camMatrix(0, 1), camMatrix(0, 2),
               camMatrix(1, 0), camMatrix(1, 1), camMatrix(1, 2), camMatrix(2, 0), camMatrix(2, 1), camMatrix(2, 2));

        // Now use VPI to undistort the input images:

        // Allocate a dense map.
        VPIWarpMap map            = {};
        map.grid.numHorizRegions  = 1;
        map.grid.numVertRegions   = 1;
        map.grid.regionWidth[0]   = imgSize.width;
        map.grid.regionHeight[0]  = imgSize.height;
        map.grid.horizInterval[0] = 1;
        map.grid.vertInterval[0]  = 1;
        CHECK_STATUS(vpiWarpMapAllocData(&map));

        // Initialize the fisheye lens model with the coefficients given by the calibration procedure.
        VPIFisheyeLensDistortionModel distModel = {};
        distModel.mapping                       = VPI_FISHEYE_EQUIDISTANT;
        distModel.k1                            = coeffs[0];
        distModel.k2                            = coeffs[1];
        distModel.k3                            = coeffs[2];
        distModel.k4                            = coeffs[3];

        // Fill up the camera intrinsic parameters given by the camera calibration procedure.
        VPICameraIntrinsic K;
        for (int i = 0; i < 2; ++i)
        {
            for (int j = 0; j < 3; ++j)
            {
                K[i][j] = camMatrix(i, j);
            }
        }

        // Camera extrinsics is the identity.
        VPICameraExtrinsic X = {};
        X[0][0] = X[1][1] = X[2][2] = 1;

        // Generate a warp map to undistort an image taken from fisheye lens with
        // given parameters calculated above.
        vpiWarpMapGenerateFromFisheyeLensDistortionModel(K, X, K, &distModel, &map);

        // Create a VPI context to store all VPI objects we'll create.
        CHECK_STATUS(vpiContextCreate(0, &ctx));
        // Activate it. From now on all created objects will be owned by it.
        CHECK_STATUS(vpiContextSetCurrent(ctx));

        // Create a stream where operations will take place. We're using CUDA
        // processing.
        VPIStream stream;
        CHECK_STATUS(vpiStreamCreate(VPI_BACKEND_CUDA, &stream));

        // Create the Remap payload for undistortion given the map generated above.
        VPIPayload remap;
        CHECK_STATUS(vpiCreateRemap(VPI_BACKEND_CUDA, &map, &remap));

        // Temporary input and output images in NV12 format.
        VPIImage tmpIn;
        CHECK_STATUS(vpiImageCreate(imgSize.width, imgSize.height, VPI_IMAGE_FORMAT_NV12, 0, &tmpIn));

        VPIImage tmpOut;
        CHECK_STATUS(vpiImageCreate(imgSize.width, imgSize.height, VPI_IMAGE_FORMAT_NV12, 0, &tmpOut));

        VPIImage vimg = nullptr;

        // For each input image,
        for (unsigned i = 0; i < params.images.size(); ++i)
        {
            // Read it from disk.
            cv::Mat img = cv::imread(params.images[i]);
            assert(!img.empty());

            // Wrap it into a VPIImage
            {
                VPIImageData imgData;
                memset(&imgData, 0, sizeof(imgData));

                assert(img.type() == CV_8UC3);
                imgData.type = VPI_IMAGE_FORMAT_BGR8;

                // First fill VPIImageData with the, well, image data...
                imgData.numPlanes            = 1;
                imgData.planes[0].width      = img.cols;
                imgData.planes[0].height     = img.rows;
                imgData.planes[0].pitchBytes = img.step[0];
                imgData.planes[0].data       = img.data;

                if (vimg == nullptr)
                {
                    // Now create a VPIImage that wraps it.
                    CHECK_STATUS(vpiImageCreateHostMemWrapper(&imgData, 0, &vimg));
                }
                else
                {
                    CHECK_STATUS(vpiImageSetWrappedHostMem(vimg, &imgData));
                }
            }

            // Convert BGR -> NV12
            CHECK_STATUS(
                vpiSubmitConvertImageFormat(stream, VPI_BACKEND_CUDA, vimg, tmpIn, VPI_CONVERSION_CLAMP, 1, 0));

            // Undistort the input image.
            CHECK_STATUS(vpiSubmitRemap(stream, remap, tmpIn, tmpOut, VPI_INTERP_CATMULL_ROM, VPI_BOUNDARY_COND_ZERO));

            // Convert the NV12 result back to BGR, writing back to the input image.
            CHECK_STATUS(
                vpiSubmitConvertImageFormat(stream, VPI_BACKEND_CUDA, tmpOut, vimg, VPI_CONVERSION_CLAMP, 1, 0));

            // Wait until the conversion finishes.
            CHECK_STATUS(vpiStreamSync(stream));

            // Since vimg is wrapping the OpenCV image, the result is already there.
            // We just have to save it to disk.
            char buf[64];
            snprintf(buf, sizeof(buf), "undistort_%03d.jpg", i);
            imwrite(buf, img);
        }
    }
    catch (std::exception &e)
    {
        std::cerr << "Error: " << e.what() << std::endl;
        PrintUsage(basename(argv[0]), std::cerr);

        if (ctx != nullptr)
        {
            vpiContextDestroy(ctx);
        }
        return 1;
    }

    if (ctx != nullptr)
    {
        vpiContextDestroy(ctx);
    }
    return 0;
}