AprilTags is a popular form of fiducial tagging. It has a wide variety of applications in robotics, including object tracking, visual localization, SLAM accuracy evaluation, and human-robot interaction. Isaac provides real-time AprilTag detection by leveraging GPU acceleration while maintaining high decoding robustness.
In addition to detection, Isaac also performs tag pose estimation on all detected tags. We compute an estimate of the tag pose from the camera intrinsic parameters, the size of the tag, and the pixel coordinates of the tag corners, returning the rotation and translation of the tag with respect to the camera. Specifically, given the following:
Camera focal lengths in X and Y, in pixels (i.e., pixels per radian).
Camera principal point X and Y, in pixels from pixel(0,0) in the image.
Tag size W, where the tag is square W x W, in units convenient to the caller. Meters or centimeters are recommended.
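The inputs above map directly onto a standard pinhole camera model. As an illustrative sketch (the numeric values and function names below are assumptions, not Isaac API), the intrinsics form a 3x3 matrix and the tag size fixes the corner coordinates in the tag's own frame, using the corner ordering described later (upper left, upper right, lower right, lower left, with Y assumed to point down as in image coordinates):

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy):
    """Pinhole intrinsic matrix from focal lengths and principal point (pixels)."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def tag_frame_corners(tag_size):
    """Corners of a W x W tag in the tag's own frame, centered on the tag,
    ordered upper-left, upper-right, lower-right, lower-left (Y pointing down)."""
    h = tag_size / 2.0
    return np.array([[-h, -h, 0.0],
                     [ h, -h, 0.0],
                     [ h,  h, 0.0],
                     [-h,  h, 0.0]])

# Illustrative values only: a 600 px focal length, a VGA principal point,
# and a 16 cm tag measured in meters.
K = intrinsic_matrix(600.0, 600.0, 320.0, 240.0)
corners = tag_frame_corners(0.16)
```

Whatever unit is chosen for the tag size (meters above) is the unit in which the translation estimate is returned.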
The Isaac SDK returns the following:
Tag IDs for all detected AprilTags, in the format <tagFamily>_<tagID>. For example, for a tag from the tag36h11 family with tag ID 7, the string returned is tag36h11_7. In the 2018.3 release, only the tag36h11 tag family is supported. Extending the algorithm to other tag families is planned for a future release.
The observed pixel coordinates of the tag, starting with the upper left corner, followed by the upper right corner, the lower right corner, and then the lower left corner.
A quaternion representing the orientation of the tag with respect to the camera frame; and
A 3-vector indicating the location of the center of the tag with respect to the location of the camera, in the same units used for specifying the dimensions of the tag.
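The returned quaternion and translation together define a rigid transform from the tag frame to the camera frame. As a sketch (assuming a (w, x, y, z) quaternion convention, which the source does not specify), the pair can be assembled into a homogeneous matrix and used to map tag-frame points into the camera frame:

```python
import numpy as np

def quat_to_matrix(w, x, y, z):
    """Rotation matrix from a unit quaternion (w, x, y, z)."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def camera_T_tag(quaternion, translation):
    """Homogeneous transform taking tag-frame points into the camera frame."""
    T = np.eye(4)
    T[:3, :3] = quat_to_matrix(*quaternion)
    T[:3, 3] = translation
    return T

# Example pose: a 90-degree rotation about Z, one meter in front of the camera.
T = camera_T_tag((np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)), (0.0, 0.0, 1.0))
corner = T @ np.array([0.08, 0.0, 0.0, 1.0])  # a tag-frame corner in camera frame
```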
The coordinate system relative to the camera is:
X-axis to the right
A column-scanned rotation matrix, i.e., a list of the remappings of the X-, Y-, and Z-axes.
The precision of this estimate is inversely proportional to the distance to the tag.
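A back-of-the-envelope calculation shows why precision degrades with distance: a single pixel subtends roughly 1/fx radians, so one pixel of corner error corresponds to a lateral displacement of about d/fx at distance d, growing linearly with distance (the 600 px focal length below is an assumed illustrative value):

```python
fx = 600.0                   # assumed focal length, in pixels
distances = [1.0, 2.0, 4.0]  # tag distances, in meters
# One pixel subtends ~1/fx radians, so the metric error per pixel of
# corner noise grows linearly with distance:
errors = [d / fx for d in distances]
for d, e in zip(distances, errors):
    print(f"at {d:.0f} m: ~{e * 100:.3f} cm of lateral error per pixel")
```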
Isaac uses AprilTags detection code in the form of a static library.
AprilTags detection and pose estimation is wrapped as an Isaac codelet, and is available in the Isaac repository.
The Isaac codelet wrapping AprilTags detection takes an input image, and publishes a list of detected tags along with coordinates of the tag corners. It additionally uses the camera intrinsics, the input tag size, and the coordinates of the detected tags, to estimate the pose of those tags. Pose is represented by a quaternion and a translation vector, relative to the location of the camera.
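The codelet's pose-estimation step is internal to the SDK, but the underlying geometry can be sketched. Since the four tag corners lie on a plane, a standard approach (illustrative only, not Isaac's actual implementation) is to fit the homography mapping tag-frame corners to pixels with a direct linear transform, then decompose it against the intrinsics to obtain rotation and translation:

```python
import numpy as np

def pose_from_tag_corners(K, tag_size, pixels):
    """Recover tag rotation R and translation t (camera frame) from the four
    observed corner pixels, via a direct-linear-transform homography.
    `pixels` is 4x2, ordered upper-left, upper-right, lower-right, lower-left."""
    h = tag_size / 2.0
    plane = [(-h, -h), (h, -h), (h, h), (-h, h)]  # tag-frame X/Y of each corner
    A = []
    for (x, y), (u, v) in zip(plane, pixels):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    H = np.linalg.svd(np.array(A))[2][-1].reshape(3, 3)  # null vector of A
    B = np.linalg.solve(K, H)          # B ~ [r1 r2 t] up to scale and sign
    if B[2, 2] < 0:                    # pick the solution with the tag in front
        B = -B
    scale = np.linalg.norm(B[:, 0])
    r1, r2, t = B[:, 0] / scale, B[:, 1] / scale, B[:, 2] / scale
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)        # re-orthonormalize the rotation
    return U @ Vt, t

# Synthetic check: project the corners of a 16 cm tag at a known pose,
# then recover that pose from the pixel coordinates alone.
K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
ang = np.deg2rad(10.0)
R_true = np.array([[np.cos(ang), 0.0, np.sin(ang)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(ang), 0.0, np.cos(ang)]])
t_true = np.array([0.1, -0.05, 1.0])
corners = np.array([[-0.08, -0.08, 0.0], [0.08, -0.08, 0.0],
                    [0.08, 0.08, 0.0], [-0.08, 0.08, 0.0]])
proj = (corners @ R_true.T + t_true) @ K.T
pixels = proj[:, :2] / proj[:, 2:]
R_est, t_est = pose_from_tag_corners(K, 0.16, pixels)
```

With exact (noise-free) corners the recovered pose matches the synthetic one; in practice, corner noise makes the re-orthonormalization step necessary.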
The AprilTags sample application uses a Realsense stereo camera. First connect the camera to the host system or the Jetson platform you are using. Then use one of the following procedures to run the included sample application.
To Run the Sample Application on the Host System
Build the sample application with the following command:
bob@desktop:~/isaac$ bazel build //apps/samples/april_tags
Run the sample application with the following command:
bob@desktop:~/isaac$ bazel run //apps/samples/april_tags
To Run the Application on Jetson
Build a package on the host and then deploy it to the Jetson system.
Deploy //apps/samples/april_tags:april_tags-pkg to the robot as explained in Deploying and Running on Jetson.
Log on to the Jetson system and run the application with the following commands:
bob@jetson:~/$ cd deploy/bob/april_tags-pkg
bob@jetson:~/deploy/bob/april_tags-pkg$ ./apps/samples/april_tags/april_tags
Where “bob” is your user name on your host system.
To View Output from the Application in Websight
While the application is running, open Isaac Sight in a browser by navigating to http://localhost:3000. (If running the application on a Jetson platform, use the IP address of the Jetson system instead of localhost.)
In Websight, a window called Tags shows the input image with a green semi-transparent rectangle overlaid on top of detected AprilTags: