Manipulation Sample Applications

Simple Joint Control

This sample provides interactive joint control through a Jupyter notebook. It is a good starting point for working with the CompositeProto message used by manipulation components, including the LQR planner.
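A CompositeProto message pairs a schema of quantities (each identified by an entity name, a measure such as position or speed, and a dimension) with a flat tensor of values. The following is a rough, self-contained sketch of that layout in plain Python and numpy, not the actual Isaac SDK API:

```python
import numpy as np

# Hypothetical stand-in for a CompositeProto: a schema listing one
# (entity, measure, dimension) quantity per joint, plus a flat values array.
quantities = [(f"joint{i}", "position", 1) for i in range(6)]
values = np.array([0.0, -1.2, 1.5, 0.3, 0.0, 0.7])  # radians, one per joint

def get_quantity(quantities, values, entity, measure):
    """Look up a value by (entity, measure) in the schema.

    Walks the schema, accumulating the offset of each quantity into the
    flat values tensor, and returns the matching slice.
    """
    offset = 0
    for name, kind, dim in quantities:
        if name == entity and kind == measure:
            return values[offset:offset + dim]
        offset += dim
    raise KeyError((entity, measure))

print(get_quantity(quantities, values, "joint2", "position"))  # -> [1.5]
```

The same schema-plus-tensor shape also covers stacked states (for example, batches of waypoints), which is why the manipulation samples use it for both commands and paths.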

In the Isaac SDK repository, run the simple_joint_control Jupyter notebook app:


bob@desktop:~/isaac/sdk$ bazel run apps/samples/manipulation:simple_joint_control


Your web browser should open the Jupyter notebook document automatically. If it does not, look for a link in the console output: it will look like http://localhost:8888/notebooks/simple_joint_control.ipynb. Open that link in your browser.

This sample has two parts. The first part, UR10 in Omniverse, controls a simulated UR10 arm in Isaac Sim in NVIDIA Omniverse™. The second part, Kinova Jaco Hardware, controls the Kinova generation 2 robotic arm hardware.

  • UR10 in Omniverse: Follow the Isaac Sim built on NVIDIA Omniverse™ documentation to start the simulator and open the stage at omni:/Isaac/Samples/Isaac_SDK/Scenario/ur10_basic.usd. Start the simulation and the Robot Engine Bridge.

    In the Jupyter notebook, follow the cells to start the SDK application using either the LQR or the RMP planner. Once the application is connected to the simulator, you can use the sliders to move individual joints of the UR10 arm in simulation. The joint commands that reach the target joint angles are computed by the multi_joint_lqr_control or multi_joint_rmp_control subgraph, respectively.

simple_joint_control1.png

Note

The UR10 model has self-collision disabled by default in simulation. To enable self-collision, select ur10 in the Stage and, in the Property tab under Articulation Root, check the Enabled Self Collision box. Note that the LQR planner does not avoid self-collisions.

  • Kinova Jaco Hardware: Follow the instructions in the notebook to install the Kinova API and connect the arm to the workstation. Follow the cells to start the SDK application, and use the sliders to move individual joints on the Kinova arm.
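In both parts of the notebook, the sliders simply publish per-joint target angles; the planner subgraph then streams commands that drive the arm toward those targets. The shape of that control loop can be illustrated with a naive bounded step, which stands in for (but is not) the actual LQR or RMP computation:

```python
def step_toward(current, target, max_step=0.05):
    """Move each joint at most max_step radians toward its target.
    (Illustration only: the real subgraphs compute LQR/RMP commands.)"""
    out = []
    for c, t in zip(current, target):
        delta = max(-max_step, min(max_step, t - c))
        out.append(c + delta)
    return out

# Drive a 6-joint state toward hypothetical slider targets, one tick at a time.
state = [0.0] * 6
target = [0.3, -0.2, 0.1, 0.0, 0.05, -0.05]
for _ in range(10):
    state = step_toward(state, target)
print([round(s, 2) for s in state])  # -> [0.3, -0.2, 0.1, 0.0, 0.05, -0.05]
```

Because each tick bounds the per-joint motion, changing a slider mid-motion simply retargets the remaining steps, which matches the interactive feel of the notebook.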

Shuffle Box

This sample shows how to move the robotic arm through a set of predefined joint-angle waypoints and how to control the end effector (in this case, a gripper). The CompositeMetric and CompositeAtlas documentation explains in detail how to make the arm follow a predefined path.
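Setting the cask machinery aside, a waypoint path can be pictured simply as named joint-angle vectors with interpolation between them. The following is a minimal sketch with invented angles; the real app loads its waypoints from a CompositeAtlas cask and plans motions with LQR rather than linear interpolation:

```python
import numpy as np

# Hypothetical waypoints in joint space (radians); illustrative values only.
waypoints = {
    "home":    np.array([0.0, -1.0, 1.0, 0.0, 0.0, 0.0]),
    "pickup":  np.array([0.5, -0.8, 1.2, -0.4, 0.1, 0.0]),
    "dropoff": np.array([-0.5, -0.8, 1.2, -0.4, 0.1, 0.0]),
}

def linear_path(names, steps=10):
    """Yield joint configurations interpolating through the named waypoints."""
    for a, b in zip(names, names[1:]):
        for t in np.linspace(0.0, 1.0, steps, endpoint=False):
            yield (1 - t) * waypoints[a] + t * waypoints[b]
    yield waypoints[names[-1]]

# A pick-and-place cycle: home -> pickup -> dropoff -> home.
path = list(linear_path(["home", "pickup", "dropoff", "home"]))
print(len(path))  # -> 31 configurations (3 segments x 10 steps + endpoint)
```

Keying waypoints by name is also how the sample's pick and place poses stay readable in the app code, even though the joint vectors themselves are opaque numbers.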

Follow the Isaac Sim built on NVIDIA Omniverse™ documentation to start the simulator and open the stage at omni:/Isaac/Samples/Isaac_SDK/Scenario/sortbot_sim.usd. Start the simulation and the Robot Engine Bridge.

In the Isaac SDK repository, run the following:


bob@desktop:~/isaac/sdk$ bazel run apps/samples/manipulation:shuffle_box


The UR10 arm should repeatedly pick up a pink box from one dolly and drop it off on the other side. Because the pick-up and drop-off waypoints are hard-coded joint angles while the box position shifts slightly over time, after a number of iterations the arm can no longer pick up the box due to misalignment.

shuffle_box_perspective1.jpg

The sample app also includes a pre-trained 3D pose estimation model for the pink box. The perception output can be visualized in Sight:

  1. In the simulator, use the camera menu next to the Settings icon to switch the view from Perspective to Camera/Camera.
  2. Open Sight at http://localhost:3000. You should see the CAD model of the box, with the detected 3D pose, overlaid on the RGB camera image.
shuffle_box_detection1.jpg

© Copyright 2018-2020, NVIDIA Corporation. Last updated on Oct 30, 2023.