Troubleshooting and FAQs

How can I access ``nvcr.io``?

nvcr.io is NVIDIA’s Docker container registry. You can access it with the NGC credentials you received from your NVIDIA representative. First, follow these instructions to install Docker on your system. The instructions describe multiple ways to install Docker; we recommend installing Docker using the apt repository.

Please make sure to also follow the optional post-installation steps here so that you can run Docker without sudo.
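As a rough sketch, those optional post-installation steps typically look like the following (the docker group name is the standard Docker default; adjust to your setup if it differs):

sudo groupadd docker            # the group may already exist
sudo usermod -aG docker $USER   # add your user to the docker group
newgrp docker                   # or log out and back in for the change to take effect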

Next, follow the instructions here to access nvcr.io.
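As a minimal sketch, logging in to nvcr.io from the command line uses your NGC API key as the password with the ``$oauthtoken`` username (the key itself comes from your NGC account):

docker login nvcr.io
# Username: $oauthtoken
# Password: <your NGC API key>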

What is ``<your_staging_area>``?

<your_staging_area> is the NGC Registry staging area code that has been assigned to you by your NVIDIA representative. You need this code to access any Docker containers or other assets stored on NGC. If you do not have an NGC staging area code, please contact your NVIDIA representative.

Jetson Won’t Connect to the Monitor Unless Ethernet Is Connected

This is a known bug. We recommend keeping the Carter robot connected to Ethernet during the initial software setup procedure described in the documentation.

I Got Disconnected from Isaac Sight during Data Recording

  • You just got disconnected from the recorder app with the message “You have been disconnected: Maximum number of connections already reached”. What should you do?

  • The robot is designed to have only one connection at a time. So, it might be the case that someone else tried to use the same Nova Carter robot as you.

  • To disconnect other users, right-click the NVIDIA logo in the corner and select “Disconnect Others” to regain access to the UI.

    [Image: the “Disconnect Others” option in the Sight NVIDIA logo context menu]

While Recording, Is the Data Divided into Multiple POD Files?

Data recordings are split every 8 minutes to keep file sizes manageable. This also prevents losing large amounts of data due to file corruption. The maximum recording length can be overridden with the following command line flag:


--param=recorder.pod_recorder/pod_recorder/max_duration=<value>

Note

max_duration is specified in seconds.
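For example, to split recordings every 10 minutes instead of the default, the flag could be set as follows (the value 600 is just an illustration):

--param=recorder.pod_recorder/pod_recorder/max_duration=600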

What’s the Difference between COMPLETED and Ready (in the Recorder App)?

[Image: the recorder app showing the COMPLETED and Ready upload states]

“COMPLETED” means the file uploaded successfully.

“Ready” means the file is ready to be uploaded.

Does the Recorder App need to be running for data validation?

No: once data has been collected and uploaded for data validation, the recorder app can be closed and Carter can be powered off or used for other tasks.

I am seeing a lot of data loss from the lidar

This can occur if there is not enough space left on the main drive. Check the remaining space with df -h /dev/mmcblk0p1 and remove unnecessary files if Use% is at or above 99%.
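A minimal check might look like the following; the device name matches the command above, and the output values are purely illustrative:

df -h /dev/mmcblk0p1
# Filesystem      Size  Used Avail Use% Mounted on
# /dev/mmcblk0p1   59G   58G  600M  99% /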

If the issue persists, try opening Sight in an Incognito window.

My robot is not going to the location I intend.

By default, the origin is in the top left, with X referencing rows and Y referencing columns. Mission Control by default assumes the world frame is identical to the map frame. This can be adjusted in the Waypoint Graph Generator configuration via translation and rotation.

How do I perform per-service debugging?

Refer to the per-service debugging pages in this documentation.

The routes aren’t what I expect.

  1. Load the Waypoint Graph Generator FastAPI docs (http://localhost:8000/v1/docs)

  2. Use the GET /graph/visualize endpoint with the map_id from your defaults.yaml file.

cuOpt will then optimize the overall route based on nodes in the graph.
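The same visualization endpoint can also be called directly from the command line. The query-parameter form below is an assumption, so prefer the interactive docs page if your deployment differs:

curl -X GET "http://localhost:8000/v1/graph/visualize?map_id=<your_map_id>"   # map_id comes from defaults.yaml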

After aborting a mission, I can no longer submit missions to that robot.

If a mission is started and never completed, your mission or robot may end up in an unrecoverable state. Depending on the expiry, this may not resolve itself. You can refer to the Mission Dispatch FastAPI docs (http://localhost:5002/docs) to delete a mission in progress, or purge the Postgres instance with the following command:


docker container rm external-postgres-1
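If the container is still running, removal may fail; a hedged variant stops it first (the container name matches the command above):

docker container stop external-postgres-1   # stop the running container first
docker container rm external-postgres-1     # then remove it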


Can I turn on debug mode?

In bringup_isaac_cloud.yaml, you can enable verbose Mission Control logging by changing the log level from INFO to DEBUG. This can be helpful when debugging certain issues.

Navigational commands are out of bounds

A common issue is giving a target location that is a non-navigable surface (e.g. an X/Y in the middle of a wall). This will result in a mission failure.

I am unsure whether my robot is connected.

In the Mission Dispatch FastAPI page (http://localhost:5002/docs), in the robot section, select “GET /robot”, “Try it out”, and “Execute”. If you don’t see your robot there, either Mission Control did not create the robot because the name is different, or your robot is not communicating with the MQTT channel.
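The same check can be scripted against the endpoint named above; this is a minimal sketch and the exact response schema is not shown here:

curl -X GET "http://localhost:5002/robot" -H "accept: application/json"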

I encounter the error message ``”AuthenticationService disabled by empty vehicle token”``

Ensure that you set the vehicle_token. Additionally, if you come across the error “security requirements failed: error getting JWT from Authorization header: InvalidArgument”, verify that your vehicle_token is correct.

I encounter an ``Address in Use`` Error with the Mosquitto Daemon.

If you encounter the error message tutorials-mosquitto-1 Error: Address in use, it is likely because another Mosquitto daemon is already running. To resolve this issue, stop the conflicting daemon using the following commands:


ps aux | grep mosquitto
sudo systemctl stop mosquitto   # mosquitto service name might be different
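Afterwards, you can confirm that nothing is still bound to the MQTT port; the port number below assumes the Mosquitto default of 1883:

sudo lsof -i :1883   # should print nothing once the conflicting daemon is stopped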

I encounter the error message ``non-navigable surface``

If you come across this error message, follow these steps to resolve it:

  1. Verify the resolution in the defaults.yaml file to ensure it is correct. The resolution is measured in pixels per meter and should align with the occupancy map you are using.

  2. Make sure that the mission’s designated point is not positioned on an obstruction. For occupancy maps, ensure it is within a non-occupied area, and for semantic maps, ensure it falls on a navigable surface.

  3. Double-check the accuracy of the translation and rotation values in the defaults.yaml file to ensure they are set correctly. The translation and rotation represent the transformation from the map frame to the world frame.
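A quick way to eyeball the values from the steps above is to print the relevant entries; the key names below (resolution, translation, rotation) are assumptions based on those steps, and the file location depends on your deployment:

grep -nE "resolution|translation|rotation" defaults.yaml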

How do I use the Navigation Stack in a custom map?

To use the Navigation Stack in a custom map, you need to pass the map information to the navigation stack using the following CLI args:


--omap-path=<path/to/occupancy-grid-map> \
--omap-cell-size=<cell-size-of-omap-in-meters> \
--semantic-map-path=<path/to/semantic-map>
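For illustration only, the same flags with hypothetical values might look like this (the paths and cell size are assumptions, not defaults):

# hypothetical example values; replace with your own map files and cell size
--omap-path=/maps/my_map.png \
--omap-cell-size=0.05 \
--semantic-map-path=/maps/my_semantic_map.json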

In a YAML configuration file, the same options would be as follows:


omap_path: <path/to/occupancy-grid-map>
omap_cell_size: <cell-size-of-omap-in-meters>
semantic_map_path: <path/to/semantic-map>

Isaac Sight won’t display anything

When using Isaac Sight, it is recommended to use the Google Chrome or Edge browser.
