Deployment Guide#

Route Optimization

Overview#

The route optimization workflow demonstrates how to use NVIDIA cuOpt as a managed service, with sample data and a Python thin client showing how to interact with the service and send requests to solve for routes.

This deployment guide walks through the process of accessing the service and running the workflow, including information on the components used and guidance for cases where further customization is required.

Note

This example is only for reference and should not be used in production deployments. Production implementations of these workflows should be customized and integrated with your Enterprise-grade infrastructure and software and should be deployed on platforms supported by NVIDIA AI Enterprise.

Note

Each user is responsible for checking the content and the applicable licenses of third-party software and determining if they are suitable for the intended use.
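Schematically, the request/response loop with the managed service looks like the sketch below. The endpoint URL, auth header, and payload contents are placeholders, not the real cuOpt service values; those come from the Quick Start Guide and the cuOpt service documentation.

```python
import requests  # installed later in this guide via requirements.txt

# Placeholder endpoint and credential, not real values.
CUOPT_URL = "https://<cuopt-service-host>/cuopt/request"
HEADERS = {"Authorization": "Bearer <your-api-key>"}

def solve_routes(problem_data: dict) -> dict:
    """POST a routing problem (cost matrix, tasks, vehicles) as JSON and
    return the solver's response containing the optimized routes."""
    resp = requests.post(CUOPT_URL, headers=HEADERS, json=problem_data)
    resp.raise_for_status()
    return resp.json()
```

The notebooks wrap this same pattern: build the problem dictionaries, send them as a JSON request, and parse the response.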

Data Pre-Processing

Upon importing the raw data, we need to build a cost matrix that represents the cost of traveling from one location in the optimization problem to another. In this VRP use case, the cost is the travel time between locations. cuOpt uses this matrix to assess the quality of a given solution as it seeks to minimize the total cost. The cost matrix is a square matrix whose dimension equals the number of locations in a given problem.

There are several external tools and APIs for building this cost matrix. In this workflow, a cost matrix is provided for you: it was built with Esri’s OD Cost Matrix tool and saved in a CSV file, which we import along with the three datasets mentioned above.

The rest of the data preprocessing consists of modeling the data as arrays of integers, which is the input format the cuOpt solver accepts. This includes encoding locations by assigning an index to each location, and converting time windows from UTC timestamps to epoch time. The data is then stored in dictionaries that are sent to the solver through JSON requests.
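A minimal sketch of this preprocessing is shown below. The location names, time windows, and cost values are made up for illustration, and the dictionary field names are illustrative rather than the exact cuOpt request schema.

```python
from datetime import datetime, timezone

# Hypothetical locations; the real workflow reads these from the imported CSVs.
locations = ["Depot", "Store A", "Store B"]

# Encode locations by assigning an integer index to each one.
location_ids = {name: idx for idx, name in enumerate(locations)}

def utc_to_epoch(ts: str) -> int:
    """Convert a 'YYYY-MM-DD HH:MM:SS' UTC timestamp to epoch seconds."""
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())

# Time windows as [earliest, latest] pairs of integers (epoch seconds).
time_windows = [
    [utc_to_epoch("2023-06-01 08:00:00"), utc_to_epoch("2023-06-01 18:00:00")]
    for _ in locations
]

# A square cost matrix: travel time between every pair of locations.
# In this workflow it is precomputed with Esri's OD Cost Matrix tool.
cost_matrix = [
    [0, 12, 18],
    [12, 0, 9],
    [18, 9, 0],
]

# Dictionary in the general shape sent to the solver as a JSON request
# (field names here are placeholders, not the exact cuOpt schema).
problem = {
    "cost_matrix": cost_matrix,
    "task_locations": list(location_ids.values()),
    "task_time_windows": time_windows,
}
```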

Route Mapping

After calling the cuOpt solver to get the optimized routes, the final steps are generating driving directions and mapping the routes. This can be done using a third-party tool; in this workflow we use the Open Source Routing Machine (OSRM). To do so, we parse the solver response, which includes the location indices along each route, convert those indices to coordinate points, and then map the routes with turn-by-turn driving directions from one stop to the next.
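A minimal sketch of these mapping steps follows. The coordinates and the example route are made up, the OSRM public demo server is used for illustration, and parsing the solver response in practice depends on the actual cuOpt response schema.

```python
# Hypothetical (longitude, latitude) pairs keyed by location index.
coordinates = {
    0: (-122.41, 37.77),
    1: (-122.39, 37.79),
    2: (-122.42, 37.80),
}

def route_to_points(route, coords):
    """Convert a solver route (a list of location indices) to coordinates."""
    return [coords[idx] for idx in route]

def osrm_route_url(points, server="https://router.project-osrm.org"):
    """Build an OSRM route-service URL for an ordered list of (lon, lat)
    points; OSRM returns the route geometry and turn-by-turn steps."""
    path = ";".join(f"{lon},{lat}" for lon, lat in points)
    return f"{server}/route/v1/driving/{path}?overview=full&steps=true"

# A route visiting locations 0 -> 2 -> 1, as might come back from the solver.
points = route_to_points([0, 2, 1], coordinates)
url = osrm_route_url(points)
# directions = requests.get(url).json()  # then plot with folium/polyline
```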

Environment Setup#

To run this workflow, make sure you have completed steps 1-4 in the Quick Start Guide for the cuOpt cloud service. Once these steps have been completed, proceed to the environment setup steps below.

  1. First, open a new terminal window and activate the Python virtual environment previously created in the Quick Start Guide.

    source cuopt-service-env/bin/activate
    
  2. Next, let’s install all the additional libraries and dependencies needed to run the sample notebooks.

  3. Create a file called ‘requirements.txt’ by running the following command, which creates and opens a new text file in your terminal window:

    nano requirements.txt
    
  4. Copy and paste the following content:

    jupyterlab
    pandas
    requests
    folium
    polyline
    scipy
    matplotlib
    wheel
    ipywidgets
    

    Important

    These are all the libraries and dependencies needed to run the workflow notebooks.

  5. Save and exit the file.

  6. To install everything listed in this file, run the following command:

    pip install -r requirements.txt
    

    Note

    This is equivalent to running a ‘pip install’ command for each requirement.
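If you prefer a non-interactive alternative to editing the file in nano, steps 3-5 can be collapsed into a single heredoc that writes the same package list:

```shell
# Write the dependency list to requirements.txt in one step.
cat > requirements.txt <<'EOF'
jupyterlab
pandas
requests
folium
polyline
scipy
matplotlib
wheel
ipywidgets
EOF
```

Then run `pip install -r requirements.txt` as in step 6.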

Run the Notebooks#

  1. Download the notebooks from the NGC Catalog

    • You can download them either directly through the browser or via the NGC CLI.

      Note

      • To use the NGC CLI, follow the setup instructions here

      • Setting up the CLI requires an NGC API key. To set up your API key, follow the instructions here.

  2. Unzip the downloaded file into the current working directory.

  3. Launch JupyterLab and run the notebooks. To do so, run the following command:

    jupyter lab --ip=0.0.0.0 --no-browser --allow-root --NotebookApp.token='' --NotebookApp.base_url='/notebook/'
    
  4. In your Jupyter environment, you will see the downloaded workflow directory.

  5. Enter the ‘route-optimization-service’ folder to see the sample notebooks and datasets. Follow the instructions in each notebook to go through the route optimization workflow for each use case.

    Note

    You can use the Shift + Enter key combination to execute a cell in the notebook.