Installation#

Follow these steps to clone the Cosmos Predict-2.5 repository from GitHub, install dependencies, download model checkpoints, and build and run the Docker container.

Note

Ensure you have the necessary hardware and software prerequisites before installation.

Setup Cosmos-Predict2.5#

  1. Clone the repository:

    git clone git@github.com:nvidia-cosmos/cosmos-predict2.5.git
    cd cosmos-predict2.5
    
  2. Install the uv package manager:

    curl -LsSf https://astral.sh/uv/install.sh | sh
    source $HOME/.local/bin/env
    
  3. Install the package into a new environment:

    uv sync
    source .venv/bin/activate
    

    Alternatively, you can install the package into the active environment (e.g. conda):

    uv sync --active --inexact
    
  4. Install optional dependencies:

    uv sync --all-groups
    
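After completing the steps above, you can optionally run a quick sanity check. The commands below are a minimal sketch that assumes PyTorch is installed as one of the project dependencies; they confirm that the interpreter comes from the new environment and that CUDA-capable GPUs are visible.

    # Confirm the active interpreter and GPU visibility (assumes PyTorch is a dependency)
    which python
    python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
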

Download Checkpoints#

  1. Get a Hugging Face Access Token with Read permissions.

  2. Install the Hugging Face CLI:

    uv tool install -U "huggingface_hub[cli]"
    
  3. Log in to Hugging Face with the access token:

    hf auth login
    
  4. Accept the NVIDIA Open Model License Agreement.

Checkpoints are automatically downloaded during inference and post-training. To modify the checkpoint cache location, set the HF_HOME environment variable.
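
For example, the snippet below points the cache at a custom directory (the path is only a placeholder). If you prefer non-interactive authentication (e.g. in CI), you can also export your access token via the HF_TOKEN environment variable, which huggingface_hub reads instead of the credentials saved by hf auth login.

    # Store downloaded checkpoints under a custom cache directory (placeholder path)
    export HF_HOME=/path/to/hf-cache

    # Optional: authenticate non-interactively with your access token
    export HF_TOKEN=<your_hf_access_token>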

Setup Docker#

Ensure you have access to Docker on your machine. To avoid running out of file descriptors when building the container, increase the limit using the --ulimit nofile option, as shown in the example below.

  • Example build command:

    docker build --ulimit nofile=131071:131071 -f Dockerfile . -t cosmos-predict-2.5
    
  • Example run command:

    docker run --gpus all --rm -v .:/workspace -v /workspace/.venv -it cosmos-predict-2.5
    
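In the run command, the anonymous volume at /workspace/.venv prevents the bind-mounted repository from shadowing the virtual environment created inside the image (assuming the Dockerfile places it at that path). As a quick check that GPUs are visible inside the container, you can run nvidia-smi; this assumes the NVIDIA Container Toolkit is installed (required for --gpus) and that the image does not define an entrypoint that overrides the command.

    # Verify GPU visibility inside the container
    docker run --gpus all --rm cosmos-predict-2.5 nvidia-smi
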

Next Steps#

If you haven’t done so already, determine which model you want to work with; refer to the Model Matrix page for more information.

You can then move on to the Predict2.5 Quickstart Guide.