Introduction

Discover NVIDIA DGX Cloud Lepton, an AI platform for development with endpoints, dev pods, batch jobs, and enterprise solutions.

DGX Cloud Lepton is a fully managed AI platform designed for developing, training, and deploying AI models. It offers production-grade performance, cost efficiency, comprehensive ML tooling, and flexible GPU options backed by enterprise SLAs.

Key Features and Benefits

  • Node Groups: Create node groups to manage compute resources and run workloads.
  • Endpoints: Deploy AI models as high-performance, scalable endpoints.
  • Dev Pods: Run interactive development sessions—including SSH, Jupyter notebooks, and VS Code—with managed GPUs in the cloud.
  • Batch Jobs: Run distributed training or batch processing jobs with high-performance interconnects and accelerated storage.
  • Compute: Use managed infrastructure or bring your own—and manage both with ease.
  • Bring Your Own Compute: Connect your own infrastructure to DGX Cloud Lepton and manage it through the same platform as managed resources.
  • Platform Features: Built-in storage, observability, and many configurable options.
  • Workspace: Configure your workspace settings, including members, tokens, secrets, registry, and more.
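As an illustrative sketch of how a deployed endpoint might be consumed over HTTP: the URL, model name, and token below are placeholders, and the OpenAI-compatible chat-completions request shape is an assumption for illustration, not a statement of the platform's API.

```python
import json

def build_chat_request(base_url, token, model, prompt):
    """Assemble the URL, headers, and JSON body for an
    OpenAI-compatible chat-completions request to a deployed
    endpoint. All inputs here are placeholders."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {token}",  # workspace/endpoint token
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Placeholder values only; not a real endpoint or token.
url, headers, body = build_chat_request(
    "https://example-endpoint.invalid", "MY_TOKEN", "my-model", "Hello!")
```

An actual client would then POST `body` to `url` with those headers, authenticating with a token generated in the workspace settings.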

Next Steps

Follow quick start guides for Workspace, Node Group, Endpoint, Dev Pod, and Batch Job—or explore the examples.

Copyright © 2025, NVIDIA Corporation.