# NVIDIA NIM for DiffDock

- DiffDock Overview
  - Training Dataset
- Release Notes
  - Release 2.2.0
    - Summary
    - Key Features
  - Release 2.1.0
    - Summary
  - Release 2.0.1
    - Summary
  - Release 1.2.0
    - Summary
  - Release 1.1.0
    - Summary
  - Release 1.0.0
    - Summary
- Support Matrix
  - Supported Hardware
    - Minimum System Hardware Requirements
    - Supported NVIDIA GPUs
  - Testing Locally Available Hardware
- Getting Started
  - Prerequisites
    - NGC (NVIDIA GPU Cloud) Account
    - NGC CLI Tool
  - Launch DiffDock NIM
  - Run Inference
  - Dump Generated Poses
  - Stopping the Container
- Configure NIM
  - View NIM Container Information
  - Pull the container image
    - Docker
    - NGC
  - Runtime Parameters for the Container
  - Run Multiple Instances on the Same Host
  - Model Checkpoint Caching
  - Logging
- NIM Telemetry
  - Benefits
  - Configuration
- API Reference
- Advanced Usage
  - Run Inference with Bash Script
  - Run Inference for Batch-Docking
  - Batch-Docking using SMILES
- Performance
  - Accuracy Benchmarks
  - Speed Benchmarks
  - Benchmark Hardware
  - Benchmark Data
- EULA