LLM Inference Quick Start Recipes

Optimized deployment guides for running the most popular open-source LLMs on NVIDIA hardware.