Get Customization Configuration Details
Get a customization configuration and its details.
Prerequisites
Before you can get a customization’s details, make sure that you have:
- Access to the NeMo Customizer service
- The CUSTOMIZER_BASE_URL environment variable set to your NeMo Customizer service endpoint:
export CUSTOMIZER_BASE_URL="https://your-customizer-service-url"
To Get Customization Configuration Details
Choose one of the following options to get a customization configuration and its details: the Python SDK or a direct REST API call with curl.

Python SDK
import os
from nemo_microservices import NeMoMicroservices
# Initialize the client
client = NeMoMicroservices(
    base_url=os.environ['CUSTOMIZER_BASE_URL']
)
# Get customization config details
config = client.customization.configs.retrieve(
    config_name="llama-3.1-8b-instruct@2.0",
    namespace="default"
)
print(f"Config name: {config.name}")
print(f"Description: {config.description}")
print(f"Target: {config.target}")
print(f"Training options: {len(config.training_options)}")
curl "${CUSTOMIZER_BASE_URL}/v1/customization/configs/default/llama-3.1-8b-instruct@2.0"
Example Response
{
  "id": "customization_config-MedVscVbr4pgLhLgKTLbv9",
  "name": "llama-3.1-8b-instruct@2.0",
  "namespace": "default",
  "description": "Configuration for training LLama 3.1 8B on A100 GPUs",
  "target": {
    "id": "customization_target-AbCdEfGhIjKlMnOpQrStUv",
    "name": "meta/llama-3.1-8b-instruct@2.0",
    "namespace": "default",
    "base_model": "meta/llama-3.1-8b-instruct",
    "enabled": true,
    "num_parameters": 8000000000,
    "precision": "bf16",
    "status": "ready"
  },
  "training_options": [
    {
      "training_type": "sft",
      "finetuning_type": "lora",
      "num_gpus": 2,
      "num_nodes": 1,
      "tensor_parallel_size": 1,
      "pipeline_parallel_size": 1,
      "use_sequence_parallel": false,
      "micro_batch_size": 1
    }
  ],
  "training_precision": "bf16",
  "max_seq_length": 2048,
  "pod_spec": {
    "node_selectors": {
      "nvidia.com/gpu.product": "NVIDIA-A100-80GB"
    },
    "annotations": {
      "sidecar.istio.io/inject": "false"
    },
    "tolerations": [
      {
        "key": "app",
        "operator": "Equal",
        "value": "a100-workload",
        "effect": "NoSchedule"
      }
    ]
  },
  "prompt_template": "{input} {output}",
  "chat_prompt_template": null,
  "dataset_schemas": [],
  "custom_fields": {},
  "project": null,
  "ownership": {},
  "created_at": "2024-01-15T10:30:00.000Z",
  "updated_at": "2024-01-15T10:30:00.000Z"
}
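Because the endpoint returns plain JSON, you can also work with the response outside the SDK. The sketch below uses the requests package against the same endpoint shown in the curl example; the namespace and config name are the illustrative values used throughout this page, and the fields it reads are the ones shown in the example response above.

import os

import requests

base_url = os.environ["CUSTOMIZER_BASE_URL"]
namespace = "default"
config_name = "llama-3.1-8b-instruct@2.0"

# Same endpoint as the curl example above.
response = requests.get(f"{base_url}/v1/customization/configs/{namespace}/{config_name}")
response.raise_for_status()
config = response.json()

# Check a few fields that matter when planning a training job.
print("Training precision:", config["training_precision"])
print("Max sequence length:", config["max_seq_length"])
print("Node selectors:", config.get("pod_spec", {}).get("node_selectors", {}))

# Count the options that use LoRA fine-tuning.
lora_options = [o for o in config["training_options"] if o["finetuning_type"] == "lora"]
print("LoRA training options available:", len(lora_options))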