# Create Job

## Prerequisites
Before you can create a customization job, make sure that you have:

- Obtained the base URL of your NeMo Customizer service.
- Retrieved the list of customization configurations and identified the configuration you want to use.
- Determined the hyperparameters you want to use for the customization job.
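For the second prerequisite, the available configurations can be fetched with a GET request. The sketch below uses only the Python standard library; the `/v1/customization/configs` path and the `configs_url`/`list_configs` helper names are assumptions for illustration, so check your service's API reference for the exact route.

```python
import json
import urllib.request


def configs_url(base_url: str) -> str:
    """Build the URL for listing customization configurations.

    The /v1/customization/configs path is an assumption inferred from the
    job-creation endpoint; verify it against your API reference.
    """
    return f"{base_url.rstrip('/')}/v1/customization/configs"


def list_configs(base_url: str) -> dict:
    # Perform the GET request and decode the JSON body.
    req = urllib.request.Request(
        configs_url(base_url),
        headers={"accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Scan the returned configurations for the `name` you want to pass as `config` when creating the job.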
## Create a Customization Job

### API
Perform a `POST` request to the `/v1/customization/jobs` endpoint.

```sh
curl -X POST \
  "https://${CUSTOMIZER_HOSTNAME}/v1/customization/jobs" \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -H 'wandb-api-key: <YOUR_WANDB_API_KEY>' \
  -d '{
    "name": "<NAME>",
    "description": "<DESCRIPTION>",
    "project": "<PROJECT_NAME>",
    "config": "<CONFIG_NAME>",
    "hyperparameters": {
      "finetuning_type": "lora",
      "training_type": "sft",
      "batch_size": 8,
      "epochs": 50,
      "learning_rate": 0.0001,
      "log_every_n_steps": 0,
      "val_check_interval": 0.01,
      "weight_decay": 0,
      "sft": {
        "hidden_dropout": 1,
        "attention_dropout": 1,
        "ffn_dropout": 1
      },
      "lora": {
        "adapter_dim": 8,
        "adapter_dropout": 1
      }
    },
    "output_model": "<OUTPUT_MODEL_NAME>",
    "dataset": "<DATASET_NAME>",
    "ownership": {
      "created_by": "",
      "access_policies": {}
    }
  }' | jq
```
Review the returned customization job.
Example Response

```json
{
  "id": "cust-JGTaMbJMdqjJU8WbQdN9Q2",
  "created_at": "2024-12-09T04:06:28.542884",
  "updated_at": "2024-12-09T04:06:28.542884",
  "config": {
    "schema_version": "1.0",
    "id": "af783f5b-d985-4e5b-bbb7-f9eec39cc0b1",
    "created_at": "2024-12-09T04:06:28.542657",
    "updated_at": "2024-12-09T04:06:28.569837",
    "custom_fields": {},
    "name": "meta/llama-3_1-8b-instruct",
    "base_model": "meta/llama-3_1-8b-instruct",
    "model_path": "llama-3_1-8b-instruct",
    "training_types": [],
    "finetuning_types": [
      "lora"
    ],
    "precision": "bf16",
    "num_gpus": 4,
    "num_nodes": 1,
    "micro_batch_size": 1,
    "tensor_parallel_size": 1,
    "max_seq_length": 4096
  },
  "dataset": {
    "schema_version": "1.0",
    "id": "dataset-XU4pvGzr5tvawnbVxeJMTb",
    "created_at": "2024-12-09T04:06:28.542657",
    "updated_at": "2024-12-09T04:06:28.542660",
    "custom_fields": {},
    "name": "default/sample-basic-test",
    "version_id": "main",
    "version_tags": []
  },
  "hyperparameters": {
    "finetuning_type": "lora",
    "training_type": "sft",
    "batch_size": 16,
    "epochs": 10,
    "learning_rate": 0.0001,
    "lora": {
      "adapter_dim": 16
    }
  },
  "output_model": "test-example-model@v1",
  "status": "created",
  "project": "test-project",
  "custom_fields": {},
  "ownership": {
    "created_by": "me",
    "access_policies": {
      "arbitrary": "json"
    }
  }
}
```
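The fields worth capturing programmatically are `id` (needed to refer to the job later) and `status`. A small sketch that pulls them out of a trimmed-down copy of the example response:

```python
import json

# Trimmed copy of the example response; only the fields used here.
response_text = """
{
  "id": "cust-JGTaMbJMdqjJU8WbQdN9Q2",
  "status": "created",
  "output_model": "test-example-model@v1"
}
"""

job = json.loads(response_text)
job_id, status = job["id"], job["status"]
print(f"{job_id}: {status}")  # → cust-JGTaMbJMdqjJU8WbQdN9Q2: created
```

A freshly created job reports `"status": "created"`; store `job_id` so you can check on the job as it progresses.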