NVIDIA Clara Train 4.1

Model Fine-tune

If you are using Clara Train to train your model, AIAA provides functionality to fine-tune your trained models based on new annotation samples that you create.

AIAA will take care of fine-tuning the model and re-load it back to AIAA for serving.

Note

This is an Admin API.

Attention

If you have only one GPU in your system, AIAA cannot serve inference through the segmentation/annotation APIs while fine-tuning is running.

Make sure you put new samples in /workspace/samples and have your MMAR in /workspace/mmars.

Hint

To see an example of an MMAR, check Medical Model Archive (MMAR).

Below is an example directory layout for the clara_pt_spleen_ct_segmentation model:

workspace/
    samples/
        clara_pt_spleen_ct_segmentation/
            options.conf    # optional
            dataset.json    # optional
            images/
                spleen_2.nii.gz
                spleen_3.nii.gz
                ...
            labels/
                spleen_2.nii.gz
                spleen_3.nii.gz
                ...
    mmars/
        clara_pt_spleen_ct_segmentation/
            commands/
                train_finetune.sh
            configs/
                ...
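As a rough sketch, new annotation samples could be staged into this layout as follows. The paths follow the example above; the source location and file names (spleen_42.nii.gz) are placeholders for illustration.

# Hypothetical staging of a new image/label pair into the AIAA samples folder.
# Adjust the model name, source directory, and file names to match your data.
MODEL=clara_pt_spleen_ct_segmentation
SAMPLES=/workspace/samples/$MODEL

mkdir -p "$SAMPLES/images" "$SAMPLES/labels"

# Image and label use the same file name, as in the example layout above.
cp /data/new_annotations/spleen_42.nii.gz       "$SAMPLES/images/spleen_42.nii.gz"
cp /data/new_annotations/spleen_42_label.nii.gz "$SAMPLES/labels/spleen_42.nii.gz"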

The URI to call is /admin/finetune/[model]. Note that you can pass all options that train.sh takes. There are two ways of passing those options: one is via curl, as demonstrated below.

# basic call
curl -X POST "http://127.0.0.1:$AIAA_PORT/admin/finetune/clara_pt_spleen_ct_segmentation"

# fine-tune for 5 epochs (%3D is the URL-encoded "=")
curl -X POST "http://127.0.0.1:$AIAA_PORT/admin/finetune/clara_pt_spleen_ct_segmentation?options=epochs%3D5"
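Once the fine-tuning run finishes, AIAA reloads the model for serving (see above). As a minimal sketch of how to confirm this, assuming the standard AIAA model-listing endpoint /v1/models is available on the same port:

# List the models currently loaded in AIAA and check that the
# fine-tuned model appears again after reloading.
curl -X GET "http://127.0.0.1:$AIAA_PORT/v1/models" | grep clara_pt_spleen_ct_segmentation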

The other is to edit the options.conf file. An example is shown below; put this file at /workspace/samples/[model]/options.conf.

epochs=5
learning_rate=0.00001
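If you prefer to create this file from the command line, here is a small sketch. The path follows the convention described above and the model name matches the earlier example; adjust both as needed.

# Write an options.conf for the model so the next fine-tune run picks it up.
MODEL=clara_pt_spleen_ct_segmentation
printf 'epochs=5\nlearning_rate=0.00001\n' > /workspace/samples/$MODEL/options.conf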

Hint

If you want to pick specific images for training and validation, you can provide your own data list at /workspace/samples/{model}/dataset.json, as sketched below.
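A minimal sketch of such a dataset.json, assuming the usual MMAR data-list format with training and validation sections; the file names are taken from the example layout above, but check your MMAR's own configs for the authoritative schema:

{
    "training": [
        {"image": "images/spleen_2.nii.gz", "label": "labels/spleen_2.nii.gz"}
    ],
    "validation": [
        {"image": "images/spleen_3.nii.gz", "label": "labels/spleen_3.nii.gz"}
    ]
}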

You can also enable automatic fine-tuning by adding the flag --fine_tune true when starting AIAA. AIAA will then run fine-tuning for all models every day at the hour specified by fine_tune_hour.
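As a sketch of a startup command with automatic fine-tuning enabled: only --fine_tune true comes from the text above; the launcher name start_aiaa.sh and the --workspace flag are assumptions, so check the AIAA startup documentation for your release.

# Start AIAA with automatic daily fine-tuning enabled.
# Launcher name and other flags may differ in your installation.
start_aiaa.sh --workspace /workspace --fine_tune true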
