Continuous Learning

AIAA supports a continuous-learning workflow. Users can load pretrained models into the AIAA server, annotate unlabeled images with the MONAI Label client, and then use the newly labeled data to train a better model.

How to run continuous learning

AIAA implements this feature through Clara Train's MMAR (Medical Model ARchive).

A valid MMAR for continuous learning should contain the following files:

ROOT
    config
        config_train.json
        config_aiaa.json
    commands
        prepare_dataset.sh
        train.sh
        train_multi_gpu.sh
        export.sh

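As a hypothetical example, the layout above can be scaffolded like this (the folder name `my_mmar` is a placeholder; the files are created empty and would hold your real configs and scripts):

```shell
# Scaffold an empty MMAR with the files required for continuous learning.
MMAR=my_mmar
mkdir -p "$MMAR/config" "$MMAR/commands"
touch "$MMAR/config/config_train.json" "$MMAR/config/config_aiaa.json"
touch "$MMAR/commands/prepare_dataset.sh" "$MMAR/commands/train.sh" \
      "$MMAR/commands/train_multi_gpu.sh" "$MMAR/commands/export.sh"
ls -R "$MMAR"
```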
Once the AIAA server is running, archive the MMAR folder and load it into the server with the following commands:

# archive the folder as a gzipped tarball
tar -zcvf mmar.tgz [/path/to/your/mmar/folder]

# load the model into AIAA server
curl -X PUT "http://127.0.0.1:$AIAA_PORT/admin/model/mmar_train" \
     -F "data=@mmar.tgz"

Training can then be started with a "POST" request, for example:

curl -X POST "http://127.0.0.1:$AIAA_PORT/admin/train/mmar_train"
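The pack, load, and train steps above can be collected into one script. This is a hedged sketch: the endpoint paths are the ones shown in this section, while the host, port, and MMAR folder are placeholders you would replace.

```shell
# Pack an MMAR, load it into AIAA, and start fine-tuning in one pass.
AIAA_HOST=127.0.0.1
AIAA_PORT="${AIAA_PORT:-5000}"
MODEL=mmar_train
MMAR_DIR=/path/to/your/mmar/folder

LOAD_URL="http://$AIAA_HOST:$AIAA_PORT/admin/model/$MODEL"
TRAIN_URL="http://$AIAA_HOST:$AIAA_PORT/admin/train/$MODEL"
echo "load:  $LOAD_URL"
echo "train: $TRAIN_URL"

# Only talk to the server when the MMAR folder actually exists.
if [ -d "$MMAR_DIR" ]; then
    tar -zcvf mmar.tgz -C "$MMAR_DIR" .            # pack the MMAR
    curl -X PUT  "$LOAD_URL" -F "data=@mmar.tgz"   # load the model
    curl -X POST "$TRAIN_URL"                      # start fine-tuning
fi
```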
A friendlier option is to install the MONAI Label client and interact with the server directly from the client-side application.

How to train with both existing and new data

By default, fine-tuning runs only on newly added data. To train on all the data, create symbolic links to (or copies of) the existing data inside AIAA's workspace/mmars folder.

The links should be created in the following structure:

[AIAA WORKSPACE FOLDER]
    mmars
        [the mmar name]
            dataset
                training
                    images
                        image_1    <-   create links here
                        image_2    <-   create links here
                        ...
                    labels
                        label_1    <-   create links here
                        label_2    <-   create links here
                        ...
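A hypothetical example of linking one existing image/label pair into that structure (all paths and file names below are placeholders; symbolic links avoid duplicating large volumes, though copying with `cp` works too):

```shell
# Link existing training data into the AIAA workspace so fine-tuning sees it.
WORKSPACE=aiaa_workspace
MMAR_NAME=mmar_train
TRAIN_DIR="$WORKSPACE/mmars/$MMAR_NAME/dataset/training"

mkdir -p "$TRAIN_DIR/images" "$TRAIN_DIR/labels"

# Stand-ins for real image/label files on disk.
touch existing_image.nii.gz existing_label.nii.gz

ln -sf "$PWD/existing_image.nii.gz" "$TRAIN_DIR/images/image_1"
ln -sf "$PWD/existing_label.nii.gz" "$TRAIN_DIR/labels/label_1"
```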