Bring your own Transforms

If you are bringing your own transforms, you can do one of the following:

  • Install the transforms as a pip package so they are available to the AIAA server.

  • Copy the transforms into <workspace>/lib; AIAA picks them up from this path.

  • Copy the transforms to any location and add that location to PYTHONPATH before starting the AIAA server.
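The third option works because the AIAA server, like any Python process, resolves imports via sys.path. A minimal sketch of the mechanism (the directory and module contents below are placeholders for illustration):

```python
import os
import sys
import tempfile

# Hypothetical setup: write a tiny transforms module into a scratch
# directory that stands in for "someplace" on PYTHONPATH.
lib_dir = tempfile.mkdtemp()
with open(os.path.join(lib_dir, "mytransforms.py"), "w") as f:
    f.write("class StatisticalNormalization:\n    pass\n")

# Exporting PYTHONPATH before the server starts has the same effect as
# prepending the directory to sys.path inside the process:
sys.path.insert(0, lib_dir)

import mytransforms
print(mytransforms.StatisticalNormalization.__name__)  # StatisticalNormalization
```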


When you use custom transforms, make sure that in config.json you either:

  1. Provide the full class name of the transform.

  2. Use the short name but provide the class path to the transform, e.g. "path": "mytransforms.StatisticalNormalization"
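As a sketch, the two options might look like this in config.json (the surrounding "pre_transforms" key and exact entry layout are assumptions based on typical AIAA configs, not taken from this document):

```json
{
  "pre_transforms": [
    {
      "name": "mytransforms.StatisticalNormalization"
    },
    {
      "name": "StatisticalNormalization",
      "path": "mytransforms.StatisticalNormalization"
    }
  ]
}
```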

Simple Dictionary Based Transformer

Note that we use “data[key]” to get and set the data. This differs from the previous approach, which uses MedicalImage and TransformContext.

import numpy as np

class LabelNPSqueeze(object):
    """Squeeze singleton dimensions from a label array and cast it to a given dtype."""

    def __init__(self, label_in='model', label_out='model', dtype='uint8'):
        self.key_label_in = label_in
        self.key_label_out = label_out
        self.dtype = dtype

    def __call__(self, data):
        # Fetch the input array, drop singleton axes, and cast to the target dtype.
        label = data[self.key_label_in]
        label = np.squeeze(label).astype(self.dtype)

        data[self.key_label_out] = label
        return data
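As a quick sanity check, the transform can be exercised on a dummy dictionary (the class is repeated here so the snippet runs on its own; the input shape is an arbitrary example):

```python
import numpy as np

class LabelNPSqueeze(object):
    def __init__(self, label_in='model', label_out='model', dtype='uint8'):
        self.key_label_in = label_in
        self.key_label_out = label_out
        self.dtype = dtype

    def __call__(self, data):
        label = data[self.key_label_in]
        label = np.squeeze(label).astype(self.dtype)
        data[self.key_label_out] = label
        return data

# Apply to a dummy prediction with a singleton channel axis.
data = {'model': np.zeros((1, 4, 4), dtype='float32')}
data = LabelNPSqueeze()(data)
print(data['model'].shape, data['model'].dtype)  # (4, 4) uint8
```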

If you save this transform in a file called mytransforms.py and put it inside the <workspace>/lib folder, you can use it in config_aiaa.json as follows:

  "name": "mytransforms.LabelNPSqueeze",
  "args": {
    "label_in": "model",
    "label_out": "model"


A Simple Dictionary Based Transformer cannot be used for training in Clara.


If you are writing a simple transformer, you may first want to look at the existing features/transforms provided by the Clara Train API. The Clara Train API documentation has more details on how to write your own transforms.


An example for BYOM model config can be found in the container at: /opt/nvidia/medical/nvmidl/apps/aas/tests/byom/config_aiaa.json.