Bring your own Transforms
If you are bringing your own transforms, you can do one of the following:
- Load the transforms as a pip package so that they are available to the AIAA server.
- Copy the transforms into <workspace>/lib; AIAA picks them up from this path.
- Copy the transforms to another location and configure PYTHONPATH before the AIAA server is started.
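Whichever option you choose, the module containing your transforms must be importable by the Python environment that runs the AIAA server. As a quick sanity check, you can try importing it from that environment. This is only a minimal sketch, assuming the example module name mytransforms used later on this page:

import importlib

# "mytransforms" is the example module name used on this page; replace it with
# the module or package that actually contains your transforms.
module = importlib.import_module("mytransforms")

# The resolved file should live under <workspace>/lib, the pip install location,
# or a directory on PYTHONPATH.
print(module.__file__)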
When you are using custom transforms, make sure that in config.json you either:
- provide the full class name of the transform, or
- use the short name and provide the class path to the transform, for example:
  "path": "mytransforms.StatisticalNormalization"
Please read Clara Training Framework: Bring your own Transformation and Clara Training Framework: Bring your own Data Loader.
The same transformation/data loader can be used in both training and AIAA.
import numpy as np

# note the ai4med imports here
from ai4med.common.medical_image import MedicalImage
from ai4med.common.transform_ctx import TransformContext
from ai4med.components.transforms.transformer import Transformer


class FilterProbabilityThreshold(Transformer):

    def __init__(self, label_field='model', threshold=0.5):
        Transformer.__init__(self)
        self._label_field = label_field
        self._threshold = threshold

    def transform(self, transform_ctx: TransformContext):
        # fetch the image stored under the label field
        label = transform_ctx.get_image(self._label_field)

        # binarize the probability map at the configured threshold
        result = (np.squeeze(label.get_data()) > self._threshold).astype(np.uint8)

        # wrap the result in a new MedicalImage with the same shape format
        m = label.new_image(result, label.get_shape_format())
        transform_ctx.set_image(self._label_field, m)
        return transform_ctx
If you save this transform in a file called mytransforms.py and put it inside the <workspace>/lib folder, you can use it in config_aiaa.json as follows:
{
  "name": "mytransforms.FilterProbabilityThreshold",
  "args": {
    "label_field": "model",
    "threshold": 0.5
  }
}

OR

{
  "name": "FilterProbabilityThreshold",
  "path": "mytransforms.FilterProbabilityThreshold",
  "args": {
    "label_field": "model",
    "threshold": 0.5
  }
}
Note that the following dictionary-based transform uses data[key] to get and set the data. This is different from the previous approach, which uses MedicalImage and TransformContext.
import numpy as np


class LabelNPSqueeze(object):

    def __init__(self, label_in='model', label_out='model', dtype='uint8'):
        self.key_label_in = label_in
        self.key_label_out = label_out
        self.dtype = dtype

    def __call__(self, data):
        # read the input array, drop singleton dimensions, and cast it
        label = data[self.key_label_in]
        label = np.squeeze(label).astype(self.dtype)

        # write the result back into the dictionary
        data[self.key_label_out] = label
        return data
Similarly, if you save this transform in a file called mytransforms.py and put it inside the <workspace>/lib folder, you can use it in config_aiaa.json as follows:
{
  "name": "mytransforms.LabelNPSqueeze",
  "args": {
    "label_in": "model",
    "label_out": "model"
  }
}
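Because this dictionary-based transform is plain Python, you can sanity-check it locally before registering it with AIAA. The following is only an illustrative sketch; the array shape and the "model" key are assumptions chosen to mirror the example above:

import numpy as np

# assumes the class above was saved in mytransforms.py as described
from mytransforms import LabelNPSqueeze

# fake prediction with a singleton channel dimension, e.g. (1, D, H, W)
data = {"model": np.random.rand(1, 4, 4, 4)}

transform = LabelNPSqueeze(label_in="model", label_out="model", dtype="uint8")
out = transform(data)

print(out["model"].shape)  # (4, 4, 4) after squeezing
print(out["model"].dtype)  # uint8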
A simple dictionary-based transform like this cannot be used for training in Clara. If you are writing a simple transform, you might first want to look at the simple, ready-made transforms already provided by the Clara Train API.
You can find more details on how to write your own transforms in the Clara Train API documentation.
An example BYOM model config can be found in the container at /opt/nvidia/medical/nvmidl/apps/aas/tests/byom/config_aiaa.json.