ai4med.components.model_loaders package

class CheckpointLoader(checkpoint_dir, input_node_names=None, output_node_names=None, checkpoint_file_prefix='model.ckpt', infer_in_training_mode=False)

Bases: ai4med.components.model_loaders.tf_model_loader.TFModelLoader

Loads a model saved in TF 1.x checkpoint format.

Parameters
  • checkpoint_dir (str) – path to the checkpoint directory

  • input_node_names (dict) – A dict mapping input names (str) to input tensors (tf.Tensor)

  • output_node_names (dict) – A dict mapping output names (str) to output tensors (tf.Tensor)

  • checkpoint_file_prefix (str) – prefix of the checkpoint file

  • infer_in_training_mode (bool) – whether to run the model in training mode during inference

load_graph()

Loads the model from the TF 1.x checkpoint

Returns

A TFInferenceContext instance

set_checkpoint_dir(checkpoint_dir)

Sets the directory from which the checkpoint is loaded.
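Example (illustrative sketch): the import path, checkpoint directory, and use of the inherited load() method are assumptions based on this page, not verbatim from the ai4med sources.

    from ai4med.components.model_loaders.checkpoint_loader import CheckpointLoader

    # Point the loader at a directory containing model.ckpt.* files
    # (the directory path here is a placeholder).
    loader = CheckpointLoader(
        checkpoint_dir="/workspace/models/my_model",
        checkpoint_file_prefix="model.ckpt",
    )

    # load() is inherited from TFModelLoader and, per the docs below,
    # returns a TFInferenceContext for running inference.
    inference_ctx = loader.load()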
class FrozenGraphModelLoader(model_file_path, input_node_names=None, output_node_names=None, infer_in_training_mode=False)

Bases: ai4med.components.model_loaders.tf_model_loader.TFModelLoader

Loads a TF1.x frozen graph

Parameters
  • model_file_path (str) – path to the frozen graph file (e.g. frozen_graph.pb)

  • input_node_names (dict) – A dict mapping input names (str) to input tensors (tf.Tensor)

  • output_node_names (dict) – A dict mapping output names (str) to output tensors (tf.Tensor)

  • infer_in_training_mode (bool) – whether to run the model in training mode during inference

load_graph()

Loads a model in frozen graph format

Returns

A TFInferenceContext instance
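Example (illustrative sketch): the import path and file location below are assumptions.

    from ai4med.components.model_loaders.frozen_graph_model_loader import FrozenGraphModelLoader

    # Load a frozen graph exported as a single .pb file (placeholder path).
    loader = FrozenGraphModelLoader(model_file_path="/workspace/models/frozen_graph.pb")

    # TFInferenceContext, per TFModelLoader.load() documented below.
    inference_ctx = loader.load()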

class ModelLoader

Bases: abc.ABC

Model loader loads a pre-trained model from its source file and returns an appropriate inference context.

abstract load()

Loads a pre-trained model

Returns

InferenceContext to be used for inference against this model
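Since ModelLoader is abstract, a custom loader only needs to implement load(). The sketch below is hypothetical: the NumPy-based loader and its return value are illustrative, and a real component would wrap the loaded model in an InferenceContext.

    import numpy as np

    from ai4med.components.model_loaders.model_loader import ModelLoader

    class NumpyWeightsLoader(ModelLoader):
        """Hypothetical loader that reads pre-trained weights from a .npz file."""

        def __init__(self, weights_path):
            self.weights_path = weights_path

        def load(self):
            # In a real component the loaded weights would be wrapped in an
            # InferenceContext so the inference pipeline can run against them.
            return np.load(self.weights_path)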

class TFModelLoader(input_node_names=None, output_node_names=None, infer_in_training_mode=False)

Bases: ai4med.components.model_loaders.model_loader.ModelLoader

Loads a TF1.x model

Parameters
  • input_node_names (dict) – A dict mapping input names (str) to input tensors (tf.Tensor)

  • output_node_names (dict) – A dict mapping output names (str) to output tensors (tf.Tensor)

  • infer_in_training_mode (bool) – whether to run the model in training mode during inference

load()

Loads a model

Returns

A TFInferenceContext instance

load_graph()

Loads a TF graph and returns the session and the loaded graph (from a checkpoint or a frozen graph). This method must be implemented by subclasses.
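For a custom TF 1.x model source, only load_graph() has to be implemented. The sketch below restores a SavedModel and is hypothetical: the import path, the "serve" tag, and the assumption that load() wraps the returned session and graph in a TFInferenceContext are inferred from this page, not taken from the ai4med sources.

    import tensorflow as tf

    from ai4med.components.model_loaders.tf_model_loader import TFModelLoader

    class SavedModelLoader(TFModelLoader):
        """Hypothetical loader for a TF 1.x SavedModel directory."""

        def __init__(self, export_dir, **kwargs):
            super().__init__(**kwargs)
            self.export_dir = export_dir

        def load_graph(self):
            graph = tf.Graph()
            session = tf.compat.v1.Session(graph=graph)
            with graph.as_default():
                # Restore the graph and variables tagged for serving.
                tf.compat.v1.saved_model.loader.load(session, ["serve"], self.export_dir)
            # Return the session and the loaded graph, as required above.
            return session, graph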
