0.5.0 and above)
List of inference models required by an operator.
Listed models will be loaded by Triton Inference Server and made available to the requesting operator(s) prior to the start of any pipeline job requiring them.
Name of the inference model to be loaded by Triton Inference Server.
The model name serves both as the unique identifier of the inference model and as its file-system name, and it is also the name by which the model can be referenced via Triton Inference Client.
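As a sketch, a models list in an operator specification might look like the following. The exact field names depend on the schema version in use, and the model name `liver_seg_model` is hypothetical; per the description above, it must match the model's directory name in the Triton model repository.

```yaml
models:
  # Each entry names a model that Triton Inference Server will load
  # before any pipeline job requiring this operator starts.
  - name: liver_seg_model   # hypothetical name; doubles as the file-system
                            # name and the name used by Triton Inference Client
```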