Loaders
Module: polygraphy.tools.args
class OnnxInferShapesArgs(default: bool = None, allow_force_fallback: bool = None)
    Bases: polygraphy.tools.args.base.BaseArgs

    ONNX Shape Inference: options controlling ONNX shape inference.

    Depends on:
        - OnnxLoadArgs
        - DataLoaderArgs: if allow_force_fallback == True

    Parameters:
        default (bool) – Whether shape inference should be enabled by default. Defaults to False.
        allow_force_fallback (bool) – Whether fallback shape inference using ONNX-Runtime should be allowed. Defaults to False.
    parse_impl(args)
        Parses command-line arguments and populates the following attributes:

        do_shape_inference (bool)
            Whether to do shape inference.

        force_fallback (bool)
            Whether to force fallback shape inference.
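To make the relationship between the constructor parameters and the parsed attributes concrete, here is a minimal sketch of how an argument group like this could register and parse its flags. This is illustrative only, not Polygraphy source; the flag names and the `add_parser_args` method are assumptions for the example.

```python
import argparse

class OnnxInferShapesArgsSketch:
    """Illustrative stand-in for OnnxInferShapesArgs (not Polygraphy source)."""

    def __init__(self, default=False, allow_force_fallback=False):
        self.default = default
        self.allow_force_fallback = allow_force_fallback

    def add_parser_args(self, parser):
        # Flag names are assumptions for illustration.
        parser.add_argument("--shape-inference", action="store_true",
                            default=self.default,
                            help="Enable ONNX shape inference")
        if self.allow_force_fallback:
            parser.add_argument("--force-fallback-shape-inference",
                                action="store_true",
                                help="Force fallback shape inference via ONNX-Runtime")

    def parse_impl(self, args):
        # Populate the attributes documented above.
        self.do_shape_inference = args.shape_inference
        self.force_fallback = getattr(args, "force_fallback_shape_inference", False)

parser = argparse.ArgumentParser()
group = OnnxInferShapesArgsSketch(allow_force_fallback=True)
group.add_parser_args(parser)
group.parse_impl(parser.parse_args(["--shape-inference"]))
print(group.do_shape_inference, group.force_fallback)  # True False
```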
    add_to_script_impl(script, loader_name)
        Note that this method does not take fallback shape inference into account. To support fallback shape inference, the tool must call fallback_inference() manually.

        Parameters:
            loader_name (str) – The name of the loader which should be consumed by the InferShapes loader.

        Returns:
            The name of the InferShapes loader added to the script.

        Return type:
            str
    fallback_inference(onnx_model)
        Runs inference with ONNX-Runtime. This can be used to retrieve values/shapes/data types for all tensors in the model when other shape inference approaches fail.

        Parameters:
            onnx_model (onnx.ModelProto) – The ONNX model in which to infer shapes.

        Returns:
            A tuple containing two elements:
            1. A mapping of values for all tensors in the model, including inputs.
            2. Metadata for every tensor in the model.

        Return type:
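The shape of the two-element return value can be pictured with plain dicts. This sketch substitutes dicts and NumPy arrays for Polygraphy's actual container types (an assumption for illustration; the real return types are Polygraphy-specific and not shown here):

```python
import numpy as np

def fake_fallback_inference(onnx_model=None):
    """Sketch of fallback_inference()'s return structure (not Polygraphy source)."""
    # 1. Mapping of values for all tensors in the model, including inputs.
    #    Tensor names and shapes below are made up for illustration.
    values = {
        "input": np.zeros((1, 3, 224, 224), dtype=np.float32),
        "conv_out": np.zeros((1, 16, 222, 222), dtype=np.float32),
    }
    # 2. Metadata (shape and data type) for every tensor in the model.
    metadata = {name: {"shape": arr.shape, "dtype": arr.dtype}
                for name, arr in values.items()}
    return values, metadata

values, meta = fake_fallback_inference()
print(meta["conv_out"]["shape"])  # (1, 16, 222, 222)
```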
class OnnxSaveArgs(allow_shape_inference: bool = None, output_opt: str = None, output_short_opt: str = None, output_opt_required: bool = None, output_default_path: str = None, allow_multiple_models: bool = None)
    Bases: polygraphy.tools.args.base.BaseArgs

    ONNX Model Saving: options for saving ONNX models.

    Depends on:
        - OnnxInferShapesArgs: if allow_shape_inference == True

    Parameters:
        allow_shape_inference (bool) – Whether to allow shape inference when saving models. Defaults to False.
        output_opt (str) – The name of the output path option. Defaults to "output". Use a value of False to disable the option.
        output_short_opt (str) – The short option to use for the output path. Defaults to "-o". Use a value of False to disable the short option.
        output_opt_required (bool) – Whether the output path is a required argument. Defaults to False.
        output_default_path (str) – The default value to use for the output path option. Defaults to None.
        allow_multiple_models (bool) – Whether to enable support for saving more than one model. If this is True, the output path is expected to be a directory. Defaults to False.
    parse_impl(args)
        Parses command-line arguments and populates the following attributes:

        path (str)
            The path at which to save the ONNX model.

        external_data_path (str)
            The path at which to save external data.

        size_threshold (int)
            The size threshold above which external data is saved.

        all_tensors_to_one_file (bool)
            Whether all external data should be written to a single file.
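As an illustration of the size_threshold attribute above, the following sketch shows how a size cutoff can partition tensors into inline and external storage. The function and the byte sizes are made up for the example; this is not Polygraphy's implementation.

```python
def partition_external(tensor_sizes, size_threshold):
    """Split tensors by size: those above the threshold go to external data.

    tensor_sizes maps tensor name -> size in bytes (hypothetical input).
    """
    external = {name for name, size in tensor_sizes.items() if size > size_threshold}
    inline = set(tensor_sizes) - external
    return inline, external

# Tensors larger than 1024 bytes are stored externally in this sketch.
inline, external = partition_external({"w1": 10, "w2": 5000}, size_threshold=1024)
print(sorted(inline), sorted(external))  # ['w1'] ['w2']
```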
    add_to_script_impl(script, loader_name)
        Parameters:
            loader_name (str) – The name of the loader which should be consumed by the SaveOnnx loader.

        Returns:
            The name of the SaveOnnx loader added to the script.

        Return type:
            str
    save_onnx(model, path: str = None)
        Saves an ONNX model according to arguments provided on the command line.

        Parameters:
            model (onnx.ModelProto) – The ONNX model to save.
            path (str) – The path at which to save the model. If no path is provided, it is determined from command-line arguments.

        Returns:
            The model that was saved.

        Return type:
            onnx.ModelProto
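The path-resolution behavior described above (an explicit `path` argument takes precedence over the path parsed from the command line) can be sketched as follows. The class and attribute names are stand-ins for illustration, and serialization itself is stubbed out; this is not Polygraphy source.

```python
class OnnxSaveArgsSketch:
    """Illustrative stand-in for OnnxSaveArgs path handling (not Polygraphy source)."""

    def __init__(self, cli_path=None):
        # In the real class, `path` is populated by parse_impl from the
        # output path option on the command line.
        self.path = cli_path
        self.last_saved_path = None

    def save_onnx(self, model, path=None):
        # An explicit argument wins; otherwise fall back to the CLI path.
        final_path = path if path is not None else self.path
        if final_path is None:
            raise ValueError("No output path provided")
        # Stand-in for actual serialization (e.g., writing the model to disk).
        self.last_saved_path = final_path
        return model

saver = OnnxSaveArgsSketch(cli_path="model_out.onnx")
saver.save_onnx(model=object())
print(saver.last_saved_path)  # model_out.onnx

saver.save_onnx(model=object(), path="override.onnx")
print(saver.last_saved_path)  # override.onnx
```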
class OnnxLoadArgs(allow_saving: bool = None, outputs_opt_prefix: str = None, allow_shape_inference: bool = None, allow_from_tf: bool = None)
    Bases: polygraphy.tools.args.base.BaseArgs

    ONNX Model Loading: options for loading ONNX models.

    Depends on:
        - ModelArgs
        - OnnxSaveArgs: if allow_saving == True
        - OnnxInferShapesArgs: if allow_shape_inference == True
        - OnnxFromTfArgs: if allow_from_tf == True

    Parameters:
        allow_saving (bool) – Whether to allow loaded models to be saved. Defaults to False.
        outputs_opt_prefix (str) – The prefix to use for the outputs option, which controls which tensors are marked as outputs. Defaults to "onnx-". Use a value of False to disable the option.
        allow_shape_inference (bool) – Whether to allow shape inference when loading models. Defaults to True.
        allow_from_tf (bool) – Whether to allow conversion of TensorFlow models to ONNX. Defaults to False.
    parse_impl(args)
        Parses command-line arguments and populates the following attributes:

        outputs (List[str])
            Names of output tensors.

        exclude_outputs (List[str])
            Names of tensors which should be unmarked as outputs.

        external_data_dir (str)
            Path to a directory from which to load external data.
    add_to_script_impl(script, disable_custom_outputs: bool = None, serialize_model: bool = None)
        Parameters:
            disable_custom_outputs (bool) – Whether to disallow modifying outputs according to the outputs and exclude_outputs attributes. Defaults to False.
            serialize_model (bool) – Whether to serialize the model. Defaults to False.

        Returns:
            The name of the ONNX loader added in the script.

        Return type:
            str
    must_use_onnx_loader(disable_custom_outputs: bool = None)
        Whether this model needs to be loaded via a Polygraphy ONNX loader, e.g., in case it needs modifications.

        Parameters:
            disable_custom_outputs (bool) – Whether to disallow modifying outputs according to the outputs and exclude_outputs attributes.

        Returns:
            bool
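A hedged sketch of the kind of check a method like this performs: a Polygraphy ONNX loader is required whenever the model must be modified before use, such as when custom outputs are requested. The exact conditions Polygraphy checks are assumptions here; only the general shape of the decision is shown.

```python
def must_use_onnx_loader_sketch(outputs, exclude_outputs,
                                do_shape_inference,
                                disable_custom_outputs=False):
    """Illustrative decision logic (not Polygraphy source).

    The model needs a Polygraphy ONNX loader if outputs are being
    customized (and that customization is not disabled), or if shape
    inference must be run on it.
    """
    wants_custom_outputs = (bool(outputs or exclude_outputs)
                            and not disable_custom_outputs)
    return wants_custom_outputs or do_shape_inference

print(must_use_onnx_loader_sketch(["out0"], [], False))  # True
print(must_use_onnx_loader_sketch(["out0"], [], False,
                                  disable_custom_outputs=True))  # False
```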
class OnnxFromTfArgs
    Bases: polygraphy.tools.args.base.BaseArgs

    TensorFlow-ONNX Model Conversion: options for converting TensorFlow models to ONNX.

    Depends on:
        - TfLoadArgs