Onnx Parser¶
- class tensorrt.OnnxParser(self: tensorrt.tensorrt.OnnxParser, network: tensorrt.tensorrt.INetworkDefinition, logger: tensorrt.tensorrt.ILogger)¶
This class is used for parsing ONNX models into a TensorRT network definition.
- Variables:
num_errors – int The number of errors that occurred during prior calls to parse().
- Parameters:
network – The network definition to which the parser will write.
logger – The logger to use.
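A minimal construction sketch, assuming a TensorRT 10.x-style workflow; the logger severity is illustrative:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network()        # the INetworkDefinition the parser writes into
parser = trt.OnnxParser(network, logger)  # bind the parser to the network and logger
```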
- __del__(self: tensorrt.tensorrt.OnnxParser) None ¶
- __exit__(exc_type, exc_value, traceback)¶
Context managers are deprecated and have no effect. Objects are automatically freed when the reference count reaches 0.
- __init__(self: tensorrt.tensorrt.OnnxParser, network: tensorrt.tensorrt.INetworkDefinition, logger: tensorrt.tensorrt.ILogger) None ¶
- Parameters:
network – The network definition to which the parser will write.
logger – The logger to use.
- clear_errors(self: tensorrt.tensorrt.OnnxParser) None ¶
Clear errors from prior calls to parse().
- clear_flag(self: tensorrt.tensorrt.OnnxParser, flag: nvonnxparser::OnnxParserFlag) None ¶
Clears the parser flag from the enabled flags.
- Parameters:
flag – The flag to clear.
- get_error(self: tensorrt.tensorrt.OnnxParser, index: int) nvonnxparser::IParserError ¶
Get an error that occurred during prior calls to parse().
- Parameters:
index – Index of the error
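A sketch of typical error inspection after a failed parse, assuming the parser constructed above and a hypothetical model path:

```python
if not parser.parse_from_file("model.onnx"):        # hypothetical path
    # num_errors counts errors from the prior parse call;
    # get_error(i) returns a ParserError with the details.
    for i in range(parser.num_errors):
        err = parser.get_error(i)
        print(f"{err.code()}: {err.desc()} (node {err.node_name() or err.node()})")
    parser.clear_errors()                           # reset before another attempt
```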
- get_flag(self: tensorrt.tensorrt.OnnxParser, flag: nvonnxparser::OnnxParserFlag) bool ¶
Check whether the given parser flag is set.
- Parameters:
flag – The flag to check.
- Returns:
A bool indicating whether the flag is set.
- get_layer_output_tensor(self: tensorrt.tensorrt.OnnxParser, name: str, i: int) tensorrt.tensorrt.ITensor ¶
Get the i-th output ITensor object for the ONNX layer “name”.
If multiple nodes share the same name, this function returns the output tensor of the first instance of the node in the ONNX graph.
- Parameters:
name – The name of the ONNX layer.
i – The index of the output.
- Returns:
The output tensor, or None if the layer was not found or an invalid index was provided.
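A short sketch, assuming the model has already been parsed into the network and contains a node named "conv1" (a hypothetical name):

```python
# Look up the first output tensor of the ONNX node "conv1" (hypothetical name).
tensor = parser.get_layer_output_tensor("conv1", 0)
if tensor is not None:
    print(tensor.name, tensor.shape)      # ITensor attributes
else:
    print("Layer not found or invalid output index")
```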
- get_subgraph_nodes(self: tensorrt.tensorrt.OnnxParser, index: int) list ¶
Get the nodes of the specified subgraph. Calling this before supports_model_v2() is undefined behavior; an empty list is returned by default.
- Parameters:
index – Index of the subgraph.
- Returns:
List[int] A list of node indices in the subgraph.
- get_used_vc_plugin_libraries(self: tensorrt.tensorrt.OnnxParser) List[str] ¶
Query the plugin libraries needed to implement operations used by the parser in a version-compatible engine.
This provides a list of plugin libraries on the filesystem needed to implement operations in the parsed network. If you are building a version-compatible engine from this network, provide this list to IBuilderConfig.set_plugins_to_serialize() so that these plugins are serialized along with the engine. Alternatively, if you want to ship these plugin libraries externally to the engine, ensure that IPluginRegistry.load_library() is used to load them in the appropriate runtime before deserializing the corresponding engine.
- Returns:
List[str] List of plugin libraries found by the parser.
- Raises:
RuntimeError if an internal error occurred when trying to fetch the list of plugin libraries.
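A hedged sketch of feeding the returned list into a version-compatible engine build, assuming the builder, network, and parser from the examples above; the Python binding is assumed to expose the setter as the plugins_to_serialize attribute:

```python
config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.VERSION_COMPATIBLE)

vc_libs = parser.get_used_vc_plugin_libraries()
if vc_libs:
    # Serialize the plugin libraries the parser relied on alongside the engine
    # (assumed Python-side equivalent of set_plugins_to_serialize()).
    config.plugins_to_serialize = vc_libs

serialized_engine = builder.build_serialized_network(network, config)
```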
- is_subgraph_supported(self: tensorrt.tensorrt.OnnxParser, index: int) bool ¶
Returns whether the subgraph is supported. Calling this before supports_model_v2() is undefined behavior; false is returned by default.
- Parameters:
index – Index of the subgraph to be checked.
- Returns:
true if the subgraph is supported
- parse(self: tensorrt.tensorrt.OnnxParser, model: buffer, path: str = None) bool ¶
Parse a serialized ONNX model into the TensorRT network.
- Parameters:
model – The serialized ONNX model.
path – The path to the model file. Only required if the model has externally stored weights.
- Returns:
true if the model was parsed successfully
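A minimal sketch of parsing an in-memory model, assuming the parser constructed above; the file name is hypothetical, and path is only needed when the model stores weights externally:

```python
with open("model.onnx", "rb") as f:                 # hypothetical path
    model_bytes = f.read()

# path= lets the parser resolve externally stored weights relative to the model file.
ok = parser.parse(model_bytes, path="model.onnx")
print("parsed" if ok else "failed")
```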
- parse_from_file(self: tensorrt.tensorrt.OnnxParser, model: str) bool ¶
Parse an ONNX model from file into a TensorRT network.
- Parameters:
model – The path to an ONNX model.
- Returns:
true if the model was parsed successfully
- parse_with_weight_descriptors(self: tensorrt.tensorrt.OnnxParser, model: buffer) bool ¶
Parse a serialized ONNX model into the TensorRT network with consideration of user-provided weights.
- Parameters:
model – The serialized ONNX model.
- Returns:
true if the model was parsed successfully
- set_flag(self: tensorrt.tensorrt.OnnxParser, flag: nvonnxparser::OnnxParserFlag) None ¶
Add the input parser flag to the already enabled flags.
- Parameters:
flag – The flag to set.
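A sketch of flag handling, assuming the parser constructed above and that this release exposes the NATIVE_INSTANCENORM member of OnnxParserFlag:

```python
flag = trt.OnnxParserFlag.NATIVE_INSTANCENORM   # assumed enum member
parser.set_flag(flag)
print(parser.get_flag(flag))                    # True
parser.clear_flag(flag)
print(parser.get_flag(flag))                    # False
```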
- supports_model(self: tensorrt.tensorrt.OnnxParser, model: buffer, path: str = None) Tuple[bool, List[Tuple[List[int], bool]]] ¶
[DEPRECATED] Deprecated in TensorRT 10.1. See supports_model_v2.
Check whether TensorRT supports a particular ONNX model.
- Parameters:
model – The serialized ONNX model.
path – The path to the model file. Only required if the model has externally stored weights.
- Returns:
Tuple[bool, List[Tuple[NodeIndices, bool]]] The first element of the tuple indicates whether the model is supported. The second indicates subgraphs (by node index) in the model and whether they are supported.
- supports_model_v2(self: tensorrt.tensorrt.OnnxParser, model: buffer, path: str = None) bool ¶
Check whether TensorRT supports a particular ONNX model. Afterwards, query each subgraph with num_subgraphs, is_subgraph_supported, and get_subgraph_nodes.
- Parameters:
model – The serialized ONNX model.
path – The path to the model file. Only required if the model has externally stored weights.
- Returns:
true if the model is supported
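A sketch of the v2 capability query, assuming the parser constructed above; num_subgraphs is assumed to be callable as a method here:

```python
with open("model.onnx", "rb") as f:                 # hypothetical path
    model_bytes = f.read()

fully_supported = parser.supports_model_v2(model_bytes)
# Per-subgraph queries are only defined after supports_model_v2 has been called.
for idx in range(parser.num_subgraphs()):           # assumed method
    print(idx, parser.is_subgraph_supported(idx), parser.get_subgraph_nodes(idx))
```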
- supports_operator(self: tensorrt.tensorrt.OnnxParser, op_name: str) bool ¶
Returns whether the specified operator may be supported by the parser. Note that a result of true does not guarantee that the operator will be supported in all cases (i.e., this function may return false positives).
- Parameters:
op_name – The name of the ONNX operator to check for support
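A brief sketch, assuming the parser constructed above; the custom operator name is hypothetical:

```python
print(parser.supports_operator("Conv"))          # standard ONNX operator
print(parser.supports_operator("MyCustomOp"))    # hypothetical custom operator name
```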
- class tensorrt.OnnxParserRefitter(self: tensorrt.tensorrt.OnnxParserRefitter, refitter: tensorrt.tensorrt.Refitter, logger: tensorrt.tensorrt.ILogger)¶
This is an interface designed to refit weights from an ONNX model.
- Parameters:
refitter – The Refitter object used to refit the model.
logger – The logger to use.
- __init__(self: tensorrt.tensorrt.OnnxParserRefitter, refitter: tensorrt.tensorrt.Refitter, logger: tensorrt.tensorrt.ILogger) None ¶
- Parameters:
refitter – The Refitter object used to refit the model.
logger – The logger to use.
- clear_errors(self: tensorrt.tensorrt.OnnxParserRefitter) None ¶
Clear errors from prior calls to refit_from_bytes() or refit_from_file().
- get_error(self: tensorrt.tensorrt.OnnxParserRefitter, index: int) tensorrt.tensorrt.ParserError ¶
Get an error that occurred during prior calls to refit_from_bytes() or refit_from_file().
- Parameters:
index – Index of the error
- refit_from_bytes(self: tensorrt.tensorrt.OnnxParserRefitter, model: buffer, path: str = None) bool ¶
Load a serialized ONNX model from memory and perform weight refit.
- Parameters:
model – The serialized ONNX model.
path – The path to the model file. Only required if the model has externally stored weights.
- Returns:
true if all the weights in the engine were refit successfully.
- refit_from_file(self: tensorrt.tensorrt.OnnxParserRefitter, model: str) bool ¶
Load and parse an ONNX model from disk and perform weight refit.
- Parameters:
model – The path to an ONNX model.
- Returns:
true if the model was loaded successfully, and if all the weights in the engine were refit successfully.
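A hedged end-to-end refit sketch, assuming serialized_engine holds a refittable engine (built with the REFIT builder flag) and a hypothetical updated model file:

```python
runtime = trt.Runtime(logger)
engine = runtime.deserialize_cuda_engine(serialized_engine)   # refittable engine

refitter = trt.Refitter(engine, logger)
onnx_refitter = trt.OnnxParserRefitter(refitter, logger)

# Pull updated weights from the new ONNX file into the Refitter.
if onnx_refitter.refit_from_file("model_updated.onnx"):       # hypothetical path
    refitter.refit_cuda_engine()                              # apply them to the engine
```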
- tensorrt.ErrorCode¶
The type of parser error
Members:
SUCCESS
INTERNAL_ERROR
MEM_ALLOC_FAILED
MODEL_DESERIALIZE_FAILED
INVALID_VALUE
INVALID_GRAPH
INVALID_NODE
UNSUPPORTED_GRAPH
UNSUPPORTED_NODE
UNSUPPORTED_NODE_ATTR
UNSUPPORTED_NODE_INPUT
UNSUPPORTED_NODE_DATATYPE
UNSUPPORTED_NODE_DYNAMIC
UNSUPPORTED_NODE_SHAPE
REFIT_FAILED
- class tensorrt.ParserError¶
- code(self: tensorrt.tensorrt.ParserError) tensorrt.tensorrt.ErrorCode ¶
- Returns:
The error code
- desc(self: tensorrt.tensorrt.ParserError) str ¶
- Returns:
Description of the error
- file(self: tensorrt.tensorrt.ParserError) str ¶
- Returns:
Source file in which the error occurred
- func(self: tensorrt.tensorrt.ParserError) str ¶
- Returns:
Source function in which the error occurred
- line(self: tensorrt.tensorrt.ParserError) int ¶
- Returns:
Source line at which the error occurred
- local_function_stack(self: tensorrt.tensorrt.ParserError) List[str] ¶
- Returns:
Current stack trace of local functions in which the error occurred
- local_function_stack_size(self: tensorrt.tensorrt.ParserError) int ¶
- Returns:
Size of the current stack trace of local functions in which the error occurred
- node(self: tensorrt.tensorrt.ParserError) int ¶
- Returns:
Index of the ONNX model node in which the error occurred
- node_name(self: tensorrt.tensorrt.ParserError) str ¶
- Returns:
Name of the node in the model in which the error occurred
- node_operator(self: tensorrt.tensorrt.ParserError) str ¶
- Returns:
Name of the node operation in the model in which the error occurred
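A sketch of inspecting the fields of a ParserError returned by OnnxParser.get_error(), assuming at least one error was recorded:

```python
err = parser.get_error(0)
print("code: ", err.code())                          # a tensorrt.ErrorCode member
print("desc: ", err.desc())
print("where:", err.file(), err.line(), err.func())
print("node: ", err.node(), err.node_name(), err.node_operator())
```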