Onnx Parser

class tensorrt.OnnxParser(self: tensorrt.tensorrt.OnnxParser, network: tensorrt.tensorrt.INetworkDefinition, logger: tensorrt.tensorrt.ILogger) → None

This class is used for parsing ONNX models into a TensorRT network definition.

Variables

num_errors – int The number of errors that occurred during prior calls to parse()

Parameters
  • network – The network definition to which the parser will write.

  • logger – The logger to use.

__del__(self: tensorrt.tensorrt.OnnxParser) → None
__exit__(exc_type, exc_value, traceback)

Context managers are deprecated and have no effect. Objects are automatically freed when the reference count reaches 0.

__init__(self: tensorrt.tensorrt.OnnxParser, network: tensorrt.tensorrt.INetworkDefinition, logger: tensorrt.tensorrt.ILogger) → None
Parameters
  • network – The network definition to which the parser will write.

  • logger – The logger to use.
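
A minimal construction sketch is shown below. The logger severity and the explicit-batch network flag are assumptions; the flag is required on some TensorRT versions and deprecated on others.

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    # Create an empty network definition for the parser to populate.
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)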

clear_errors(self: tensorrt.tensorrt.OnnxParser) → None

Clear errors from prior calls to parse()

get_error(self: tensorrt.tensorrt.OnnxParser, index: int) → nvonnxparser::IParserError

Get an error that occurred during prior calls to parse()

Parameters

index – Index of the error
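
As a sketch of the error API (continuing with the parser constructed above), feeding deliberately invalid bytes produces errors that can then be inspected and cleared:

    # Invalid input; parse() returns False and records errors.
    if not parser.parse(b"not a valid ONNX model"):
        for i in range(parser.num_errors):
            err = parser.get_error(i)
            print(err.code(), err.desc())
        parser.clear_errors()  # num_errors is 0 again after this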

parse(self: tensorrt.tensorrt.OnnxParser, model: buffer, path: str = None) → bool

Parse a serialized ONNX model into the TensorRT network.

Parameters
  • model – The serialized ONNX model.

  • path – The path to the model file. Only required if the model has externally stored weights.

Returns

True if the model was parsed successfully.
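
A sketch of parsing from an in-memory buffer; the model path used here is hypothetical, and passing path lets the parser resolve externally stored weights relative to the original file:

    with open("model.onnx", "rb") as f:  # hypothetical path
        data = f.read()
    if not parser.parse(data, path="model.onnx"):
        raise RuntimeError(parser.get_error(0).desc())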

parse_from_file(self: tensorrt.tensorrt.OnnxParser, model: str) → bool

Parse an ONNX model from file into a TensorRT network.

Parameters

model – The path to an ONNX model.

Returns

True if the model was parsed successfully.
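
A sketch using the file-based entry point (the path is hypothetical):

    if not parser.parse_from_file("/path/to/model.onnx"):  # hypothetical path
        for i in range(parser.num_errors):
            print(parser.get_error(i).desc())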

parse_with_weight_descriptors(self: tensorrt.tensorrt.OnnxParser, model: buffer) → bool

Parse a serialized ONNX model into the TensorRT network with consideration of user-provided weights.

Parameters

model – The serialized ONNX model.

Returns

True if the model was parsed successfully.

supports_model(self: tensorrt.tensorrt.OnnxParser, model: buffer, path: str = None) → Tuple[bool, tensorrt.tensorrt.SubGraphCollection]

Check whether TensorRT supports a particular ONNX model.

Parameters
  • model – The serialized ONNX model.

  • path – The path to the model file. Only required if the model has externally stored weights.

Returns

Tuple[bool, List[Tuple[NodeIndices, bool]]] The first element of the tuple indicates whether the model is supported. The second indicates subgraphs (by node index) in the model and whether they are supported.
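
A sketch of checking support before committing to a full parse; the model path is hypothetical, and the loop assumes the subgraph collection yields the (node indices, supported) pairs described above:

    with open("model.onnx", "rb") as f:  # hypothetical path
        supported, subgraphs = parser.supports_model(f.read())
    if not supported:
        # Each entry pairs a list of node indices with a support flag.
        for node_indices, is_supported in subgraphs:
            print(node_indices, "supported" if is_supported else "unsupported")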

supports_operator(self: tensorrt.tensorrt.OnnxParser, op_name: str) → bool

Returns whether the specified operator may be supported by the parser. Note that a result of true does not guarantee that the operator will be supported in all cases (i.e., this function may return false positives).

Parameters

op_name – The name of the ONNX operator to check for support
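
A quick sketch with a few illustrative ONNX operator names; per the caveat above, a True result is only a hint:

    for op in ("Conv", "Resize", "NonMaxSuppression"):
        print(op, parser.supports_operator(op))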

tensorrt.ErrorCode

The type of parser error

Members:

SUCCESS

INTERNAL_ERROR

MEM_ALLOC_FAILED

MODEL_DESERIALIZE_FAILED

INVALID_VALUE

INVALID_GRAPH

INVALID_NODE

UNSUPPORTED_GRAPH

UNSUPPORTED_NODE
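
A sketch of branching on the code of a recorded error (assuming at least one error exists, e.g. after a failed parse, and that the parser was built as above):

    err = parser.get_error(0)
    if err.code() == trt.ErrorCode.UNSUPPORTED_NODE:
        print("Node", err.node(), "uses an operator the parser cannot handle")
    elif err.code() == trt.ErrorCode.MODEL_DESERIALIZE_FAILED:
        print("The buffer does not contain a readable ONNX model")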

class tensorrt.ParserError
code(self: tensorrt.tensorrt.ParserError) → tensorrt.tensorrt.ErrorCode
Returns

The error code

desc(self: tensorrt.tensorrt.ParserError) → str
Returns

Description of the error

file(self: tensorrt.tensorrt.ParserError) → str
Returns

Source file in which the error occurred

func(self: tensorrt.tensorrt.ParserError) → str
Returns

Source function in which the error occurred

line(self: tensorrt.tensorrt.ParserError) → int
Returns

Source line at which the error occurred

node(self: tensorrt.tensorrt.ParserError) → int
Returns

Index of the ONNX model node in which the error occurred
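
A sketch that formats every field of each recorded error into one diagnostic line (continuing with the parser constructed above):

    for i in range(parser.num_errors):
        err = parser.get_error(i)
        print("[{}] {}:{} in {}() at node {}: {}".format(
            err.code(), err.file(), err.line(), err.func(),
            err.node(), err.desc()))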