Function TRTSERVER_InferenceRequestProviderNew¶
Defined in File trtserver.h
Function Documentation¶
-
TRTSERVER_Error *TRTSERVER_InferenceRequestProviderNew(TRTSERVER_InferenceRequestProvider **request_provider, TRTSERVER_Server *server, const char *model_name, int64_t model_version, const char *request_header_base, size_t request_header_byte_size)¶
Create a new inference request provider object. A TRTSERVER_InferenceRequestProvider object represents the request provider for an inference request; it supplies the meta-data and input tensor values needed for an inference. The request header protobuf must be serialized and provided as a base address and a size, in bytes.
- Return
A TRTSERVER_Error indicating success or failure.
- Parameters
request_provider
: Returns the new request provider object.
server
: The inference server object.
model_name
: The name of the model that the inference request is for.
model_version
: The version of the model that the inference request is for, or -1 to select the latest (highest-numbered) version.
request_header_base
: Pointer to the serialized request header protobuf.
request_header_byte_size
: The size of the serialized request header, in bytes.
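A minimal usage sketch follows. It assumes a valid TRTSERVER_Server* named server has already been created, and that header_buf/header_size (hypothetical names) hold a serialized request header protobuf; the error-handling calls TRTSERVER_ErrorMessage and TRTSERVER_ErrorDelete are other functions declared in trtserver.h. This is an illustration of the call shape, not a complete program.

```c
#include <stdio.h>
#include "trtserver.h"

/* Sketch: create a request provider for model "mymodel", latest version.
 * 'server', 'header_buf', and 'header_size' are assumed to exist;
 * "mymodel" is a hypothetical model name. */
TRTSERVER_InferenceRequestProvider* request_provider = NULL;
TRTSERVER_Error* err = TRTSERVER_InferenceRequestProviderNew(
    &request_provider, server, "mymodel",
    -1 /* -1 selects the latest (highest-numbered) version */,
    header_buf, header_size);
if (err != NULL) {
  /* On failure a non-NULL TRTSERVER_Error is returned; report and free it. */
  fprintf(stderr, "error: %s\n", TRTSERVER_ErrorMessage(err));
  TRTSERVER_ErrorDelete(err);
}
```

On success, request_provider receives the new object, which can then be passed to the server's inference call and must eventually be released by its corresponding delete function.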