NVIDIA TensorRT Inference Server
Version 0.11.0
Full API
Namespaces
  Namespace nvidia
    Namespaces
  Namespace nvidia::inferenceserver
    Namespaces
  Namespace nvidia::inferenceserver::client
    Classes
    Functions
Classes and Structs
  Struct custom_payload_struct
    Struct Documentation
  Struct Result::ClassResult
    Nested Relationships
    Struct Documentation
  Struct InferContext::Stat
    Nested Relationships
    Struct Documentation
  Class Error
    Class Documentation
  Class InferContext
    Nested Relationships
      Nested Types
    Inheritance Relationships
      Derived Types
    Class Documentation
  Class InferContext::Input
    Nested Relationships
    Class Documentation
  Class InferContext::Options
    Nested Relationships
    Class Documentation
  Class InferContext::Output
    Nested Relationships
    Class Documentation
  Class InferContext::Request
    Nested Relationships
    Class Documentation
  Class InferContext::RequestTimers
    Nested Relationships
    Class Documentation
  Class InferContext::Result
    Nested Relationships
      Nested Types
    Class Documentation
  Class InferGrpcContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class InferHttpContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class ProfileContext
    Inheritance Relationships
      Derived Types
    Class Documentation
  Class ProfileGrpcContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class ProfileHttpContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class ServerHealthContext
    Inheritance Relationships
      Derived Types
    Class Documentation
  Class ServerHealthGrpcContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class ServerHealthHttpContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class ServerStatusContext
    Inheritance Relationships
      Derived Types
    Class Documentation
  Class ServerStatusGrpcContext
    Inheritance Relationships
      Base Type
    Class Documentation
  Class ServerStatusHttpContext
    Inheritance Relationships
      Base Type
    Class Documentation
Functions
  Function CustomErrorString
    Function Documentation
  Function CustomExecute
    Function Documentation
  Function CustomFinalize
    Function Documentation
  Function CustomInitialize
    Function Documentation
  Function nvidia::inferenceserver::client::operator<<
    Function Documentation
Defines
  Define CUSTOM_NO_GPU_DEVICE
    Define Documentation
Typedefs
  Typedef CustomErrorStringFn_t
    Typedef Documentation
  Typedef CustomExecuteFn_t
    Typedef Documentation
  Typedef CustomFinalizeFn_t
    Typedef Documentation
  Typedef CustomGetNextInputFn_t
    Typedef Documentation
  Typedef CustomGetOutputFn_t
    Typedef Documentation
  Typedef CustomInitializeFn_t
    Typedef Documentation
  Typedef CustomPayload
    Typedef Documentation
Directories
  Directory src
    Subdirectories
  Directory clients
    Subdirectories
  Directory c++
    Files
  Directory servables
    Subdirectories
  Directory custom
    Files
Files
  File custom.h
    Definition (src/servables/custom/custom.h)
    Program Listing for File custom.h
    Includes
    Classes
    Functions
    Defines
    Typedefs
  File request.h
    Definition (src/clients/c++/request.h)
    Program Listing for File request.h
    Includes
    Namespaces
    Classes
    Functions