Class of the nvinferserver element implementation.
Definition at line 142 of file gstnvinferserver_impl.h.
◆ GstNvInferServerImpl()
gstnvinferserver::GstNvInferServerImpl::GstNvInferServerImpl (GstNvInferServer * infer)
Constructor, registers the handle of the parent GStreamer element.
- Parameters
  - [in] infer: Pointer to the nvinferserver GStreamer element.
◆ ~GstNvInferServerImpl()
gstnvinferserver::GstNvInferServerImpl::~GstNvInferServerImpl ()
◆ addTrackingSource()
bool gstnvinferserver::GstNvInferServerImpl::addTrackingSource (uint32_t sourceId)
Add a new source to the object history structure.
Whenever a new source is added to the pipeline, the corresponding source ID is captured in the GStreamer event on the sink pad, and the object history for this source is initialized.
- Parameters
  - [in] sourceId: ID of the source to be added.
◆ canSupportGpu()
bool gstnvinferserver::GstNvInferServerImpl::canSupportGpu (int gpuId) const
◆ classifierType()
const std::string& gstnvinferserver::GstNvInferServerImpl::classifierType () const [inline]
◆ config()
const ic::PluginControl& gstnvinferserver::GstNvInferServerImpl::config () const [inline]
◆ eraseTrackingSource()
void gstnvinferserver::GstNvInferServerImpl::eraseTrackingSource (uint32_t sourceId)
Removes a source from the object history structure.
Whenever a source is removed from the pipeline, the corresponding source ID is captured in the GStreamer event on the sink pad, and the object history for this source is deleted.
- Parameters
  - [in] sourceId: ID of the source to be removed.
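The add/erase pair above amounts to per-source bookkeeping keyed by source ID. A minimal sketch of that pattern, with a hypothetical `TrackingStore` and `SourceHistory` standing in for the plugin's real internal structures (which are not shown in this reference):

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hypothetical stand-in for the per-source object history kept by the
// plugin; the real structure lives in the DeepStream sources.
struct SourceHistory {
    std::vector<uint64_t> objectIds;  // objects observed on this source
};

class TrackingStore {
public:
    // Mirrors addTrackingSource(): create an empty history entry when a
    // new source ID appears on the sink pad; reject duplicates.
    bool addTrackingSource(uint32_t sourceId) {
        return histories_.emplace(sourceId, SourceHistory{}).second;
    }
    // Mirrors eraseTrackingSource(): drop the history when the source is
    // removed from the pipeline.
    void eraseTrackingSource(uint32_t sourceId) {
        histories_.erase(sourceId);
    }
    bool hasSource(uint32_t sourceId) const {
        return histories_.count(sourceId) != 0;
    }
private:
    std::unordered_map<uint32_t, SourceHistory> histories_;
};
```

This is only an illustration of the bookkeeping, not the plugin's actual implementation.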
◆ isAsyncMode()
bool gstnvinferserver::GstNvInferServerImpl::isAsyncMode () const
◆ lastError()
NvDsInferStatus gstnvinferserver::GstNvInferServerImpl::lastError () const
◆ maxBatchSize()
uint32_t gstnvinferserver::GstNvInferServerImpl::maxBatchSize () const
◆ nvtxDomain()
nvtxDomainHandle_t gstnvinferserver::GstNvInferServerImpl::nvtxDomain () [inline]
◆ processBatchMeta()
Submits the input batch for inference.
This function submits the input batch buffer for inference according to the configured processing mode: full-frame inference, inference on detected objects, or inference on attached input tensors.
- Parameters
  - [in,out] batchMeta: NvDsBatchMeta associated with the input buffer.
  - [in] inSurf: Input batch buffer.
  - [in] seqId: The sequence number of the input batch.
  - [in] gstBuf: Pointer to the input GStreamer buffer.
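The three processing modes named above imply a dispatch on the configured mode before inference. A hypothetical sketch of that branch (the enum and function names are illustrative, not the plugin's real API):

```cpp
#include <string>

// Illustrative stand-in for the nvinferserver processing modes; the real
// plugin derives the mode from its configuration file.
enum class ProcessMode { FullFrame, DetectedObjects, InputTensors };

// Sketch of the dispatch processBatchMeta() performs: each mode routes
// the batch down a different inference path.
std::string describeDispatch(ProcessMode mode) {
    switch (mode) {
        case ProcessMode::FullFrame:       return "infer on full frames";
        case ProcessMode::DetectedObjects: return "infer on detected objects";
        case ProcessMode::InputTensors:    return "infer on attached tensors";
    }
    return "unknown";
}
```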
◆ queueOperation()
NvDsInferStatus gstnvinferserver::GstNvInferServerImpl::queueOperation (FuncItem func)
Queues the inference-done operation for the request to the output thread.
◆ resetIntervalCounter()
void gstnvinferserver::GstNvInferServerImpl::resetIntervalCounter ()
Resets the inference interval counter used in frame processing mode to 0.
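Together with updateInterval(), this counter implements interval-based frame skipping: in frame processing mode, one frame out of every (interval + 1) is inferred. A small sketch under that assumption (names are illustrative):

```cpp
#include <cstdint>

// Sketch of the interval logic that updateInterval() and
// resetIntervalCounter() control: with interval = N, infer on one frame
// and skip the next N.
class IntervalCounter {
public:
    void updateInterval(uint32_t interval) { interval_ = interval; }
    void resetIntervalCounter() { counter_ = 0; }
    // Returns true when the current frame should be inferred.
    bool shouldInfer() {
        bool infer = (counter_ == 0);
        counter_ = (counter_ + 1) % (interval_ + 1);
        return infer;
    }
private:
    uint32_t interval_ = 0;  // frames to skip between inferences
    uint32_t counter_ = 0;   // position within the current interval
};
```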
◆ setRawoutputCb()
Saves the callback function pointer for the raw tensor output.
- Parameters
  - [in] cb: Pointer to the callback function.
Definition at line 264 of file gstnvinferserver_impl.h.
◆ start()
Reads the configuration file and sets up processing context.
This function reads the configuration file and validates the user-provided configuration. Configuration file settings are overridden with those set by element properties. It then creates the inference context, initializes it, and starts the output thread. The inference context is either InferGrpcContext (Triton Inference Server in gRPC mode) or InferTrtISContext (Triton Inference Server C-API mode), depending on the configuration setting. Object history is initialized for source 0.
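The precedence described above (element properties override configuration-file settings) can be sketched as a simple merge; the key names below are hypothetical, not actual nvinferserver settings:

```cpp
#include <map>
#include <string>

// Settings as string key/value pairs, purely for illustration.
using Settings = std::map<std::string, std::string>;

// Sketch of the precedence start() applies: begin with the values parsed
// from the configuration file, then let GStreamer element properties win
// for any key set on the element.
Settings mergeConfig(const Settings& fromFile, const Settings& fromProps) {
    Settings merged = fromFile;
    for (const auto& kv : fromProps)
        merged[kv.first] = kv.second;  // element property overrides file
    return merged;
}
```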
◆ stop()
Deletes the inference context.
This function waits for the output thread to finish and then de-initializes the inference context. The object history is cleared.
◆ sync()
Waits for the output thread to finish processing queued operations.
◆ uniqueId()
uint32_t gstnvinferserver::GstNvInferServerImpl::uniqueId () const [inline]
◆ updateInterval()
void gstnvinferserver::GstNvInferServerImpl::updateInterval (guint interval) [inline]
◆ m_GstProperties
The documentation for this class was generated from the following file: gstnvinferserver_impl.h