NVIDIA Holoscan SDK v3.0.0

Function holoscan::utils::transmit_data_per_model(gxf_context_t&, const HoloInfer::MultiMappings&, HoloInfer::DataMap&, OutputContext&, std::vector<std::string>&, HoloInfer::DimType&, bool, bool, const nvidia::gxf::Handle<nvidia::gxf::Allocator>&, const std::string&, const cudaStream_t&)

gxf_result_t holoscan::utils::transmit_data_per_model(gxf_context_t &cont, const HoloInfer::MultiMappings &model_to_tensor_map, HoloInfer::DataMap &input_data_map, OutputContext &op_output, std::vector<std::string> &out_tensors, HoloInfer::DimType &tensor_out_dims_map, bool cuda_buffer_in, bool cuda_buffer_out, const nvidia::gxf::Handle<nvidia::gxf::Allocator> &allocator_, const std::string &module, const cudaStream_t &cstream)

Transmits multiple buffers via GXF Transmitters.

Parameters
  • cont – GXF context used for transmission

  • model_to_tensor_map – Map with model name as key, mapped to a vector of tensor names

  • input_data_map – Map with tensor name as key, mapped to the data buffer as a vector

  • op_output – Output context. The output port is assumed to be named “transmitter”.

  • out_tensors – Output tensor names

  • data_per_model – Map updated with output tensor name as key, mapped to the data buffer

  • tensor_out_dims_map – Map updated with model name as key, mapped to the dimensions of the output tensor as a vector

  • cuda_buffer_in – Flag indicating whether the input buffers are stored in CUDA memory

  • cuda_buffer_out – Flag indicating whether the output message is stored in CUDA memory

  • allocator_ – GXF memory allocator

  • module – Module that requested the data transmission

  • cstream – The CUDA stream to use for asynchronous memory copies

Returns

GXF result code

© Copyright 2022-2025, NVIDIA. Last updated on Mar 12, 2025.