morpheus.messages.memory.inference_memory.InferenceMemoryNLP
- class InferenceMemoryNLP(**kwargs)
Bases: InferenceMemory
This is a container class for data that needs to be submitted to the inference server for NLP category use cases.
- Parameters:
- input_ids : NDArrayType
The token-ids for each string, padded with 0s to max_length.
- input_mask : NDArrayType
The mask for the token-ids, where corresponding positions identify valid token-id values.
- seq_ids : NDArrayType
Ids used to index from an inference input to a message. Necessary since there can be more inference inputs than messages (i.e., if some messages get broken into multiple inference requests).
- Attributes:
- input_ids
- input_mask
- seq_ids
- tensor_names
Methods
- get_input(name) : Get the tensor stored in the container identified by name.
- get_tensor(name) : Get the Tensor stored in the container identified by name.
- get_tensors() : Get the tensors contained by this instance.
- has_tensor(name) : Returns True if a tensor with the requested name exists in the tensors object.
- set_input(name, tensor) : Update the input tensor identified by name.
- set_tensor(name, tensor) : Update the tensor identified by name.
- set_tensors(tensors) : Overwrite the tensors stored by this instance.
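For orientation, the following is a minimal construction sketch. It assumes the `count` keyword inherited from the base tensor-memory class and uses CuPy arrays; the array contents, shapes, and the (message index, start, stop) layout shown for `seq_ids` are illustrative only, not prescribed by this class.

```python
import cupy as cp

from morpheus.messages.memory.inference_memory import InferenceMemoryNLP

count = 2       # number of inference rows in this batch (assumed inherited kwarg)
max_length = 8  # padded token length used below

# Token ids for each string, padded with 0s to max_length.
input_ids = cp.asarray(
    [[101, 2023, 2003, 1037, 3231, 102, 0, 0],
     [101, 2178, 6251, 102, 0, 0, 0, 0]],
    dtype=cp.int32)

# 1 where input_ids holds a valid token id, 0 over the padding.
input_mask = (input_ids != 0).astype(cp.int32)

# Maps each inference row back to its originating message; layout illustrative.
seq_ids = cp.asarray([[0, 0, max_length],
                      [1, 0, max_length]], dtype=cp.int32)

memory = InferenceMemoryNLP(count=count,
                            input_ids=input_ids,
                            input_mask=input_mask,
                            seq_ids=seq_ids)
```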
- get_input(name)
Get the tensor stored in the container identified by name. Alias for InferenceMemory.get_tensor.
- Parameters:
- name : str
Key used to do lookup in inputs dict of the container.
- Returns:
- NDArrayType
Inputs corresponding to name.
- Raises:
- KeyError
If input name does not exist in the container.
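A brief usage sketch, assuming the `memory` object built in the construction example above:

```python
# Fetch a stored input by name; equivalent to memory.get_tensor("input_ids").
ids = memory.get_input("input_ids")

# A missing key raises KeyError, as documented above.
try:
    memory.get_input("not_a_tensor")
except KeyError:
    pass  # handle the absent input here
```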
- get_tensor(name)
Get the Tensor stored in the container identified by name.
- Parameters:
- name : str
Tensor key name.
- Returns:
- NDArrayType
Tensor.
- Raises:
- KeyError
If tensor name does not exist in the container.
- get_tensors()
Get the tensors contained by this instance. Note that when C++ execution is enabled, the returned tensors are a Python copy of the tensors stored in the C++ object; any changes made to them must therefore be written back with a call to set_tensors.
- Returns:
- TensorMapType
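To illustrate the copy semantics noted above, a hedged round-trip sketch (again assuming the `memory` object from the construction example): mutate the returned copy, then write it back.

```python
import cupy as cp

# With C++ execution enabled, this is a Python copy of the stored tensors.
tensors = memory.get_tensors()

# Changes here affect only the local copy...
tensors["input_mask"] = cp.zeros_like(tensors["input_mask"])

# ...so push the modified map back into the container.
memory.set_tensors(tensors)
```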
- has_tensor(name)
Returns True if a tensor with the requested name exists in the tensors object.
- Parameters:
- name : str
Name to look up.
- Returns:
- bool
True if the tensor was found.
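A small guard sketch, assuming the same `memory` object as above:

```python
# Check for presence before reading to avoid a KeyError.
if memory.has_tensor("input_mask"):
    mask = memory.get_tensor("input_mask")
else:
    mask = None  # tensor not present in this container
```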
- set_input(name, tensor)
Update the input tensor identified by name. Alias for InferenceMemory.set_tensor.
- Parameters:
- name : str
Key used to do lookup in inputs dict of the container.
- tensor : NDArrayType
Tensor as either a CuPy or NumPy array.
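A short update sketch, assuming the `memory` object from the construction example; per the parameter description, either a CuPy or a NumPy array may be passed.

```python
import numpy as np

# Replace the stored mask; the shape mirrors the illustrative (2, 8) batch above.
memory.set_input("input_mask", np.ones((2, 8), dtype=np.int32))
```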