nemo_microservices.types.llm_call_info#
Module Contents#
Classes#
API#
- class nemo_microservices.types.llm_call_info.LlmCallInfo(/, **data: typing.Any)#
Bases: nemo_microservices._models.BaseModel
- completion: Optional[str]#
None
The completion generated by the LLM.
- completion_tokens: Optional[int]#
None
The number of output tokens.
- duration: Optional[float]#
None
The duration in seconds.
- finished_at: Optional[float]#
None
The timestamp for when the LLM call finished.
- id: Optional[str]#
None
The unique prompt identifier.
- llm_model_name: Optional[str]#
None
The name of the model used for the LLM call.
- prompt: Optional[str]#
None
The prompt that was used for the LLM call.
- prompt_tokens: Optional[int]#
None
The number of input tokens.
- raw_response: Optional[Dict[str, object]]#
None
The raw response received from the LLM.
May contain additional information, e.g. logprobs.
- started_at: Optional[float]#
None
The timestamp for when the LLM call started.
- task: Optional[str]#
None
The internal task that made the call.
- total_tokens: Optional[int]#
None
The total number of tokens used (prompt plus completion).