nemo_microservices.types.shared.task_prompt#
Module Contents#
Classes#
Data#
API#
- nemo_microservices.types.shared.task_prompt.Message: typing_extensions.TypeAlias#
None
- class nemo_microservices.types.shared.task_prompt.TaskPrompt(/, **data: typing.Any)#
Bases: nemo_microservices._models.BaseModel

- content: Optional[str]#
None
The content of the prompt, if it’s a string.
- max_length: Optional[int]#
None
The maximum length of the prompt in number of characters.
- max_tokens: Optional[int]#
None
The maximum number of tokens that can be generated in the chat completion.
- messages: Optional[List[nemo_microservices.types.shared.task_prompt.Message]]#
None
The list of messages included in the prompt. Used for chat models.
- mode: Optional[str]#
None
Corresponds to the prompting_mode for which this prompt is fetched. Default is 'standard'.
- models: Optional[List[str]]#
None
If specified, the prompt will be used only for the given LLM engines/models.
The format is a list of strings of the form: <engine> or <engine>/<model>.
- output_parser: Optional[str]#
None
The name of the output parser to use for this prompt.
- stop: Optional[List[str]]#
None
If specified, configures stop tokens for models that support this.
- task: str#
None
The ID of the task associated with this prompt.
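To illustrate how these fields fit together, the sketch below assembles the two typical prompt shapes as plain dictionaries with the same field names: a chat-style prompt using `messages`, and a completion-style prompt using `content`. This is a hedged sketch, not SDK output: in real code the same fields would be passed as keyword arguments to `TaskPrompt`, and the specific task names, model string, and parser name used here are hypothetical placeholder values.

```python
# Sketch of the TaskPrompt field layout using plain dicts.
# All task ids, model strings, and parser names below are hypothetical examples.

# Chat-style prompt: uses `messages` (for chat models) instead of `content`.
chat_prompt = {
    "task": "example_chat_task",        # required: id of the associated task (hypothetical)
    "messages": [                       # list of Message entries included in the prompt
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the previous turn."},
    ],
    "mode": "standard",                 # prompting mode; 'standard' is the default
    "max_tokens": 256,                  # cap on tokens generated in the chat completion
    "stop": ["\nUser:"],                # stop tokens, for models that support them
    "models": ["example-engine/example-model"],  # restrict prompt to these engines/models
}

# Completion-style prompt: uses a string `content` instead of `messages`.
completion_prompt = {
    "task": "example_completion_task",  # hypothetical task id
    "content": "Check whether the following input is safe:",
    "max_length": 2000,                 # cap on prompt length in characters
    "output_parser": "example_parser",  # hypothetical output parser name
}

print(sorted(chat_prompt.keys()))
print(sorted(completion_prompt.keys()))
```

Note that `content` and `messages` serve the same role for different model families, so a given prompt would normally set one or the other, not both.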