TaskPrompt
- class nemo_microservices.types.shared.TaskPrompt(*args: Any, **kwargs: Any)
Bases: BaseModel
- task: str
The id of the task associated with this prompt.
- content: str | None = None
The content of the prompt, if it’s a string.
- max_length: int | None = None
The maximum length of the prompt in number of characters.
- max_tokens: int | None = None
The maximum number of tokens that can be generated in the chat completion.
- messages: List[MessageTemplate | str] | None = None
The list of messages included in the prompt. Used for chat models.
- mode: str | None = None
Corresponds to the prompting_mode for which this prompt is fetched.
Default is ‘standard’.
- models: List[str] | None = None
If specified, the prompt is used only for the listed LLM engines/models.
Each entry is a string of the form <engine> or <engine>/<model>.
- output_parser: str | None = None
The name of the output parser to use for this prompt.
- stop: List[str] | None = None
If specified, configures stop tokens for models that support them.
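
A minimal sketch of constructing a TaskPrompt from the fields documented above. The task id, model identifier, and message content are hypothetical and shown only to illustrate the field formats; the exact construction idioms for this SDK may differ.

```python
from nemo_microservices.types.shared import TaskPrompt

# Define a chat-style prompt restricted to a single model (assumed example values).
prompt = TaskPrompt(
    task="generate_user_intent",        # hypothetical task id
    messages=[
        "You are a helpful assistant.",  # plain-string message entry
    ],
    mode="standard",                     # default prompting_mode
    models=["openai/gpt-4o"],            # <engine>/<model> format (example value)
    max_tokens=256,
    stop=["\nUser:"],                    # stop tokens, for models that support them
)

print(prompt.task, prompt.max_tokens)
```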