nat.data_models.intermediate_step#

Classes#

IntermediateStepCategory

String-valued enum of the high-level categories an intermediate step can belong to.

IntermediateStepType

String-valued enum of the event types emitted for intermediate steps.

IntermediateStepState

String-valued enum of the lifecycle states of an intermediate step.

StreamEventData

StreamEventData is a data model that represents the data field in a streaming event.

UsageInfo

ToolParameters

ToolDetails

ToolSchema

TraceMetadata

IntermediateStepPayload

IntermediateStep is a data model that represents an intermediate step in the NAT. Intermediate steps are captured while a request is running and can be used to show progress or to evaluate the path a workflow took to get a response.

IntermediateStep

IntermediateStep is a data model that represents an intermediate step in the NAT. Intermediate steps are captured while a request is running and can be used to show progress or to evaluate the path a workflow took to get a response.

Module Contents#

class IntermediateStepCategory#

Bases: str, enum.Enum

A string-valued enum of the high-level categories of an intermediate step: LLM, tool, workflow, task, function, custom, or span. Because members subclass str, they compare equal to their raw string values.

LLM = 'LLM'#
TOOL = 'TOOL'#
WORKFLOW = 'WORKFLOW'#
TASK = 'TASK'#
FUNCTION = 'FUNCTION'#
CUSTOM = 'CUSTOM'#
SPAN = 'SPAN'#
class IntermediateStepType#

Bases: str, enum.Enum

A string-valued enum of the event types emitted for intermediate steps. Each category emits paired *_START/*_END events; LLM and span steps additionally emit streaming events (LLM_NEW_TOKEN, SPAN_CHUNK).

LLM_START = 'LLM_START'#
LLM_END = 'LLM_END'#
LLM_NEW_TOKEN = 'LLM_NEW_TOKEN'#
TOOL_START = 'TOOL_START'#
TOOL_END = 'TOOL_END'#
WORKFLOW_START = 'WORKFLOW_START'#
WORKFLOW_END = 'WORKFLOW_END'#
TASK_START = 'TASK_START'#
TASK_END = 'TASK_END'#
FUNCTION_START = 'FUNCTION_START'#
FUNCTION_END = 'FUNCTION_END'#
CUSTOM_START = 'CUSTOM_START'#
CUSTOM_END = 'CUSTOM_END'#
SPAN_START = 'SPAN_START'#
SPAN_CHUNK = 'SPAN_CHUNK'#
SPAN_END = 'SPAN_END'#
class IntermediateStepState#

Bases: str, enum.Enum

A string-valued enum of the lifecycle states of an intermediate step: START, CHUNK, and END.

START = 'START'#
CHUNK = 'CHUNK'#
END = 'END'#
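Because all three enums subclass str, members interoperate directly with plain strings (useful when matching events coming off a wire format). A minimal stdlib sketch mirroring IntermediateStepState — a stand-in for illustration, not the library class itself:

```python
from enum import Enum

# Stand-in sketch of a str-valued enum, mirroring IntermediateStepState.
class IntermediateStepState(str, Enum):
    START = "START"
    CHUNK = "CHUNK"
    END = "END"

# str subclassing means members compare equal to their raw string values.
state = IntermediateStepState("START")
print(state == "START")   # True
print(state.value)        # START
```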
class StreamEventData(/, **data: Any)#

Bases: pydantic.BaseModel

StreamEventData is a data model that represents the data field in a streaming event.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

model_config#

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

input: Any | None = None#
output: Any | None = None#
chunk: Any | None = None#
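The three optional fields map to the phases of a streaming event: input is populated at START, chunk carries incremental CHUNK data, and output holds the final result at END. A dataclass stand-in (the real class is a pydantic BaseModel) illustrating that intended usage:

```python
from __future__ import annotations

from dataclasses import dataclass
from typing import Any

# Stand-in sketch of StreamEventData; the real class is a pydantic model.
@dataclass
class StreamEventData:
    input: Any | None = None
    output: Any | None = None
    chunk: Any | None = None

# A START event typically carries only the input ...
start_data = StreamEventData(input={"prompt": "Hello"})
# ... while the END event also carries the final output.
end_data = StreamEventData(input={"prompt": "Hello"}, output="Hi there!")
print(end_data.output)  # Hi there!
```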
class UsageInfo(/, **data: Any)#

Bases: pydantic.BaseModel

token_usage: nat.profiler.callbacks.token_usage_base_model.TokenUsageBaseModel#
num_llm_calls: int = 0#
seconds_between_calls: int = 0#
class ToolParameters(/, **data: Any)#

Bases: pydantic.BaseModel

properties: dict[str, Any] = None#
required: list[str] = None#
type_: Literal['object'] = None#
additionalProperties: bool = None#
strict: bool = None#
class ToolDetails(/, **data: Any)#

Bases: pydantic.BaseModel

name: str = None#
description: str = None#
parameters: ToolParameters = None#
class ToolSchema(/, **data: Any)#

Bases: pydantic.BaseModel

type: Literal['function'] = None#
function: ToolDetails = None#
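The ToolSchema / ToolDetails / ToolParameters trio follows the common JSON function-tool layout: a type: 'function' wrapper around a name, a description, and a JSON-Schema-style parameters object. A plain-dict sketch of the shape these models describe — field names are taken from the definitions above, while the weather tool itself is a hypothetical example (and `type_` is presumably serialized under an alias such as `type`, though that is an assumption):

```python
# Hypothetical example payload matching the ToolSchema field layout above.
tool_schema = {
    "type": "function",
    "function": {
        "name": "get_weather",                  # ToolDetails.name
        "description": "Look up the weather.",  # ToolDetails.description
        "parameters": {                         # ToolParameters
            "type_": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
            "additionalProperties": False,
            "strict": True,
        },
    },
}
print(tool_schema["function"]["name"])  # get_weather
```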
class TraceMetadata(/, **data: Any)#

Bases: pydantic.BaseModel

chat_responses: Any | None = None#
chat_inputs: Any | None = None#
tool_inputs: Any | None = None#
tool_outputs: Any | None = None#
tool_info: Any | None = None#
span_inputs: Any | None = None#
span_outputs: Any | None = None#
provided_metadata: Any | None = None#
tools_schema: list[ToolSchema] = None#
model_config#

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

class IntermediateStepPayload(/, **data: Any)#

Bases: pydantic.BaseModel

IntermediateStep is a data model that represents an intermediate step in the NAT. Intermediate steps are captured while a request is running and can be used to show progress or to evaluate the path a workflow took to get a response.

model_config#

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

event_type: IntermediateStepType#
event_timestamp: float = None#
span_event_timestamp: float | None = None#
framework: nat.builder.framework_enum.LLMFrameworkEnum | None = None#
name: str | None = None#
tags: list[str] | None = None#
metadata: dict[str, Any] | TraceMetadata | None = None#
data: StreamEventData | None = None#
usage_info: UsageInfo | None = None#
UUID: str = None#
property event_category: IntermediateStepCategory#
property event_state: IntermediateStepState#
check_span_event_timestamp() → IntermediateStepPayload#
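The event_category and event_state properties are derived rather than stored: every IntermediateStepType name combines a category prefix with a state suffix (e.g. LLM_START → category LLM, state START, with LLM_NEW_TOKEN and SPAN_CHUNK corresponding to the CHUNK state). A stand-alone sketch of that mapping, as an illustration of the relationship between the three enums — not NAT's actual implementation:

```python
# Hypothetical derivation of (category, state) from an event-type string,
# based on the enum members documented above; not the library's actual code.
CHUNK_TYPES = {"LLM_NEW_TOKEN", "SPAN_CHUNK"}

def derive(event_type: str) -> tuple[str, str]:
    if event_type in CHUNK_TYPES:
        # Streaming events map to the CHUNK state.
        return event_type.split("_", 1)[0], "CHUNK"
    # Otherwise the last underscore separates category from state.
    category, state = event_type.rsplit("_", 1)
    return category, state

print(derive("LLM_START"))      # ('LLM', 'START')
print(derive("LLM_NEW_TOKEN"))  # ('LLM', 'CHUNK')
print(derive("TOOL_END"))       # ('TOOL', 'END')
```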
class IntermediateStep(/, **data: Any)#

Bases: pydantic.BaseModel

IntermediateStep is a data model that represents an intermediate step in the NAT. Intermediate steps are captured while a request is running and can be used to show progress or to evaluate the path a workflow took to get a response.

model_config#

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

parent_id: str#

The parent step ID for the current step. The parent ID is the ID of the last START step which has a different UUID than the current step. This value is different from the function_ancestry.parent_id value which tracks the last parent FUNCTION step. For the first START step, the parent_id is ‘root’.

function_ancestry: nat.data_models.invocation_node.InvocationNode#

The function ancestry for the current step showing the current NAT function that was being executed when the step was created.

payload: IntermediateStepPayload#

The payload for the current step.

property event_type: IntermediateStepType#
property event_timestamp: float#
property span_event_timestamp: float | None#
property framework: nat.builder.framework_enum.LLMFrameworkEnum | None#
property name: str | None#
property tags: list[str] | None#
property metadata: dict[str, Any] | TraceMetadata | None#
property data: StreamEventData | None#
property usage_info: UsageInfo | None#
property UUID: str#
property event_category: IntermediateStepCategory#
property event_state: IntermediateStepState#
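IntermediateStep is thus a thin wrapper: it adds parent_id and function_ancestry, and re-exposes every payload field as a read-only property. A dataclass sketch of that delegation pattern — stand-in classes with a reduced field set, not the pydantic originals:

```python
from __future__ import annotations

from dataclasses import dataclass

# Stand-in sketches; the real classes are pydantic models with more fields.
@dataclass
class IntermediateStepPayload:
    event_type: str
    name: str | None = None

@dataclass
class IntermediateStep:
    parent_id: str
    payload: IntermediateStepPayload

    # Read-only properties simply delegate to the wrapped payload.
    @property
    def event_type(self) -> str:
        return self.payload.event_type

    @property
    def name(self) -> str | None:
        return self.payload.name

step = IntermediateStep(
    parent_id="root",  # the first START step uses 'root' per the docs above
    payload=IntermediateStepPayload(event_type="TOOL_START", name="search"),
)
print(step.event_type)  # TOOL_START
```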