nat.llm.prediction_context#

Runtime context management for prediction trie lookups.

Provides tracking of LLM call indices per function invocation, enabling accurate lookups in the prediction trie at runtime.

Attributes#

_llm_call_tracker

Classes#

LLMCallTracker

Tracks LLM call counts per function invocation.

Functions#

get_call_tracker(→ LLMCallTracker)

Get the LLMCallTracker for the current context.

Module Contents#

class LLMCallTracker#

Tracks LLM call counts per function invocation.

counts: dict[str, int]#
increment(parent_function_id: str) → int#

Increment and return the call index for this parent.

Args:

parent_function_id: Unique ID of the parent function invocation

Returns:

The call index (1-indexed) for this LLM call within the parent

reset(parent_function_id: str) → None#

Reset call count when a function invocation completes.

Args:

parent_function_id: Unique ID of the parent function invocation
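The implementation is not shown on this page; a minimal sketch consistent with the documented signatures (a `counts` dict, 1-indexed `increment`, and `reset` when the parent invocation completes) might look like:

```python
from dataclasses import dataclass, field


@dataclass
class LLMCallTracker:
    """Tracks LLM call counts per function invocation."""

    counts: dict[str, int] = field(default_factory=dict)

    def increment(self, parent_function_id: str) -> int:
        # The first call under a given parent returns 1 (indices are 1-based).
        self.counts[parent_function_id] = self.counts.get(parent_function_id, 0) + 1
        return self.counts[parent_function_id]

    def reset(self, parent_function_id: str) -> None:
        # Drop the counter once the parent function invocation completes,
        # so a later invocation with the same ID starts again at 1.
        self.counts.pop(parent_function_id, None)


tracker = LLMCallTracker()
first = tracker.increment("fn-abc")    # 1
second = tracker.increment("fn-abc")   # 2
tracker.reset("fn-abc")
after_reset = tracker.increment("fn-abc")  # 1 again
```

Keeping one counter per `parent_function_id` is what lets the prediction trie distinguish the first, second, and later LLM calls made inside the same function invocation.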

_llm_call_tracker: contextvars.ContextVar[LLMCallTracker]#
get_call_tracker() → LLMCallTracker#

Get the LLMCallTracker for the current context.

Creates a new tracker if one doesn’t exist in the current context.

Returns:

The LLMCallTracker for this context
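A plausible sketch of `get_call_tracker`, assuming the `_llm_call_tracker` context variable documented above and the create-if-missing behavior described here (the actual module may differ in detail):

```python
import contextvars


class LLMCallTracker:
    """Minimal stand-in for the tracker class documented above."""

    def __init__(self) -> None:
        self.counts: dict[str, int] = {}

    def increment(self, parent_function_id: str) -> int:
        self.counts[parent_function_id] = self.counts.get(parent_function_id, 0) + 1
        return self.counts[parent_function_id]


_llm_call_tracker: contextvars.ContextVar[LLMCallTracker] = contextvars.ContextVar(
    "_llm_call_tracker"
)


def get_call_tracker() -> LLMCallTracker:
    """Get the LLMCallTracker for the current context.

    Creates a new tracker if one doesn't exist in the current context.
    """
    try:
        return _llm_call_tracker.get()
    except LookupError:
        # No tracker in this context yet: create one and stash it.
        tracker = LLMCallTracker()
        _llm_call_tracker.set(tracker)
        return tracker


# Within one context, repeated calls return the same tracker instance...
outer = get_call_tracker()
same = get_call_tracker()

# ...while a fresh, empty context gets its own tracker, so call indices
# from concurrent tasks or requests do not interfere with each other.
fresh = contextvars.Context().run(get_call_tracker)
```

Backing the tracker with a `contextvars.ContextVar` (rather than a module-level global) is what keeps call indices isolated per async task or thread context.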