nat.llm.prediction_context#
Runtime context management for prediction trie lookups.
Provides tracking of LLM call indices per function invocation, enabling accurate lookups in the prediction trie at runtime.
Attributes#
- _llm_call_tracker
Classes#
- LLMCallTracker: Tracks LLM call counts per function invocation.
Functions#
- get_call_tracker(): Get the LLMCallTracker for the current context.
Module Contents#
- class LLMCallTracker#
Tracks LLM call counts per function invocation.
- _llm_call_tracker: contextvars.ContextVar[LLMCallTracker]#
- get_call_tracker() → LLMCallTracker#
Get the LLMCallTracker for the current context.
Creates a new tracker if one doesn’t exist in the current context.
- Returns:
The LLMCallTracker for this context