aiq.profiler.callbacks.llama_index_callback_handler#

Attributes#

Classes#

LlamaIndexProfilerHandler

A callback handler for LlamaIndex that tracks usage stats similarly to NIMCallbackHandler.

Module Contents#

logger#
class LlamaIndexProfilerHandler#

Bases: llama_index.core.callbacks.base_handler.BaseCallbackHandler, aiq.profiler.callbacks.base_callback_class.BaseProfilerCallback

A callback handler for LlamaIndex that tracks usage stats similarly to NIMCallbackHandler. Collects:

  • Prompts

  • Token usage

  • Response data

  • Time intervals between calls

and appends them to AIQContextState.usage_stats.

Initialize the base callback handler.

_lock#
last_call_ts#
_last_tool_map: dict[str, str]#
step_manager#
_run_id_to_llm_input#
_run_id_to_tool_input#
_run_id_to_timestamp#
on_event_start(
event_type: llama_index.core.callbacks.CBEventType,
payload: dict[str, Any] | None = None,
event_id: str = '',
parent_id: str = '',
**kwargs: Any,
) → str#

Called at the start of a LlamaIndex “event” (LLM call, Embedding, etc.). We capture the prompts or query strings here, if any.

on_event_end(
event_type: llama_index.core.callbacks.CBEventType,
payload: dict[str, Any] | None = None,
event_id: str = '',
**kwargs: Any,
) → None#

Called at the end of a LlamaIndex “event”. We collect token usage (if available) and the returned response text.

start_trace(trace_id: str | None = None) → None#

Run when an overall trace is launched.

end_trace(
trace_id: str | None = None,
trace_map: dict[str, list[str]] | None = None,
) → None#

Run when an overall trace is exited.