compass.llm.calling.StructuredLLMCaller
- class StructuredLLMCaller(llm_service, usage_tracker=None, **kwargs)[source]
Bases: BaseLLMCaller

Class to support structured (JSON) LLM calling functionality.

See also
LLMCaller
  Simple LLM caller, with no memory and no parsing utilities.
ChatLLMCaller
  Chat-like LLM calling functionality.
- Parameters:
  llm_service (Service) – LLM service used for queries.
  usage_tracker (UsageTracker, optional) – Optional tracker instance to monitor token usage during LLM calls. By default, None.
  **kwargs – Keyword arguments to be passed to the underlying service processing function (i.e. llm_service.call(**kwargs)). Should not contain the following keys:
    usage_sub_label
    messages
  These arguments are provided by this caller object.
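Below is a minimal construction sketch. The stub service class and the temperature keyword are illustrative assumptions only; in practice llm_service would be an already-configured Service instance from the library, and any extra keyword arguments are simply forwarded to llm_service.call(**kwargs).

    import json

    from compass.llm.calling import StructuredLLMCaller


    class _StubService:
        """Hypothetical stand-in for a real Service implementation."""

        async def call(self, *args, **kwargs):
            # Echo a fixed JSON payload so the sketch runs without a live LLM backend.
            return json.dumps({"example_feature": 42})


    caller = StructuredLLMCaller(
        _StubService(),      # llm_service: Service used for queries
        usage_tracker=None,  # or a UsageTracker instance to record token usage
        temperature=0,       # example kwarg forwarded to llm_service.call(**kwargs)
    )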
Methods
call(sys_msg, content[, usage_sub_label]) – Call LLM for structured data retrieval.
- async call(sys_msg, content, usage_sub_label=LLMUsageCategory.DEFAULT)[source]
Call LLM for structured data retrieval.
- Parameters:
  sys_msg (str) – The LLM system message. If this text does not contain the instruction text “Return your answer as a dictionary in JSON format”, it will be added.
  content (str) – LLM call content (typically some text to extract info from).
  usage_sub_label (str, optional) – Label to store token usage under. By default, "default".
- Returns:
  dict – Dictionary containing the LLM-extracted features. The dictionary may be empty if there was an error during the LLM call.
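A hedged usage sketch for this coroutine, assuming a caller constructed as in the example above; the system message, content, and usage label shown here are illustrative placeholders, not values prescribed by the library.

    import asyncio


    async def extract_features(caller, text):
        # The JSON-format instruction is appended to sys_msg automatically if missing.
        features = await caller.call(
            sys_msg="Extract the requested fields from the text below.",
            content=text,
            usage_sub_label="example_extraction",  # illustrative label; defaults to "default"
        )
        if not features:
            # An empty dict indicates an error during the LLM call.
            print("No features extracted")
        return features


    # Example invocation (requires a configured caller, e.g. from the sketch above):
    # asyncio.run(extract_features(caller, "Some document text to extract info from ..."))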