elm.ords.llm.calling.StructuredLLMCaller

class StructuredLLMCaller(llm_service, usage_tracker=None, **kwargs)[source]

Bases: BaseLLMCaller

Class to support structured (JSON) LLM calling functionality.

Parameters:
  • llm_service (elm.ords.services.base.Service) – LLM service used for queries.

  • usage_tracker (elm.ords.services.usage.UsageTracker, optional) – Optional tracker instance to monitor token usage during LLM calls. By default, None.

  • **kwargs – Keyword arguments to be passed to the underlying service processing function (i.e. llm_service.call(**kwargs)). Should not contain the following keys:

    • usage_tracker

    • usage_sub_label

    • messages

    These arguments are provided by this caller object (see the usage sketch after this parameter list).
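For example, a minimal construction sketch. The service setup is omitted, and the extra keyword arguments shown (model, temperature) are illustrative assumptions about what the underlying service accepts, not part of this class's API:

from elm.ords.llm.calling import StructuredLLMCaller

# Placeholder: assumed to be an already-initialized instance of
# ``elm.ords.services.base.Service``; its setup is omitted here.
llm_service = ...

caller = StructuredLLMCaller(
    llm_service,
    usage_tracker=None,  # or an ``elm.ords.services.usage.UsageTracker`` instance
    # Remaining kwargs are forwarded to ``llm_service.call()``; the names
    # ``model`` and ``temperature`` below are illustrative assumptions.
    model="gpt-4",
    temperature=0,
)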

Methods

call(sys_msg, content[, usage_sub_label])

Call LLM for structured data retrieval.

async call(sys_msg, content, usage_sub_label='default')[source]

Call LLM for structured data retrieval.

Parameters:
  • sys_msg (str) – The LLM system message. If this text does not contain the instruction text "Return your answer in JSON format", that instruction is appended automatically.

  • content (str) – LLM call content (typically some text to extract info from).

  • usage_sub_label (str, optional) – Label to store token usage under. By default, "default".

Returns:

dict – Dictionary containing the LLM-extracted features. The dictionary may be empty if an error occurred during the LLM call.
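
A short usage sketch of the async call method, assuming the caller constructed in the example above; the system message, content, and sub-label values are illustrative:

import asyncio

async def extract_features(caller, text):
    # The JSON-format instruction is appended to ``sys_msg`` automatically
    # if it is not already present.
    sys_msg = "Extract the ordinance setback distance and units."
    features = await caller.call(
        sys_msg=sys_msg,
        content=text,
        usage_sub_label="setback_extraction",  # illustrative label
    )
    # ``features`` is a dict; it may be empty if the LLM call failed.
    return features

# Example invocation (assuming ``caller`` from the construction sketch above):
# asyncio.run(extract_features(caller, "The ordinance text to parse..."))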