elm.ords.llm.calling.BaseLLMCaller
- class BaseLLMCaller(llm_service, usage_tracker=None, **kwargs)[source]
Bases: object
Class to support LLM calling functionality
- Purpose:
Helper classes to call LLMs.
- Responsibilities:
  - Use a service (e.g. OpenAIService) to query an LLM.
  - Maintain a useful context to simplify LLM queries. Typically these classes are initialized with a single LLM model (and optionally a usage tracker). This context is passed to every Service.call invocation, allowing the user to focus on only the message.
  - Track message history (ChatLLMCaller) or convert output into JSON (StructuredLLMCaller).
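The context pattern described above can be sketched as follows. This is an illustrative stand-in, not elm's actual implementation: `StubService` and `CallerSketch` are hypothetical names, and the real `Service.call` signature may differ.

```python
import asyncio


class StubService:
    """Pretend LLM service that echoes the last user message.

    Stands in for an elm Service subclass such as OpenAIService.
    """

    async def call(self, messages, temperature=0.0, **kwargs):
        return f"echo: {messages[-1]['content']} (temperature={temperature})"


class CallerSketch:
    """Mimics the BaseLLMCaller pattern: fixed service plus kwargs context."""

    def __init__(self, llm_service, **kwargs):
        self.llm_service = llm_service
        self.kwargs = kwargs  # context reused on every call

    async def call(self, content):
        # The user supplies only the message; the stored context
        # fills in everything else on each Service.call invocation.
        messages = [{"role": "user", "content": content}]
        return await self.llm_service.call(messages, **self.kwargs)


caller = CallerSketch(StubService(), temperature=0.7)
print(asyncio.run(caller.call("hello")))
```

Because the service and keyword context are fixed at construction time, every subsequent query reduces to passing a single message string.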
- Key Relationships:
  Delegates most of the work to the underlying Service class.
- Parameters:
  - llm_service (elm.ords.services.base.Service) – LLM service used for queries.
  - usage_tracker (elm.ords.services.usage.UsageTracker, optional) – Optional tracker instance to monitor token usage during LLM calls. By default, None.
  - **kwargs – Keyword arguments to be passed to the underlying service processing function (i.e. llm_service.call(**kwargs)). Should not contain the following keys: usage_tracker, usage_sub_label, messages. These arguments are provided by this caller object.
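A minimal sketch of why those three keys are reserved: the caller object injects them itself when invoking the service, so a duplicate keyword in **kwargs would collide at call time. The `validate_kwargs` helper below is hypothetical, added only to illustrate the constraint; it is not part of elm's API.

```python
# Keys the caller supplies on every Service.call invocation, per the
# documentation above. User-provided **kwargs must not duplicate them.
RESERVED = {"usage_tracker", "usage_sub_label", "messages"}


def validate_kwargs(kwargs):
    """Reject kwargs that clash with the caller-injected arguments."""
    clash = RESERVED & set(kwargs)
    if clash:
        raise ValueError(
            f"kwargs must not contain reserved keys: {sorted(clash)}"
        )
    return kwargs


validate_kwargs({"temperature": 0.0})  # fine: no reserved keys
try:
    validate_kwargs({"messages": []})  # clashes with a reserved key
except ValueError as err:
    print(err)
```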
Methods