compass.llm.calling.ChatLLMCaller

class ChatLLMCaller(llm_service, system_message, usage_tracker=None, **kwargs)[source]

Bases: BaseLLMCaller

Class supporting chat-style interaction with an LLM.

Parameters:
  • llm_service (compass.services.base.Service) – LLM service used for queries.

  • system_message (str) – System message to use for chat with LLM.

  • usage_tracker (compass.services.usage.UsageTracker, optional) – Optional tracker instance to monitor token usage during LLM calls. By default, None.

  • **kwargs – Keyword arguments to be passed to the underlying service processing function (i.e. llm_service.call(**kwargs)). Should not contain the following keys:

    • usage_tracker

    • usage_sub_label

    • messages

    These arguments are provided by this caller object.
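Because `usage_tracker`, `usage_sub_label`, and `messages` are injected by the caller when it invokes `llm_service.call(...)`, passing any of them through `**kwargs` would collide with the caller-supplied values. The guard below is a hypothetical illustration of that contract, not the library's actual validation code:

```python
# Keys the caller supplies itself when invoking llm_service.call(...);
# they must not appear in the **kwargs given to ChatLLMCaller.
RESERVED_KEYS = {"usage_tracker", "usage_sub_label", "messages"}

def validate_service_kwargs(kwargs):
    """Reject kwargs that would collide with caller-supplied arguments.

    Hypothetical helper for illustration only.
    """
    clashes = RESERVED_KEYS & kwargs.keys()
    if clashes:
        raise ValueError(f"Reserved keys not allowed: {sorted(clashes)}")
    return kwargs

# Service-specific options such as model or temperature pass through freely.
validate_service_kwargs({"model": "gpt-4o", "temperature": 0})
```

Passing `messages=[...]` in `**kwargs`, by contrast, would be rejected here, since the caller constructs the message list itself from the system message and your chat content.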

Methods

call(content[, usage_sub_label])

Chat with the LLM.

async call(content, usage_sub_label=LLMUsageCategory.CHAT)[source]

Chat with the LLM.

Parameters:
  • content (str) – Your chat message for the LLM.

  • usage_sub_label (str, optional) – Label to store token usage under. By default, "chat".

Returns:

str or None – The LLM response as a string, or None if the call failed.
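A usage sketch of the calling pattern described above. The stub service and the message-assembly details are assumptions for illustration; the real `compass.services.base.Service` forwards queries to an actual LLM backend, and `ChatLLMCaller`'s internals may differ:

```python
import asyncio

class EchoService:
    """Hypothetical stand-in for compass.services.base.Service."""

    async def call(self, messages, usage_tracker=None,
                   usage_sub_label=None, **kwargs):
        # A real service would query an LLM; here we echo the last
        # user message so the example is self-contained.
        return f"echo: {messages[-1]['content']}"

class ChatLLMCallerSketch:
    """Minimal sketch mirroring the documented ChatLLMCaller interface."""

    def __init__(self, llm_service, system_message,
                 usage_tracker=None, **kwargs):
        self.llm_service = llm_service
        self.system_message = system_message
        self.usage_tracker = usage_tracker
        self.kwargs = kwargs  # forwarded to llm_service.call(**kwargs)

    async def call(self, content, usage_sub_label="chat"):
        # The caller supplies messages, usage_tracker, and
        # usage_sub_label itself, per the parameter docs above.
        messages = [
            {"role": "system", "content": self.system_message},
            {"role": "user", "content": content},
        ]
        return await self.llm_service.call(
            usage_tracker=self.usage_tracker,
            usage_sub_label=usage_sub_label,
            messages=messages,
            **self.kwargs,
        )

caller = ChatLLMCallerSketch(EchoService(), system_message="You are helpful.")
response = asyncio.run(caller.call("Hello!"))
print(response)  # echo: Hello!
```

Note that `call` is a coroutine, so it must be awaited (or driven with `asyncio.run` as shown).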