elm.ords.extraction.tree.AsyncDecisionTree

class AsyncDecisionTree(graph)[source]

Bases: DecisionTree

Async class to traverse a directed graph of LLM prompts. Nodes are prompts and edges are transitions between prompts based on conditions being met in the LLM response.

Purpose:

Represent a series of prompts that can be used in sequence to extract values of interest from text.

Responsibilities:
  1. Store all prompts used to extract a particular ordinance value from text.

  2. Track relationships between the prompts (i.e., which prompt is used first, which prompt is used next depending on the output of the previous prompt, etc.) using a directed acyclic graph.

Key Relationships:

Inherits from DecisionTree to add async capabilities. Uses a ChatLLMCaller for LLM queries.

Parameters:

graph (nx.DiGraph) – Directed acyclic graph where nodes are LLM prompts and edges are logical transitions based on the response. Must have the high-level graph attribute “chat_llm_caller”, which is a ChatLLMCaller instance. Nodes should have the attribute “prompt”, which can contain {format}-style named arguments that will be filled from the high-level graph attributes. Edges can have the attribute “condition”, a callable to be executed on the LLM response text. An edge without a condition acts as an “else” branch, followed when no other edge condition is satisfied. A node with only a single outgoing edge does not need a condition on that edge.
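For illustration, a graph meeting these requirements might be built as sketched below. The mock caller, node names, and prompts are hypothetical; in real use, “chat_llm_caller” must be a ChatLLMCaller instance from elm.ords.llm.

```python
import networkx as nx

# Hypothetical stand-in for the "chat_llm_caller" graph attribute; the
# real object is a ChatLLMCaller instance that wraps an LLM service.
async def mock_llm_caller(prompt):
    return "Yes"

# Graph-level attributes fill {format}-style placeholders in node prompts.
G = nx.DiGraph(chat_llm_caller=mock_llm_caller, feature="setback")

G.add_node("init", prompt="Does the text mention a {feature} requirement?")
G.add_node("extract", prompt="Extract the {feature} value from the text.")
G.add_node("done", prompt="Summarize what was found.")

# Conditioned edge: followed only if the callable returns True for the
# LLM response text.
G.add_edge("init", "extract", condition=lambda resp: "yes" in resp.lower())
# Condition-free edge: acts as the "else" branch from "init".
G.add_edge("init", "done")
```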

Methods

async_call_node(node0)

Call the LLM with the prompt from the input node and search the successor edges for a valid transition condition.

async_run([node0])

Traverse the decision tree starting at the input node.

call_node(node_name)

Call the LLM with the prompt from the input node and search the successor edges for a valid transition condition.

run([node0])

Traverse the decision tree starting at the input node.

Attributes

all_messages_txt

Get a printout of the full conversation with the LLM.

api

Get the ApiBase object.

chat_llm_caller

ChatLLMCaller instance for this tree.

graph

Get the networkx graph object.

history

Get a record of the nodes traversed in the tree.

messages

Get a list of the conversation messages with the LLM.

property chat_llm_caller

ChatLLMCaller instance for this tree.

Type:

elm.ords.llm.ChatLLMCaller

property messages

Get a list of the conversation messages with the LLM.

Returns:

list

property all_messages_txt

Get a printout of the full conversation with the LLM.

Returns:

str

async async_call_node(node0)[source]

Call the LLM with the prompt from the input node and search the successor edges for a valid transition condition.

Parameters:

node0 (str) – Name of node being executed.

Returns:

out (str) – Name of the next node, or the final LLM response if at a leaf node.

async async_run(node0='init')[source]

Traverse the decision tree starting at the input node.

Parameters:

node0 (str) – Name of the starting node in the graph. This is typically called “init”.

Returns:

out (str) – Final response from LLM at the leaf node.
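To make the traversal semantics concrete, here is a minimal, self-contained sketch of the kind of loop async_run performs. This is an assumption about the general approach, not elm's actual implementation; the fake_caller, traverse function, and prompts are all hypothetical.

```python
import asyncio
import networkx as nx

async def traverse(graph, node0="init"):
    """Sketch of a decision-tree traversal loop: call the LLM at each
    node, then follow the first successor edge whose condition is met
    (or the condition-free "else" edge)."""
    caller = graph.graph["chat_llm_caller"]
    node = node0
    while True:
        # Fill {format}-style placeholders from graph-level attributes.
        prompt = graph.nodes[node]["prompt"].format(**graph.graph)
        response = await caller(prompt)
        successors = list(graph.successors(node))
        if not successors:
            return response  # leaf node: return the final LLM response
        next_node = None
        for succ in successors:
            cond = graph.edges[node, succ].get("condition")
            if cond is None:
                next_node = next_node or succ  # "else" edge fallback
            elif cond(response):
                next_node = succ
                break
        if next_node is None:
            raise RuntimeError(f"no valid transition out of {node!r}")
        node = next_node

# Scripted stand-in for the ChatLLMCaller (hypothetical responses).
_responses = iter(["Yes, it does.", "42 ft"])
async def fake_caller(prompt):
    return next(_responses)

G = nx.DiGraph(chat_llm_caller=fake_caller, feature="setback")
G.add_node("init", prompt="Does the text mention a {feature}?")
G.add_node("extract", prompt="Extract the {feature} value.")
G.add_edge("init", "extract", condition=lambda r: "yes" in r.lower())

result = asyncio.run(traverse(G))
```

Because "extract" has no successors, the loop returns the LLM response produced at that leaf node.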

property api

Get the ApiBase object.

Returns:

ApiBase

call_node(node_name)

Call the LLM with the prompt from the input node and search the successor edges for a valid transition condition.

Parameters:

node_name (str) – Name of node being executed.

Returns:

out (str) – Name of the next node, or the final LLM response if at a leaf node.

property graph

Get the networkx graph object.

Returns:

nx.DiGraph

property history

Get a record of the nodes traversed in the tree.

Returns:

list

run(node0='init')

Traverse the decision tree starting at the input node.

Parameters:

node0 (str) – Name of the starting node in the graph. This is typically called “init”.

Returns:

out (str) – Final response from LLM at the leaf node.