elm.ords.services.openai.count_tokens
- count_tokens(messages, model)[source]
Count the number of tokens in an outgoing set of messages.
- Parameters:
messages (list) – A list of message dictionaries. Each dictionary must have a “content” key containing the string to count tokens for.
model (str) – The OpenAI model being used. This input will be passed to tiktoken.encoding_for_model().
- Returns:
int – Total number of tokens in the set of messages outgoing to OpenAI.
References
https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb
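Examples
A minimal usage sketch (not taken from the source docs). The model name “gpt-4” and the “role” keys are illustrative; per the parameters above, only the “content” key is required in each message dictionary.
>>> from elm.ords.services.openai import count_tokens
>>> messages = [
...     {"role": "system", "content": "You are a helpful assistant."},
...     {"role": "user", "content": "Count the tokens in this message."},
... ]
>>> count_tokens(messages, model="gpt-4")  # total tokens across both "content" strings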