Class: LLM::Bot

Inherits: Object
Defined in:
lib/llm/bot.rb

Overview

LLM::Bot provides an object that can maintain a conversation. A conversation can use the chat completions API that all LLM providers support or the responses API that currently only OpenAI supports.

Examples:

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(key: ENV["KEY"])
bot  = LLM::Bot.new(llm)
url  = "https://upload.wikimedia.org/wikipedia/commons/c/c7/Lisc_lipy.jpg"

prompt = bot.build_prompt do
  it.system "Your task is to answer all user queries"
  it.user ["Tell me about this URL", bot.image_url(url)]
  it.user ["Tell me about this PDF", bot.local_file("handbook.pdf")]
end
bot.chat(prompt)

# The full conversation history is in bot.messages
bot.messages.each { print "[#{_1.role}] ", _1.content, "\n" }

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Bot

Returns a new instance of Bot.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider's default model

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/bot.rb', line 42

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @messages = LLM::Buffer.new(provider)
end
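As the source for #chat shows (`@params.merge(params)`), per-call params are merged over the conversation-level params given to the constructor, so the per-call value wins. A minimal pure-Ruby sketch of that precedence — the param values here are illustrative only:

```ruby
# Conversation-level params, as set in the constructor
conversation_params = { model: "default-model", temperature: 0.7 }

# Per-call params passed to #chat or #respond
call_params = { temperature: 0.0 }

# Mirrors `@params.merge(params)` in LLM::Bot#chat:
# the per-call :temperature overrides the conversation-level one,
# while :model falls through from the constructor
effective = conversation_params.merge(call_params)
```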

Instance Attribute Details

#messages ⇒ LLM::Buffer<LLM::Message> (readonly)

Returns an Enumerable for the messages in a conversation



# File 'lib/llm/bot.rb', line 31

def messages
  @messages
end

Instance Method Details

#build_prompt ⇒ Object

Build a prompt

Examples:

prompt = bot.build_prompt do
  it.system "Your task is to assist the user"
  it.user "Hello, can you assist me?"
end
bot.chat(prompt)


# File 'lib/llm/bot.rb', line 133

def build_prompt(&)
  LLM::Builder.new(@provider, &).tap(&:call)
end

#chat(prompt, params = {}) ⇒ LLM::Response

Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
bot = LLM::Bot.new(llm)
response = bot.chat("Hello, what is your name?")
puts response.choices[0].content

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters, including optional :role (defaults to :user), :stream, :tools, :schema, etc. Per-call params override those given to the constructor.

Returns:

  • (LLM::Response)



# File 'lib/llm/bot.rb', line 60

def chat(prompt, params = {})
  prompt, params, messages = fetch(prompt, params)
  params = params.merge(messages: [*@messages.to_a, *messages])
  params = @params.merge(params)
  res = @provider.complete(prompt, params)
  @messages.concat [LLM::Message.new(params[:role] || :user, prompt)]
  @messages.concat messages
  @messages.concat [res.choices[-1]]
  res
end

#functions ⇒ Array<LLM::Function>

Returns an array of pending functions that can be called

Returns:

  • (Array<LLM::Function>)



# File 'lib/llm/bot.rb', line 107

def functions
  @messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .select(&:pending?)
end
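A common pattern is to drain pending tool calls and feed their return values back to the model. The sketch below assumes each LLM::Function responds to #call, and `weather_tool` is a hypothetical tool definition stand-in:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
# `weather_tool` is illustrative; build real tools with LLM.function
bot = LLM::Bot.new(llm, tools: [weather_tool])

bot.chat "What's the weather in Paris?"
# Keep calling pending functions and returning their results
# until the assistant has no more tool calls to make
until bot.functions.empty?
  bot.chat bot.functions.map(&:call)
end
```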

#image_url(url) ⇒ LLM::Object

Recognize an object as a URL to an image

Parameters:

  • url (String)

    The URL

Returns:

  • (LLM::Object)



# File 'lib/llm/bot.rb', line 143

def image_url(url)
  LLM::Object.from(value: url, kind: :image_url)
end

#inspectString

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 98

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}>"
end

#local_file(path) ⇒ LLM::Object

Recognize an object as a local file

Parameters:

  • path (String)

    The path

Returns:

  • (LLM::Object)



# File 'lib/llm/bot.rb', line 153

def local_file(path)
  LLM::Object.from(value: LLM.File(path), kind: :local_file)
end

#remote_file(res) ⇒ LLM::Object

Recognize an object as a remote file

Parameters:

  • res (LLM::Response)

    The response

Returns:

  • (LLM::Object)



# File 'lib/llm/bot.rb', line 163

def remote_file(res)
  LLM::Object.from(value: res, kind: :remote_file)
end

#respond(prompt, params = {}) ⇒ LLM::Response

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
bot = LLM::Bot.new(llm)
res = bot.respond("What is the capital of France?")
puts res.output_text

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters, including optional :role (defaults to :user), :stream, :tools, :schema, etc. Per-call params override those given to the constructor.

Returns:

  • (LLM::Response)



# File 'lib/llm/bot.rb', line 84

def respond(prompt, params = {})
  prompt, params, messages = fetch(prompt, params)
  res_id = @messages.find(&:assistant?)&.response&.response_id
  params = params.merge(previous_response_id: res_id, input: messages).compact
  params = @params.merge(params)
  res = @provider.responses.create(prompt, params)
  @messages.concat [LLM::Message.new(params[:role] || :user, prompt)]
  @messages.concat messages
  @messages.concat [res.choices[-1]]
  res
end

#usage ⇒ LLM::Object

Note:

This method returns token usage for the latest assistant message, and it returns an empty object if there are no assistant messages

Returns token usage for the conversation

Returns:

  • (LLM::Object)



# File 'lib/llm/bot.rb', line 121

def usage
  @messages.find(&:assistant?)&.usage || LLM::Object.from({})
end
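A short sketch of inspecting usage after a reply; the fields on the returned object are provider-specific, so they are not enumerated here:

```ruby
bot.chat("Hello!")
# Safe to call even before any assistant reply, since an
# empty LLM::Object is returned in that case
p bot.usage
```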