Class: LLM::Agent

Inherits: Object
Defined in:
lib/llm/agent.rb

Overview

Note:

Instructions are injected only on the first request.

Note:

This idea originally came from RubyLLM and was adapted to llm.rb.

LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.

Unlike LLM::Bot, this class will automatically run tool calls for you.

Examples:

class SystemAdmin < LLM::Agent
  model "gpt-4.1-nano"
  instructions "You are a Linux system admin"
  tools Shell
  schema Result
end

llm = LLM.openai(key: ENV["KEY"])
agent = SystemAdmin.new(llm)
agent.chat("Run 'date'")

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Agent

Returns a new instance of Agent.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not just those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider's default model

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil

  • :schema (#to_json, nil)

    Defaults to nil



# File 'lib/llm/agent.rb', line 85

def initialize(provider, params = {})
  defaults = {model: self.class.model, tools: self.class.tools, schema: self.class.schema}.compact
  @provider = provider
  @bot = LLM::Bot.new(provider, defaults.merge(params))
  @instructions_applied = false
end
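
A minimal usage sketch: params passed to the constructor override the class-level defaults (SystemAdmin is the class from the Overview example; the alternate model name is illustrative):

llm = LLM.openai(key: ENV["KEY"])
# :model here takes precedence over the class-level `model "gpt-4.1-nano"`
agent = SystemAdmin.new(llm, model: "gpt-4.1-mini")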

Class Method Details

.instructions(instructions = nil) ⇒ String?

Set or get the default instructions

Parameters:

  • instructions (String, nil) (defaults to: nil)

    The system instructions

Returns:

  • (String, nil)

    Returns the current instructions when no argument is provided



# File 'lib/llm/agent.rb', line 70

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end
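
For example, setting instructions through the DSL and reading them back (the class is illustrative):

class Historian < LLM::Agent
  instructions "You are a concise historian"
end
Historian.instructions # => "You are a concise historian"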

.model(model = nil) ⇒ String?

Set or get the default model

Parameters:

  • model (String, nil) (defaults to: nil)

    The model identifier

Returns:

  • (String, nil)

    Returns the current model when no argument is provided



# File 'lib/llm/agent.rb', line 37

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
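
For example (reusing the model identifier from the Overview example):

class SystemAdmin < LLM::Agent
  model "gpt-4.1-nano"
end
SystemAdmin.model # => "gpt-4.1-nano"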

.schema(schema = nil) ⇒ #to_json?

Set or get the default schema

Parameters:

  • schema (#to_json, nil) (defaults to: nil)

    The schema

Returns:

  • (#to_json, nil)

    Returns the current schema when no argument is provided



# File 'lib/llm/agent.rb', line 59

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
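
A hedged sketch: any object that responds to #to_json can serve as the schema, so a plain Hash works (the structure below is illustrative):

class Auditor < LLM::Agent
  schema({"type" => "object", "properties" => {"ok" => {"type" => "boolean"}}})
end
Auditor.schema # => {"type"=>"object", ...}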

.tools(*tools) ⇒ Array<LLM::Function>

Set or get the default tools

Parameters:

  • tools (Array<LLM::Function>)

    The default tools

Returns:

  • (Array<LLM::Function>)

    Returns the current tools when no argument is provided



# File 'lib/llm/agent.rb', line 48

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
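
Because the arguments are splatted and flattened, multiple tools can be registered in one call (FileReader is a hypothetical second tool next to Shell from the Overview example):

class SystemAdmin < LLM::Agent
  tools Shell, FileReader
end
SystemAdmin.tools # => [Shell, FileReader]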

Instance Method Details

#build_prompt ⇒ LLM::Builder

Returns:

  • (LLM::Builder)


# File 'lib/llm/agent.rb', line 163

def build_prompt(&)
  @bot.build_prompt(&)
end

#chat(prompt, params = {}) ⇒ LLM::Response

Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
agent = LLM::Agent.new(llm)
response = agent.chat("Hello, what is your name?")
puts response.choices[0].content

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The params passed to the provider, including optional :stream, :tools, :schema, etc.

Options Hash (params):

  • :max_tool_rounds (Integer)

    The maximum number of tool call iterations (default 10)

Returns:

  • (LLM::Response)


# File 'lib/llm/agent.rb', line 105

def chat(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @bot.chat(apply_instructions(prompt), params)
  until @bot.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @bot.chat @bot.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end
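
As the source above shows, tool calls are executed automatically until none remain, and :max_tool_rounds caps that loop. For example, allowing at most three rounds before LLM::ToolLoopError is raised:

agent = SystemAdmin.new(llm)
agent.chat("Run 'uptime' and summarize the output", max_tool_rounds: 3)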

#functions ⇒ Array<LLM::Function>

Returns:

  • (Array<LLM::Function>)


# File 'lib/llm/agent.rb', line 151

def functions
  @bot.functions
end

#image_url(url) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • url (String)

    The URL

Returns:

  • (LLM::Object)


# File 'lib/llm/agent.rb', line 172

def image_url(url)
  @bot.image_url(url)
end

#local_file(path) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • path (String)

    The path

Returns:

  • (LLM::Object)


# File 'lib/llm/agent.rb', line 181

def local_file(path)
  @bot.local_file(path)
end
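
A hedged sketch of how the tagged object is used: pass it alongside text as an array prompt, mirroring how #chat itself sends an array of tool results (the path is illustrative):

agent.chat ["Describe this image", agent.local_file("/tmp/cat.png")]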

#messages ⇒ LLM::Buffer<LLM::Message>

Returns:

  • (LLM::Buffer<LLM::Message>)


# File 'lib/llm/agent.rb', line 145

def messages
  @bot.messages
end
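
For example, printing the conversation history (this assumes LLM::Message exposes #role and #content):

agent.messages.each { |message| puts "#{message.role}: #{message.content}" }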

#remote_file(res) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • res (LLM::Response)

    The response to tag

Returns:

  • (LLM::Object)



# File 'lib/llm/agent.rb', line 190

def remote_file(res)
  @bot.remote_file(res)
end

#respond(prompt, params = {}) ⇒ LLM::Response

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
agent = LLM::Agent.new(llm)
res = agent.respond("What is the capital of France?")
puts res.output_text

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The params passed to the provider, including optional :stream, :tools, :schema, etc.

Options Hash (params):

  • :max_tool_rounds (Integer)

    The maximum number of tool call iterations (default 10)

Returns:

  • (LLM::Response)


# File 'lib/llm/agent.rb', line 131

def respond(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @bot.respond(apply_instructions(prompt), params)
  until @bot.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @bot.respond @bot.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end

#usage ⇒ LLM::Object

Returns:

  • (LLM::Object)


# File 'lib/llm/agent.rb', line 157

def usage
  @bot.usage
end