Class: LLM::Agent
- Inherits: Object
- Defined in: lib/llm/agent.rb
Overview
LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.
Unlike LLM::Bot, this class will automatically run tool calls for you.
Note: instructions are injected only on the first request.
This idea originally came from RubyLLM and was adapted to llm.rb.
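The class-level DSL stores each default in a class instance variable, with a single method acting as both getter and setter. A standalone sketch of that pattern (the Assistant class below is hypothetical and exists only to illustrate it):

```ruby
# Standalone sketch of the getter/setter DSL pattern behind
# LLM::Agent's class methods (.model, .tools, etc.).
class Assistant
  # No argument: return the stored default. Argument: store it.
  def self.model(model = nil)
    return @model if model.nil?
    @model = model
  end

  # Accepts a list of tools or an array of tools; defaults to [].
  def self.tools(*tools)
    return @tools || [] if tools.empty?
    @tools = tools.flatten
  end
end

Assistant.model "gpt-4o-mini"  # set the default
Assistant.model                # => "gpt-4o-mini"
Assistant.tools                # => [] (nothing registered yet)
```

Subclasses of LLM::Agent use the same calls declaratively in the class body, and the stored values become the defaults for every instance.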
Class Method Summary
- .instructions(instructions = nil) ⇒ String?
  Set or get the default instructions.
- .model(model = nil) ⇒ String?
  Set or get the default model.
- .schema(schema = nil) ⇒ #to_json?
  Set or get the default schema.
- .tools(*tools) ⇒ Array<LLM::Function>
  Set or get the default tools.
Instance Method Summary
- #build_prompt ⇒ LLM::Builder
- #chat(prompt, params = {}) ⇒ LLM::Response
  Maintain a conversation via the chat completions API.
- #functions ⇒ Array<LLM::Function>
- #image_url(url) ⇒ LLM::Object
  Returns a tagged object.
- #initialize(provider, params = {}) ⇒ Agent (constructor)
  A new instance of Agent.
- #local_file(path) ⇒ LLM::Object
  Returns a tagged object.
- #messages ⇒ LLM::Buffer<LLM::Message>
- #remote_file(res) ⇒ LLM::Object
  Returns a tagged object.
- #respond(prompt, params = {}) ⇒ LLM::Response
  Maintain a conversation via the responses API.
- #usage ⇒ LLM::Object
Constructor Details
#initialize(provider, params = {}) ⇒ Agent
Returns a new instance of Agent.
# File 'lib/llm/agent.rb', line 85

def initialize(provider, params = {})
  defaults = {model: self.class.model, tools: self.class.tools, schema: self.class.schema}.compact
  @provider = provider
  @bot = LLM::Bot.new(provider, defaults.merge(params))
  @instructions_applied = false
end
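The constructor compacts the class-level defaults (dropping any that were never set) and lets caller-supplied params win on merge. In plain Ruby, with illustrative hash values:

```ruby
# How #initialize resolves defaults: class-level values that were
# never set are nil, .compact drops them, and caller-supplied params
# take precedence on merge.
defaults = {model: "gpt-4o-mini", tools: nil, schema: nil}.compact
params   = {model: "o3-mini"}

merged = defaults.merge(params)
merged  # => {model: "o3-mini"}; the caller's model wins, and the
        # unset :tools/:schema keys are never passed to LLM::Bot.new
```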
Class Method Details
.instructions(instructions = nil) ⇒ String?
Set or get the default instructions
# File 'lib/llm/agent.rb', line 70

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end
.model(model = nil) ⇒ String?
Set or get the default model
# File 'lib/llm/agent.rb', line 37

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
.schema(schema = nil) ⇒ #to_json?
Set or get the default schema
# File 'lib/llm/agent.rb', line 59

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
.tools(*tools) ⇒ Array<LLM::Function>
Set or get the default tools
# File 'lib/llm/agent.rb', line 48

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
Instance Method Details
#build_prompt ⇒ LLM::Builder
# File 'lib/llm/agent.rb', line 163

def build_prompt(&)
  @bot.build_prompt(&)
end
#chat(prompt, params = {}) ⇒ LLM::Response
Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 105

def chat(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @bot.chat(apply_instructions(prompt), params)
  until @bot.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @bot.chat @bot.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end
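The tool loop can be traced outside the gem with a stub standing in for LLM::Bot: while the last response left pending function calls, their results are sent back, up to max_tool_rounds. StubBot, StubFunction, and the canned responses below are hypothetical:

```ruby
# Self-contained trace of the tool loop in #chat. StubBot stands in
# for LLM::Bot: it reports one pending function after the first turn
# and none afterwards, so the loop runs exactly one extra round.
StubFunction = Struct.new(:name) do
  def call
    "result of #{name}"
  end
end

class StubBot
  def initialize
    @turn = 0
  end

  # Pending tool calls remain only after the first request.
  def functions
    @turn == 1 ? [StubFunction.new("weather")] : []
  end

  def chat(_prompt, _params = {})
    @turn += 1
    "response ##{@turn}"
  end
end

bot = StubBot.new
i, max = 0, 10
res = bot.chat("What's the weather?")
until bot.functions.empty?
  raise "pending tool calls remain" if i >= max
  res = bot.chat(bot.functions.map(&:call))  # send tool results back
  i += 1
end
res  # => "response #2", the follow-up carrying the tool results
```

If the model keeps requesting tools past max_tool_rounds, the real method raises LLM::ToolLoopError instead of looping forever.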
#functions ⇒ Array<LLM::Function>
# File 'lib/llm/agent.rb', line 151

def functions
  @bot.functions
end
#image_url(url) ⇒ LLM::Object
Returns a tagged object
# File 'lib/llm/agent.rb', line 172

def image_url(url)
  @bot.image_url(url)
end
#local_file(path) ⇒ LLM::Object
Returns a tagged object
# File 'lib/llm/agent.rb', line 181

def local_file(path)
  @bot.local_file(path)
end
#remote_file(res) ⇒ LLM::Object
Returns a tagged object
# File 'lib/llm/agent.rb', line 190

def remote_file(res)
  @bot.remote_file(res)
end
#respond(prompt, params = {}) ⇒ LLM::Response
Note: not all LLM providers support this API.
Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 131

def respond(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @bot.respond(apply_instructions(prompt), params)
  until @bot.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @bot.respond @bot.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end