Class: LLM::Session
- Inherits: Object
- Includes:
- Deserializer
- Defined in:
- lib/llm/bot.rb,
lib/llm/session/deserializer.rb
Overview
LLM::Session provides an object that can maintain a conversation. A conversation can use the chat completions API that all LLM providers support or the responses API that currently only OpenAI supports.
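As a sketch of typical use (the provider constructor `LLM.openai` and the key handling below are assumptions, not documented on this page; any provider object works):

```ruby
require "llm"

# Hypothetical setup: construct a provider, then a session around it.
llm = LLM.openai(key: ENV["OPENAI_API_KEY"])
session = LLM::Session.new(llm, model: llm.default_model)

# Chat completions API (supported by all providers)
res = session.talk "Hello! What can you do?"
puts res.choices[-1].content
```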
Defined Under Namespace
Modules: Deserializer
Instance Attribute Summary collapse
-
#messages ⇒ LLM::Buffer<LLM::Message>
readonly
Returns an Enumerable for the messages in a conversation.
Instance Method Summary collapse
-
#deserialize(path: nil, string: nil) ⇒ LLM::Session
(also: #restore)
Restore a session.
-
#functions ⇒ Array<LLM::Function>
Returns an array of functions that can be called.
-
#image_url(url) ⇒ LLM::Object
Recognize an object as a URL to an image.
-
#initialize(provider, params = {}) ⇒ Session
constructor
A new instance of Session.
- #inspect ⇒ String
-
#local_file(path) ⇒ LLM::Object
Recognize an object as a local file.
-
#model ⇒ String
Returns the model a Session is actively using.
-
#prompt(&b) ⇒ LLM::Prompt
(also: #build_prompt)
Build a role-aware prompt for a single request.
-
#remote_file(res) ⇒ LLM::Object
Recognize an object as a remote file.
-
#respond(prompt, params = {}) ⇒ LLM::Response
Maintain a conversation via the responses API.
-
#serialize(path:)
(also: #save)
Save a session.
-
#talk(prompt, params = {}) ⇒ LLM::Response
(also: #chat)
Maintain a conversation via the chat completions API.
- #to_h ⇒ Hash
- #to_json ⇒ String
-
#tracer ⇒ LLM::Tracer
Returns an LLM tracer.
-
#usage ⇒ LLM::Object
Returns token usage for the conversation. This method returns token usage for the latest assistant message, and returns an empty object if there are no assistant messages.
Methods included from Deserializer
Constructor Details
#initialize(provider, params = {}) ⇒ Session
Returns a new instance of Session.
# File 'lib/llm/bot.rb', line 43

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @messages = LLM::Buffer.new(provider)
end
Instance Attribute Details
#messages ⇒ LLM::Buffer<LLM::Message> (readonly)
Returns an Enumerable for the messages in a conversation.
# File 'lib/llm/bot.rb', line 32

def messages
  @messages
end
Instance Method Details
#deserialize(path: nil, string: nil) ⇒ LLM::Session Also known as: restore
Restore a session
# File 'lib/llm/bot.rb', line 235

def deserialize(path: nil, string: nil)
  payload = if path.nil? and string.nil?
    raise ArgumentError, "a path or string is required"
  elsif path
    ::File.binread(path)
  else
    string
  end
  ses = LLM.json.load(payload)
  @messages.concat [*ses["messages"]].map { (_1) }
  self
end
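A minimal save/restore round trip using the documented aliases, assuming an existing `session` and `provider`:

```ruby
# Persist the conversation to disk, then restore it into a new session.
session.serialize(path: "conversation.json")   # alias: save

fresh = LLM::Session.new(provider)
fresh.deserialize(path: "conversation.json")   # alias: restore
```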
#functions ⇒ Array<LLM::Function>
Returns an array of functions that can be called
# File 'lib/llm/bot.rb', line 112

def functions
  @messages
    .select(&:assistant?)
    .flat_map do |msg|
      fns = msg.functions.select(&:pending?)
      fns.each do |fn|
        fn.tracer = tracer
        fn.model = msg.model
      end
    end
end
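Because #functions returns only pending calls on assistant messages, a common pattern is to execute them and feed the results back via #talk, which (per the #talk source below) switches to the provider's tool role when it sees LLM::Function::Return values. A sketch; `fn.call` returning an LLM::Function::Return is an assumption:

```ruby
# Hypothetical tool loop: keep answering pending tool calls until none remain.
until session.functions.empty?
  returns = session.functions.map { _1.call }  # assumed to yield LLM::Function::Return values
  session.talk returns
end
```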
#image_url(url) ⇒ LLM::Object
Recognize an object as a URL to an image.
# File 'lib/llm/bot.rb', line 161

def image_url(url)
  LLM::Object.from(value: url, kind: :image_url)
end
#inspect ⇒ String
# File 'lib/llm/bot.rb', line 103

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}>"
end
#local_file(path) ⇒ LLM::Object
Recognize an object as a local file.
# File 'lib/llm/bot.rb', line 171

def local_file(path)
  LLM::Object.from(value: LLM.File(path), kind: :local_file)
end
#model ⇒ String
Returns the model a Session is actively using
# File 'lib/llm/bot.rb', line 195

def model
  @messages.find(&:assistant?)&.model || @params[:model]
end
#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt
Build a role-aware prompt for a single request.
Prefer this method over #build_prompt. The older method name is kept for backward compatibility.
# File 'lib/llm/bot.rb', line 150

def prompt(&b)
  LLM::Prompt.new(@provider, &b)
end
#remote_file(res) ⇒ LLM::Object
Recognize an object as a remote file.
# File 'lib/llm/bot.rb', line 181

def remote_file(res)
  LLM::Object.from(value: res, kind: :remote_file)
end
#respond(prompt, params = {}) ⇒ LLM::Response
Not all LLM providers support this API
Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/bot.rb', line 88

def respond(prompt, params = {})
  # extra: additional messages returned alongside the normalized prompt
  prompt, params, extra = fetch(prompt, params)
  res_id = @messages.find(&:assistant?)&.response&.response_id
  params = params.merge(previous_response_id: res_id, input: extra).compact
  params = @params.merge(params)
  res = @provider.responses.create(prompt, params)
  role = params[:role] || @provider.user_role
  @messages.concat [LLM::Message.new(role, prompt)]
  @messages.concat extra
  @messages.concat [res.choices[-1]]
  res
end
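Since #respond threads previous_response_id from the last assistant message automatically, multi-turn use looks the same as #talk. A sketch, assuming an OpenAI provider in `llm`:

```ruby
# Each call continues the server-side response chain (OpenAI responses API).
session = LLM::Session.new(llm)
session.respond "Summarize the plot of Hamlet in one sentence."
res = session.respond "Now do the same for Macbeth."
puts res.choices[-1].content
```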
#serialize(path:) Also known as: save
This method returns an undefined value.
Save a session
# File 'lib/llm/bot.rb', line 221

def serialize(path:)
  ::File.binwrite path, LLM.json.dump(self)
end
#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat
Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/bot.rb', line 61

def talk(prompt, params = {})
  # extra: additional messages returned alongside the normalized prompt
  prompt, params, extra = fetch(prompt, params)
  params = params.merge(messages: [*@messages.to_a, *extra])
  params = @params.merge(params)
  res = @provider.complete(prompt, params)
  role = params[:role] || @provider.user_role
  role = @provider.tool_role if params[:role].nil? && [*prompt].grep(LLM::Function::Return).any?
  @messages.concat [LLM::Message.new(role, prompt)]
  @messages.concat extra
  @messages.concat [res.choices[-1]]
  res
end
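#image_url and #local_file tag objects so they can travel in a request alongside text. A sketch; passing an array that mixes strings and tagged objects is an assumption suggested by the `[*prompt]` handling in #talk:

```ruby
# Hypothetical multimodal turn: text plus a tagged image URL.
res = session.talk [
  "Describe the attached image",
  session.image_url("https://example.com/cat.png")
]
puts res.choices[-1].content
```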
#to_h ⇒ Hash
# File 'lib/llm/bot.rb', line 201

def to_h
  {model:, messages:}
end
#to_json ⇒ String
# File 'lib/llm/bot.rb', line 207

def to_json(...)
  {schema_version: 1}.merge!(to_h).to_json(...)
end
#tracer ⇒ LLM::Tracer
Returns an LLM tracer
# File 'lib/llm/bot.rb', line 188

def tracer
  @provider.tracer
end
#usage ⇒ LLM::Object
Returns token usage for the conversation. This method returns token usage for the latest assistant message, and returns an empty object if there are no assistant messages.
# File 'lib/llm/bot.rb', line 131

def usage
  @messages.find(&:assistant?)&.usage || LLM::Object.from({})
end
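Token accounting after a turn; the field name `total_tokens` below is an assumption about the provider's usage object, which is why the sketch guards the access:

```ruby
# Inspect usage from the latest assistant message (empty object when none).
session.talk "Hello"
u = session.usage
puts u.total_tokens if u.respond_to?(:total_tokens)  # field name is provider-dependent
```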