Class: LLM::Session

Inherits:
Object
Includes:
Deserializer
Defined in:
lib/llm/bot.rb,
lib/llm/session/deserializer.rb

Overview

LLM::Session provides an object that can maintain a conversation. A conversation can use the chat completions API that all LLM providers support or the responses API that currently only OpenAI supports.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)

prompt = LLM::Prompt.new(llm) do
  system "Be concise and show your reasoning briefly."
  user "If a train goes 60 mph for 1.5 hours, how far does it travel?"
  user "Now double the speed for the same time."
end

ses.talk(prompt)
ses.messages.each { |m| puts "[#{m.role}] #{m.content}" }

Defined Under Namespace

Modules: Deserializer

Instance Attribute Summary

Instance Method Summary

Methods included from Deserializer

#deserialize_message

Constructor Details

#initialize(provider, params = {}) ⇒ Session

Returns a new instance of Session.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included and not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider's default model

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/bot.rb', line 43

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @messages = LLM::Buffer.new(provider)
end
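For example, a session can pin parameters for the whole conversation at construction time (the model name below is illustrative; any parameter the provider supports can be given here):

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])

# Parameters passed here persist across every request in the session.
# "gpt-4o-mini" is an illustrative model name; tools: takes an
# Array<LLM::Function> (empty here).
ses = LLM::Session.new(llm, model: "gpt-4o-mini", tools: [])
```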

Instance Attribute Details

#messages ⇒ LLM::Buffer<LLM::Message> (readonly)

Returns an Enumerable for the messages in a conversation



# File 'lib/llm/bot.rb', line 32

def messages
  @messages
end

Instance Method Details

#deserialize(path: nil, string: nil) ⇒ LLM::Session Also known as: restore

Restore a session

Parameters:

  • path (String, nil) (defaults to: nil)

    The path to a JSON file

  • string (String, nil) (defaults to: nil)

    A raw JSON string

Returns:

Raises:

  • (SystemCallError)

    Might raise a number of SystemCallError subclasses



# File 'lib/llm/bot.rb', line 235

def deserialize(path: nil, string: nil)
  payload = if path.nil? and string.nil?
    raise ArgumentError, "a path or string is required"
  elsif path
    ::File.binread(path)
  else
    string
  end
  ses = LLM.json.load(payload)
  @messages.concat [*ses["messages"]].map { deserialize_message(_1) }
  self
end
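A session saved with #serialize can be restored from a file on disk or from a raw JSON string (the path below is illustrative):

```ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)

# Restore from a file written by #serialize / #save
ses.restore(path: "session.json")

# Or restore from a JSON string held in memory:
# ses.restore(string: File.read("session.json"))
```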

#functions ⇒ Array<LLM::Function>

Returns an array of functions that can be called

Returns:



# File 'lib/llm/bot.rb', line 112

def functions
  @messages
    .select(&:assistant?)
    .flat_map do |msg|
      fns = msg.functions.select(&:pending?)
      fns.each do |fn|
        fn.tracer = tracer
        fn.model  = msg.model
      end
    end
end
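A common pattern is to drain pending functions in a loop, feeding their results back into the conversation. This is a sketch, assuming a hypothetical tool `my_tool` and that each pending function responds to #call with an LLM::Function::Return (which #talk accepts, as the tool-role check in its implementation suggests):

```ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm, tools: [my_tool]) # my_tool is a hypothetical LLM::Function

ses.talk "Use the tool to answer my question"
until ses.functions.empty?
  # Execute each pending function and send the results back to the LLM
  returns = ses.functions.map { _1.call }
  ses.talk(returns)
end
```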

#image_url(url) ⇒ LLM::Object

Recognize an object as a URL to an image

Parameters:

  • url (String)

    The URL

Returns:



# File 'lib/llm/bot.rb', line 161

def image_url(url)
  LLM::Object.from(value: url, kind: :image_url)
end
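The object returned by #image_url can be combined with text in a single prompt. A sketch, assuming #talk accepts an array of prompt parts (the URL is illustrative):

```ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)

image = ses.image_url("https://example.com/cat.png")
ses.talk ["What is in this image?", image]
```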

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 103

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}>"
end

#local_file(path) ⇒ LLM::Object

Recognize an object as a local file

Parameters:

  • path (String)

    The path

Returns:



# File 'lib/llm/bot.rb', line 171

def local_file(path)
  LLM::Object.from(value: LLM.File(path), kind: :local_file)
end
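Likewise, #local_file marks a path on disk as a file attachment. A sketch, assuming #talk accepts an array of prompt parts (the path is illustrative):

```ruby
file = ses.local_file("reports/q3.pdf")
ses.talk ["Summarize this document", file]
```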

#model ⇒ String

Returns the model a Session is actively using

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 195

def model
  messages.find(&:assistant?)&.model || @params[:model]
end

#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt

Build a role-aware prompt for a single request.

Prefer this method over #build_prompt. The older method name is kept for backward compatibility.

Examples:

prompt = ses.prompt do
  system "Your task is to assist the user"
  user "Hello, can you assist me?"
end
ses.talk(prompt)

Parameters:

  • b (Proc)

    A block that composes messages. If it takes one argument, it receives the prompt object. Otherwise it runs in prompt context.

Returns:



# File 'lib/llm/bot.rb', line 150

def prompt(&b)
  LLM::Prompt.new(@provider, &b)
end

#remote_file(res) ⇒ LLM::Object

Recognize an object as a remote file

Parameters:

Returns:



# File 'lib/llm/bot.rb', line 181

def remote_file(res)
  LLM::Object.from(value: res, kind: :remote_file)
end

#respond(prompt, params = {}) ⇒ LLM::Response

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)
res = ses.respond("What is the capital of France?")
puts res.output_text

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The params, including optional :role (defaults to :user), :stream, :tools, :schema etc.

Returns:



# File 'lib/llm/bot.rb', line 88

def respond(prompt, params = {})
  prompt, params, messages = fetch(prompt, params)
  res_id = @messages.find(&:assistant?)&.response&.response_id
  params = params.merge(previous_response_id: res_id, input: messages).compact
  params = @params.merge(params)
  res = @provider.responses.create(prompt, params)
  role = params[:role] || @provider.user_role
  @messages.concat [LLM::Message.new(role, prompt)]
  @messages.concat messages
  @messages.concat [res.choices[-1]]
  res
end

#serialize(path:) Also known as: save

This method returns an undefined value.

Save a session

Examples:

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)
ses.talk "Hello"
ses.save(path: "session.json")

Raises:

  • (SystemCallError)

    Might raise a number of SystemCallError subclasses



# File 'lib/llm/bot.rb', line 221

def serialize(path:)
  ::File.binwrite path, LLM.json.dump(self)
end

#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat

Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)
res = ses.talk("Hello, what is your name?")
puts res.messages[0].content

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The params, including optional :role (defaults to :user), :stream, :tools, :schema etc.

Returns:



# File 'lib/llm/bot.rb', line 61

def talk(prompt, params = {})
  prompt, params, messages = fetch(prompt, params)
  params = params.merge(messages: [*@messages.to_a, *messages])
  params = @params.merge(params)
  res = @provider.complete(prompt, params)
  role = params[:role] || @provider.user_role
  role = @provider.tool_role if params[:role].nil? && [*prompt].grep(LLM::Function::Return).any?
  @messages.concat [LLM::Message.new(role, prompt)]
  @messages.concat messages
  @messages.concat [res.choices[-1]]
  res
end

#to_h ⇒ Hash

Returns:

  • (Hash)


# File 'lib/llm/bot.rb', line 201

def to_h
  {model:, messages:}
end

#to_json ⇒ String

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 207

def to_json(...)
  {schema_version: 1}.merge!(to_h).to_json(...)
end
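Because #to_json produces the same payload that #serialize writes to disk, a session can be round-tripped through a string without touching the filesystem:

```ruby
json  = ses.to_json
other = LLM::Session.new(llm)
other.restore(string: json) # restore is an alias of #deserialize
```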

#tracer ⇒ LLM::Tracer

Returns an LLM tracer

Returns:



# File 'lib/llm/bot.rb', line 188

def tracer
  @provider.tracer
end

#usage ⇒ LLM::Object

Note:

This method returns token usage for the latest assistant message, and an empty object if there are no assistant messages

Returns token usage for the conversation

Returns:



# File 'lib/llm/bot.rb', line 131

def usage
  @messages.find(&:assistant?)&.usage || LLM::Object.from({})
end