Class: Lammy::OpenAI
- Inherits: Object
- Defined in: lib/lammy/openai.rb
Overview
Use the OpenAI API through its Ruby client library.
Constant Summary
- MODELS =
%w[ gpt-4o gpt-4o-2024-08-06 gpt-4o-2024-05-13 gpt-4o chatgpt-4o-latest gpt-4o-mini gpt-4o-mini-2024-07-18 o1-preview o1-preview-2024-09-12 o1-mini o1-mini-2024-09-12 gpt-3.5-turbo gpt-4-turbo gpt-4-turbo-2024-04-09 gpt-4 gpt-4-32k gpt-4-0125-preview gpt-4-1106-preview gpt-4-vision-preview gpt-3.5-turbo-0125 gpt-3.5-turbo-instruct gpt-3.5-turbo-1106 gpt-3.5-turbo-0613 gpt-3.5-turbo-16k-0613 gpt-3.5-turbo-0301 davinci-002 babbage-002 ].freeze
- EMBEDDINGS =
%w[ text-embedding-3-small text-embedding-3-large text-embedding-ada-002 ].freeze
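As a sketch, these constants can be used to tell which endpoint a configured model name belongs to; the model_type helper below is hypothetical and not part of Lammy:

def model_type(model)
  return :chat if Lammy::OpenAI::MODELS.include?(model)
  return :embeddings if Lammy::OpenAI::EMBEDDINGS.include?(model)

  :unknown
end

model_type('gpt-4o')                 # => :chat
model_type('text-embedding-3-small') # => :embeddings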
Instance Attribute Summary
- #settings ⇒ Object (readonly)
  Returns the value of attribute settings.
Instance Method Summary
- #chat(user_message, system_message = nil, stream = nil) ⇒ Object
  Generate a response with support for structured output.
- #embeddings(chunks) ⇒ Object
  OpenAI’s text embeddings measure the relatedness of text strings.
- #initialize(settings) ⇒ OpenAI (constructor)
  A new instance of OpenAI.
Constructor Details
#initialize(settings) ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/lammy/openai.rb', line 32

def initialize(settings)
  @settings = settings
end
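A minimal usage sketch, assuming the underlying OpenAI client is already configured with an API key (for example via OPENAI_API_KEY); settings is a plain Hash whose keys such as :model and :dimensions are read by the methods below:

openai = Lammy::OpenAI.new(model: 'gpt-4o')
openai.settings # => {model: "gpt-4o"}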
Instance Attribute Details
#settings ⇒ Object (readonly)
Returns the value of attribute settings.
# File 'lib/lammy/openai.rb', line 30

def settings
  @settings
end
Instance Method Details
#chat(user_message, system_message = nil, stream = nil) ⇒ Object
Generate a response with support for structured output.
# File 'lib/lammy/openai.rb', line 37

def chat(user_message, system_message = nil, stream = nil)
  schema = schema(settings)
  messages = messages(user_message, system_message)

  request = client.chat(
    parameters: {
      model: settings[:model], response_format: schema, messages: messages,
      stream: stream ? ->(chunk) { stream.call(stream_content(chunk)) } : nil
    }.compact
  )

  return stream if stream

  response = request.dig('choices', 0, 'message', 'content')
  content = schema ? ::Hashie::Mash.new(JSON.parse(response)) : response
  array?(schema) ? content.items : content
end
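For illustration, a usage sketch of #chat under stated assumptions: the instance setup and prompts are illustrative, and stream only needs to respond to #call, matching the lambda wrapping in the code above:

openai = Lammy::OpenAI.new(model: 'gpt-4o')

# Plain call: returns the assistant message content as a String
# (or a Hashie::Mash / array when a response schema is configured).
openai.chat('Summarize Hamlet in one sentence', 'You are terse.')

# Streaming call: pass any object that responds to #call; content chunks
# are forwarded to it as they arrive and the callable itself is returned.
printer = ->(token) { print token }
openai.chat('Tell me a story', nil, printer)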
#embeddings(chunks) ⇒ Object
OpenAI’s text embeddings measure the relatedness of text strings. An embedding is a vector of floating point numbers. The distance between two vectors measures their relatedness. Small distances suggest high relatedness and large distances suggest low relatedness.
# File 'lib/lammy/openai.rb', line 60

def embeddings(chunks)
  responses = chunks.map do |chunk|
    response = client.embeddings(
      parameters: {
        model: settings[:model], dimensions: settings[:dimensions], input: chunk
      }
    )

    response.dig('data', 0, 'embedding')
  end

  responses.one? ? responses.first : responses
end
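To illustrate the relatedness idea, here is a minimal sketch that embeds two strings and compares them with cosine similarity; the embedding model and dimensions are example settings, and the cosine_similarity helper is defined in the sketch itself rather than by Lammy:

openai = Lammy::OpenAI.new(model: 'text-embedding-3-small', dimensions: 256)
vector_a, vector_b = openai.embeddings(['cats purr', 'kittens meow'])

# Cosine similarity of two equal-length vectors of floats.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

cosine_similarity(vector_a, vector_b) # closer to 1.0 means more related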