Module: RubyLLM::Providers::OpenAI::Chat
- Included in: RubyLLM::Providers::OpenAI
- Defined in: lib/ruby_llm/providers/openai/chat.rb
Overview
Chat methods of the OpenAI API integration
Class Method Summary
- .format_messages(messages) ⇒ Object
- .format_role(role) ⇒ Object
- .parse_completion_response(response) ⇒ Object
- .render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil) ⇒ Object
Instance Method Summary
- #completion_url ⇒ Object
Class Method Details
.format_messages(messages) ⇒ Object
```ruby
# File 'lib/ruby_llm/providers/openai/chat.rb', line 66

def format_messages(messages)
  messages.map do |msg|
    {
      role: format_role(msg.role),
      content: Media.format_content(msg.content),
      tool_calls: format_tool_calls(msg.tool_calls),
      tool_call_id: msg.tool_call_id
    }.compact
  end
end
```
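For orientation, a hedged sketch of the hash this mapping produces for a plain user message. The `Message.new(role:, content:)` construction mirrors the keyword arguments visible in `parse_completion_response` below and is otherwise an assumption:

```ruby
# Sketch: formatting a single user message with no tool calls.
msg = RubyLLM::Message.new(role: :user, content: 'Hello!') # assumed constructor

format_messages([msg])
# => [{ role: 'user', content: 'Hello!' }]
# :tool_calls and :tool_call_id are nil for this message, so .compact drops them.
```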
.format_role(role) ⇒ Object
```ruby
# File 'lib/ruby_llm/providers/openai/chat.rb', line 77

def format_role(role)
  case role
  when :system
    @config.openai_use_system_role ? 'system' : 'developer'
  else
    role.to_s
  end
end
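```

This branch appears to exist because OpenAI's newer chat models accept a 'developer' role where older models used 'system', and `openai_use_system_role` keeps the legacy name available. A self-contained sketch of the mapping, with a struct standing in for the real config object:

```ruby
# Hypothetical stand-in for @config, just to make the mapping runnable.
Config = Struct.new(:openai_use_system_role)

def format_role(role, config)
  case role
  when :system
    config.openai_use_system_role ? 'system' : 'developer'
  else
    role.to_s
  end
end

format_role(:system, Config.new(false))    # => "developer"
format_role(:system, Config.new(true))     # => "system"
format_role(:assistant, Config.new(false)) # => "assistant"
```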
.parse_completion_response(response) ⇒ Object
```ruby
# File 'lib/ruby_llm/providers/openai/chat.rb', line 41

def parse_completion_response(response)
  data = response.body
  return if data.empty?

  raise Error.new(response, data.dig('error', 'message')) if data.dig('error', 'message')

  message_data = data.dig('choices', 0, 'message')
  return unless message_data

  usage = data['usage'] || {}
  cached_tokens = usage.dig('prompt_tokens_details', 'cached_tokens')

  Message.new(
    role: :assistant,
    content: message_data['content'],
    tool_calls: parse_tool_calls(message_data['tool_calls']),
    input_tokens: usage['prompt_tokens'],
    output_tokens: usage['completion_tokens'],
    cached_tokens: cached_tokens,
    cache_creation_tokens: 0,
    model_id: data['model'],
    raw: response
  )
end
```
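To make the `dig` paths concrete, here is an illustrative response body in the shape this parser expects. The values are invented, but the key layout follows OpenAI's documented Chat Completions schema:

```ruby
# Illustrative response body (values made up; keys per OpenAI's schema).
body = {
  'model' => 'gpt-4o-2024-08-06',
  'choices' => [
    { 'message' => { 'content' => 'Hi there!', 'tool_calls' => nil } }
  ],
  'usage' => {
    'prompt_tokens' => 12,
    'completion_tokens' => 4,
    'prompt_tokens_details' => { 'cached_tokens' => 0 }
  }
}

body.dig('choices', 0, 'message')                           # => { 'content' => 'Hi there!', ... }
body.dig('usage', 'prompt_tokens_details', 'cached_tokens') # => 0
```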
.render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil) ⇒ Object
```ruby
# File 'lib/ruby_llm/providers/openai/chat.rb', line 14

def render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil) # rubocop:disable Metrics/ParameterLists
  payload = {
    model: model.id,
    messages: format_messages(messages),
    stream: stream
  }

  payload[:temperature] = temperature unless temperature.nil?

  payload[:tools] = tools.map { |_, tool| tool_for(tool) } if tools.any?

  if schema
    strict = schema[:strict] != false
    payload[:response_format] = {
      type: 'json_schema',
      json_schema: {
        name: 'response',
        schema: schema,
        strict: strict
      }
    }
  end

  payload[:stream_options] = { include_usage: true } if stream

  payload
end
```
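As a sketch of the result, a streaming structured-output request could render to a payload like the following; the model id, message, and schema are invented for illustration:

```ruby
# Hypothetical rendered payload for a streaming structured-output request;
# model id, message, and schema are invented for illustration.
payload = {
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'List three colors.' }],
  stream: true,
  temperature: 0.7,
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'response',
      schema: { type: 'object', properties: { colors: { type: 'array' } } },
      strict: true # strict defaults to true unless schema[:strict] == false
    }
  },
  stream_options: { include_usage: true } # present only when stream: true
}
```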
Instance Method Details
#completion_url ⇒ Object
```ruby
# File 'lib/ruby_llm/providers/openai/chat.rb', line 8

def completion_url
  'chat/completions'
end
```
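The path is relative; assuming the connection layer joins it to OpenAI's public base URL (an assumption about the surrounding code, not something this file shows), the effective endpoint is:

```ruby
base = 'https://api.openai.com/v1' # assumed default base URL
url  = "#{base}/chat/completions"
# => "https://api.openai.com/v1/chat/completions"
```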