Class: LLM::Clients::OpenAI
- Inherits: Object
- Includes: HTTParty
- Defined in: lib/llm/clients/open_ai.rb
Instance Method Summary
- #chat(messages, options = {}) ⇒ Object
- #initialize(llm:) ⇒ OpenAI constructor
  A new instance of OpenAI.
Constructor Details
#initialize(llm:) ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/llm/clients/open_ai.rb', line 10

def initialize(llm:)
  @llm = llm
end
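A minimal construction sketch, for orientation only: `llm` is assumed to be a model object obtained elsewhere in the library that responds to #canonical_name (the only method #chat relies on); it is not part of this class.

    # Illustrative sketch, not documented API usage.
    require "llm"  # assumed entry point; adjust to the gem's actual require path

    # `llm` is assumed to be a model object responding to #canonical_name,
    # e.g. returning "gpt-4o".
    client = LLM::Clients::OpenAI.new(llm: llm)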
Instance Method Details
#chat(messages, options = {}) ⇒ Object
# File 'lib/llm/clients/open_ai.rb', line 14

def chat(messages, options = {})
  parameters = {
    model: @llm.canonical_name,
    messages: messages,
    temperature: options[:temperature],
    response_format: options[:response_format],
    max_tokens: options[:max_output_tokens],
    top_p: options[:top_p],
    stop: options[:stop_sequences],
    presence_penalty: options[:presence_penalty],
    frequency_penalty: options[:frequency_penalty],
    tools: options[:tools],
    tool_choice: options[:tool_choice]
  }.compact

  return chat_streaming(parameters, options[:on_message], options[:on_complete]) if options[:stream]

  resp = post_url("/chat/completions", body: parameters.to_json)

  Response.new(resp).to_normalized_response
end
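A hedged usage sketch follows. The messages array uses the OpenAI chat message format, which #chat forwards to the API unchanged; the option keys mirror those read in the method body, while the on_message/on_complete callback shapes shown are assumptions rather than documented signatures.

    # Illustrative sketch; `client` comes from the constructor example above.
    messages = [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user",   content: "Say hello." }
    ]

    # Non-streaming call: returns the normalized response object.
    response = client.chat(messages, temperature: 0.2, max_output_tokens: 256)

    # Streaming call: stream: true routes through chat_streaming with the
    # on_message / on_complete callbacks (callback arity assumed here).
    client.chat(
      messages,
      stream: true,
      on_message:  ->(chunk) { print chunk },
      on_complete: ->(final) { puts }
    )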