Class: LLM::OpenAI
- Defined in:
- lib/llm/providers/openai.rb,
lib/llm/providers/openai/audio.rb,
lib/llm/providers/openai/files.rb,
lib/llm/providers/openai/images.rb,
lib/llm/providers/openai/models.rb,
lib/llm/providers/openai/responses.rb,
lib/llm/providers/openai/moderations.rb,
lib/llm/providers/openai/error_handler.rb,
lib/llm/providers/openai/stream_parser.rb,
lib/llm/providers/openai/vector_stores.rb,
lib/llm/providers/openai/request_adapter.rb,
lib/llm/providers/openai/response_adapter.rb,
lib/llm/providers/openai/responses/stream_parser.rb
Overview
The OpenAI class implements a provider for OpenAI.
Defined Under Namespace
Classes: Audio, Files, Images, Models, Moderations, Responses, VectorStores
Constant Summary collapse
- HOST =
"api.openai.com"
Instance Method Summary collapse
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI's audio generation API.
-
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
-
#default_model ⇒ String
Returns the default model for chat completions.
-
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response
Provides an embedding.
-
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI's files API.
-
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI's image generation API.
-
#initialize ⇒ OpenAI
constructor
A new instance of OpenAI.
-
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI's models API.
-
#moderations ⇒ LLM::OpenAI::Moderations
Provides an interface to OpenAI's moderation API.
-
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI's response API.
-
#server_tools ⇒ Hash{Symbol => LLM::ServerTool}
Returns the server tools supported by this provider.
-
#vector_stores ⇒ LLM::OpenAI::VectorStores
Provides an interface to OpenAI's vector store API.
-
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the OpenAI web search tool.
Methods inherited from Provider
#chat, clients, #developer_role, #inspect, #respond, #schema, #server_tool, #system_role, #user_role, #with
Constructor Details
#initialize ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/llm/providers/openai.rb', line 35

def initialize(**)
  super(host: HOST, **)
end
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. For OpenAI this is "assistant".
# File 'lib/llm/providers/openai.rb', line 131

def assistant_role
  "assistant"
end
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI's audio generation API
# File 'lib/llm/providers/openai.rb', line 92

def audio
  LLM::OpenAI::Audio.new(self)
end
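As context, the requests that LLM::OpenAI::Audio issues follow OpenAI's public audio API. A minimal stdlib-only sketch of building (not sending) a speech request; the endpoint path, payload shape, and the model/voice names here follow OpenAI's documented API and are illustrative, and the provider's own auth headers are omitted:

```ruby
require "net/http"
require "json"

# Build (but do not send) a text-to-speech request resembling what the
# audio interface ultimately produces.
req = Net::HTTP::Post.new("/v1/audio/speech", {"content-type" => "application/json"})
req.body = JSON.dump({model: "tts-1", input: "Hello!", voice: "alloy"})
```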
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API
# File 'lib/llm/providers/openai.rb', line 64

def complete(prompt, params = {})
  params, stream, tools, role = normalize_complete_params(params)
  req = build_complete_request(prompt, params, role)
  res = execute(request: req, stream: stream)
  ResponseAdapter.adapt(res, type: :completion)
    .extend(Module.new { define_method(:__tools__) { tools } })
end
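The request that #complete ultimately sends follows the public chat completions API. A stdlib-only sketch of building such a request (the payload shape follows OpenAI's documented API; the provider's internal request building — normalize_complete_params, auth headers — is not reproduced here). "gpt-4.1" matches this provider's #default_model:

```ruby
require "net/http"
require "json"

# Build (but do not send) a chat completions request resembling what
# #complete produces for a single user prompt.
req = Net::HTTP::Post.new("/v1/chat/completions", {"content-type" => "application/json"})
req.body = JSON.dump({
  model: "gpt-4.1",
  messages: [{role: "user", content: "Hello!"}]
})
```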
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/openai.rb', line 139

def default_model
  "gpt-4.1"
end
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response
Provides an embedding
# File 'lib/llm/providers/openai.rb', line 47

def embed(input, model: "text-embedding-3-small", **params)
  req = Net::HTTP::Post.new("/v1/embeddings", headers)
  req.body = LLM.json.dump({input:, model:}.merge!(params))
  res = execute(request: req)
  ResponseAdapter.adapt(res, type: :embedding)
end
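The request construction above can be exercised without the provider by reproducing it with the standard library (JSON stands in for LLM.json, and the provider's auth headers are omitted). The extra keyword params merge straight into the payload; `dimensions` below is one such pass-through option from OpenAI's embeddings API:

```ruby
require "net/http"
require "json"

# Mirror of the request #embed builds: POST /v1/embeddings with the
# input and model in the JSON body, plus any pass-through options.
params = {dimensions: 256}
req = Net::HTTP::Post.new("/v1/embeddings", {"content-type" => "application/json"})
req.body = JSON.dump({input: "hello world", model: "text-embedding-3-small"}.merge!(params))
```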
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI's files API
# File 'lib/llm/providers/openai.rb', line 100

def files
  LLM::OpenAI::Files.new(self)
end
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI's image generation API
# File 'lib/llm/providers/openai.rb', line 84

def images
  LLM::OpenAI::Images.new(self)
end
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI's models API
# File 'lib/llm/providers/openai.rb', line 108

def models
  LLM::OpenAI::Models.new(self)
end
#moderations ⇒ LLM::OpenAI::Moderations
Provides an interface to OpenAI's moderation API
# File 'lib/llm/providers/openai.rb', line 117

def moderations
  LLM::OpenAI::Moderations.new(self)
end
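The requests LLM::OpenAI::Moderations issues follow OpenAI's public moderations API. A stdlib-only sketch of building (not sending) one; the model name follows OpenAI's documented API and the provider's auth headers are omitted:

```ruby
require "net/http"
require "json"

# Build (but do not send) a moderation request: a single text input
# checked against the moderation model.
req = Net::HTTP::Post.new("/v1/moderations", {"content-type" => "application/json"})
req.body = JSON.dump({model: "omni-moderation-latest", input: "I want to say something nice"})
```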
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI's response API
# File 'lib/llm/providers/openai.rb', line 76

def responses
  LLM::OpenAI::Responses.new(self)
end
#server_tools ⇒ Hash{Symbol => LLM::ServerTool}
This method returns server tools whose configuration options are easier to set through the LLM::Provider#server_tool method.
# File 'lib/llm/providers/openai.rb', line 149

def server_tools
  {
    web_search: server_tool(:web_search),
    file_search: server_tool(:file_search),
    image_generation: server_tool(:image_generation),
    code_interpreter: server_tool(:code_interpreter),
    computer_use: server_tool(:computer_use)
  }
end
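The return value is a plain Hash keyed by tool name. Its shape can be sketched in plain Ruby, with symbols standing in for the LLM::ServerTool instances that LLM::Provider#server_tool would build:

```ruby
# Symbols stand in for LLM::ServerTool instances here; the real method
# maps each name through LLM::Provider#server_tool.
TOOL_NAMES = %i[web_search file_search image_generation code_interpreter computer_use]
server_tools = TOOL_NAMES.to_h { |name| [name, name] }
```

A caller typically picks one entry out of the hash, e.g. `server_tools[:web_search]`, and passes it in a `tools:` array.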
#vector_stores ⇒ LLM::OpenAI::VectorStores
Provides an interface to OpenAI's vector store API
# File 'lib/llm/providers/openai.rb', line 125

def vector_stores
  LLM::OpenAI::VectorStores.new(self)
end
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the OpenAI web search tool.
# File 'lib/llm/providers/openai.rb', line 168

def web_search(query:)
  ResponseAdapter.adapt(
    responses.create(query, store: false, tools: [server_tools[:web_search]]),
    type: :web_search
  )
end
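The convenience method amounts to a single Responses API call with store: false and the web search server tool attached. The parameters it forwards can be sketched as plain data (a Hash literal stands in for the LLM::ServerTool object, and the `input` field name is an assumption drawn from the Responses API, not from this provider's code):

```ruby
# Parameters #web_search effectively forwards to responses.create,
# sketched as plain data.
query = "latest stable Ruby release"
request = {input: query, store: false, tools: [{type: "web_search"}]}
```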