Class: OpenAI::Resources::Chat::Completions
- Inherits: Object
- Defined in:
  lib/openai/resources/chat/completions.rb,
  lib/openai/resources/chat/completions/messages.rb
Defined Under Namespace
Classes: Messages
Instance Attribute Summary
Instance Method Summary
- #create(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
See #stream_raw for streaming counterpart.
- #delete(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletionDeleted
Delete a stored chat completion.
- #initialize(client:) ⇒ Completions (constructor, private)
A new instance of Completions.
- #list(after: nil, limit: nil, metadata: nil, model: nil, order: nil, request_options: {}) ⇒ OpenAI::Internal::CursorPage<OpenAI::Models::Chat::ChatCompletion>
Some parameter documentation has been truncated; see Models::Chat::CompletionListParams for more details.
- #retrieve(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Get a stored chat completion.
- #stream ⇒ void
- #stream_raw(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Internal::Stream<OpenAI::Models::Chat::ChatCompletionChunk>
See #create for non-streaming counterpart.
- #update(completion_id, metadata:, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Some parameter documentation has been truncated; see Models::Chat::CompletionUpdateParams for more details.
Constructor Details
#initialize(client:) ⇒ Completions
This method is part of a private API. You should avoid using this method if possible, as it may be removed or changed in the future.
Returns a new instance of Completions.
# File 'lib/openai/resources/chat/completions.rb', line 431

def initialize(client:)
  @client = client
  @messages = OpenAI::Resources::Chat::Completions::Messages.new(client: client)
end
Instance Attribute Details
#messages ⇒ OpenAI::Resources::Chat::Completions::Messages (readonly)
# File 'lib/openai/resources/chat/completions.rb', line 8

def messages
  @messages
end
Instance Method Details
#create(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
See #stream_raw for streaming counterpart.
Some parameter documentation has been truncated; see Models::Chat::CompletionCreateParams for more details.
Starting a new project? We recommend trying Responses to take advantage of the latest OpenAI platform features. Compare Chat Completions with Responses.
Creates a model response for the given chat conversation. Learn more in the text generation, vision, and audio guides.
Parameter support can differ depending on the model used to generate the response, particularly for newer reasoning models. Parameters that are only supported for reasoning models are noted below. For the current state of unsupported parameters in reasoning models, refer to the reasoning guide.
# File 'lib/openai/resources/chat/completions.rb', line 104

def create(params)
  parsed, options = OpenAI::Chat::CompletionCreateParams.dump_request(params)
  if parsed[:stream]
    message = "Please use `#stream_raw` for the streaming use case."
    raise ArgumentError.new(message)
  end

  model = nil
  tool_models = {}
  case parsed
  in {response_format: OpenAI::StructuredOutput::JsonSchemaConverter => model}
    parsed.update(
      response_format: {
        type: :json_schema,
        json_schema: {
          strict: true,
          name: model.name.split("::").last,
          schema: model.to_json_schema
        }
      }
    )
  in {response_format: {type: :json_schema, json_schema: OpenAI::StructuredOutput::JsonSchemaConverter => model}}
    parsed.fetch(:response_format).update(
      json_schema: {
        strict: true,
        name: model.name.split("::").last,
        schema: model.to_json_schema
      }
    )
  in {response_format: {type: :json_schema, json_schema: {schema: OpenAI::StructuredOutput::JsonSchemaConverter => model}}}
    parsed.dig(:response_format, :json_schema).store(:schema, model.to_json_schema)
  in {tools: Array => tools}
    mapped = tools.map do |tool|
      case tool
      in OpenAI::StructuredOutput::JsonSchemaConverter
        name = tool.name.split("::").last
        tool_models.store(name, tool)
        {
          type: :function,
          function: {
            strict: true,
            name: name,
            parameters: tool.to_json_schema
          }
        }
      in {function: {parameters: OpenAI::StructuredOutput::JsonSchemaConverter => params}}
        func = tool.fetch(:function)
        name = func[:name] ||= params.name.split("::").last
        tool_models.store(name, params)
        func.update(parameters: params.to_json_schema)
        tool
      else
        tool
      end
    end
    tools.replace(mapped)
  else
  end

  # rubocop:disable Metrics/BlockLength
  unwrap = ->(raw) do
    if model.is_a?(OpenAI::StructuredOutput::JsonSchemaConverter)
      raw[:choices]&.each do |choice|
        message = choice.fetch(:message)
        begin
          parsed = JSON.parse(message.fetch(:content), symbolize_names: true)
        rescue JSON::ParserError => e
          parsed = e
        end
        coerced = OpenAI::Internal::Type::Converter.coerce(model, parsed)
        message.store(:parsed, coerced)
      end
    end
    raw[:choices]&.each do |choice|
      choice.dig(:message, :tool_calls)&.each do |tool_call|
        func = tool_call.fetch(:function)
        next if (model = tool_models[func.fetch(:name)]).nil?
        begin
          parsed = JSON.parse(func.fetch(:arguments), symbolize_names: true)
        rescue JSON::ParserError => e
          parsed = e
        end
        coerced = OpenAI::Internal::Type::Converter.coerce(model, parsed)
        func.store(:parsed, coerced)
      end
    end
    raw
  end
  # rubocop:enable Metrics/BlockLength

  @client.request(
    method: :post,
    path: "chat/completions",
    body: parsed,
    unwrap: unwrap,
    model: OpenAI::Chat::ChatCompletion,
    options: options
  )
end
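For illustration only, a minimal non-streaming call might look like the sketch below. The client construction, model name, and prompt are assumptions for the example, not part of this page:

```ruby
require "openai" # assumes the official openai gem is installed

# The key is passed explicitly here; the client can also read it from ENV.
client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])

completion = client.chat.completions.create(
  model: "gpt-4o",
  messages: [{role: :user, content: "Say hello in five words or fewer."}]
)

# The return value is an OpenAI::Models::Chat::ChatCompletion.
puts completion.choices.first.message.content
```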
#delete(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletionDeleted
Delete a stored chat completion. Only Chat Completions that have been created with the store parameter set to true can be deleted.
# File 'lib/openai/resources/chat/completions.rb', line 419

def delete(completion_id, params = {})
  @client.request(
    method: :delete,
    path: ["chat/completions/%1$s", completion_id],
    model: OpenAI::Chat::ChatCompletionDeleted,
    options: params[:request_options]
  )
end
#list(after: nil, limit: nil, metadata: nil, model: nil, order: nil, request_options: {}) ⇒ OpenAI::Internal::CursorPage<OpenAI::Models::Chat::ChatCompletion>
Some parameter documentation has been truncated; see Models::Chat::CompletionListParams for more details.
List stored Chat Completions. Only Chat Completions that have been stored with the store parameter set to true will be returned.
# File 'lib/openai/resources/chat/completions.rb', line 395

def list(params = {})
  parsed, options = OpenAI::Chat::CompletionListParams.dump_request(params)
  @client.request(
    method: :get,
    path: "chat/completions",
    query: parsed,
    page: OpenAI::Internal::CursorPage,
    model: OpenAI::Chat::ChatCompletion,
    options: options
  )
end
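Because #list returns a CursorPage, results can be walked across page boundaries. A sketch, assuming the page object exposes `auto_paging_each` as pages do elsewhere in this SDK, with the client setup as a placeholder:

```ruby
require "openai" # assumes the official openai gem is installed

client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])

page = client.chat.completions.list(limit: 20, order: :desc)

# auto_paging_each follows the `after` cursor until the listing is exhausted.
page.auto_paging_each do |completion|
  puts completion.id
end
```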
#retrieve(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Get a stored chat completion. Only Chat Completions that have been created with the store parameter set to true will be returned.
# File 'lib/openai/resources/chat/completions.rb', line 334

def retrieve(completion_id, params = {})
  @client.request(
    method: :get,
    path: ["chat/completions/%1$s", completion_id],
    model: OpenAI::Chat::ChatCompletion,
    options: params[:request_options]
  )
end
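The `path: ["chat/completions/%1$s", completion_id]` argument above pairs a format string with its arguments; presumably the client expands the template much as Kernel#format would (the expansion below is an illustration of the positional specifier, not the client's actual internals, which may also percent-encode the ID):

```ruby
# Hypothetical expansion of the path template used by #retrieve and friends.
completion_id = "chatcmpl_abc123"
path = format("chat/completions/%1$s", completion_id)
puts path # => chat/completions/chatcmpl_abc123
```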
#stream ⇒ void
# File 'lib/openai/resources/chat/completions.rb', line 206

def stream
  raise NotImplementedError.new("higher level helpers are coming soon!")
end
#stream_raw(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Internal::Stream<OpenAI::Models::Chat::ChatCompletionChunk>
See #create for non-streaming counterpart.
Some parameter documentation has been truncated; see Models::Chat::CompletionCreateParams for more details.
Starting a new project? We recommend trying Responses to take advantage of the latest OpenAI platform features. Compare Chat Completions with Responses.
Creates a model response for the given chat conversation. Learn more in the text generation, vision, and audio guides.
Parameter support can differ depending on the model used to generate the response, particularly for newer reasoning models. Parameters that are only supported for reasoning models are noted below. For the current state of unsupported parameters in reasoning models, refer to the reasoning guide.
# File 'lib/openai/resources/chat/completions.rb', line 304

def stream_raw(params)
  parsed, options = OpenAI::Chat::CompletionCreateParams.dump_request(params)
  unless parsed.fetch(:stream, true)
    message = "Please use `#create` for the non-streaming use case."
    raise ArgumentError.new(message)
  end
  parsed.store(:stream, true)
  @client.request(
    method: :post,
    path: "chat/completions",
    headers: {"accept" => "text/event-stream"},
    body: parsed,
    stream: OpenAI::Internal::Stream,
    model: OpenAI::Chat::ChatCompletionChunk,
    options: options
  )
end
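A streaming sketch, again with placeholder client setup, model, and prompt; each yielded item is a ChatCompletionChunk whose choices carry incremental deltas:

```ruby
require "openai" # assumes the official openai gem is installed

client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])

stream = client.chat.completions.stream_raw(
  model: "gpt-4o",
  messages: [{role: :user, content: "Count to five."}]
)

# Chunks arrive as server-sent events; the final chunk may have no choices,
# hence the safe navigation when reading the delta.
stream.each do |chunk|
  print chunk.choices.first&.delta&.content
end
```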
#update(completion_id, metadata:, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Some parameter documentation has been truncated; see Models::Chat::CompletionUpdateParams for more details.
Modify a stored chat completion. Only Chat Completions that have been created with the store parameter set to true can be modified. Currently, the only supported modification is to update the metadata field.
# File 'lib/openai/resources/chat/completions.rb', line 361

def update(completion_id, params)
  parsed, options = OpenAI::Chat::CompletionUpdateParams.dump_request(params)
  @client.request(
    method: :post,
    path: ["chat/completions/%1$s", completion_id],
    body: parsed,
    model: OpenAI::Chat::ChatCompletion,
    options: options
  )
end
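A sketch of updating metadata on a stored completion; the completion ID and metadata key are placeholders, and the call only succeeds for completions created with store: true:

```ruby
require "openai" # assumes the official openai gem is installed

client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])

# Metadata values are short strings; metadata is the only mutable field.
updated = client.chat.completions.update(
  "chatcmpl_abc123",
  metadata: {conversation: "support-thread-42"}
)

puts updated.metadata
```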