Class: Roseflow::OpenAI::Client
- Inherits: Object
- Defined in: lib/roseflow/openai/client.rb
Instance Method Summary
- #create_chat_completion(model:, messages:, **options) ⇒ OpenAI::TextApiResponse
  Creates a chat completion.
- #create_completion(model:, prompt:, **options) ⇒ OpenAI::TextApiResponse
  Creates a text completion for the provided prompt and parameters.
- #create_edit(model:, instruction:, **options) ⇒ OpenAI::TextApiResponse
  Given a prompt and an instruction, the model will return an edited version of the prompt.
- #create_embedding(model:, input:) ⇒ OpenAI::EmbeddingApiResponse
  Creates an embedding vector representing the input text.
- #create_image(prompt:, **options) ⇒ OpenAI::ImageApiResponse
  Creates an image for the given prompt.
- #initialize(config = Config.new, provider = nil) ⇒ Client (constructor)
  A new instance of Client.
- #models ⇒ Array<OpenAI::Model>
  Returns the available models from the API.
- #post(operation) {|String| ... } ⇒ OpenAI::Response
  Posts an operation to the API.
- #streaming_chat_completion(model:, messages:, **options) {|String| ... } ⇒ Array<String>
  Creates a chat completion and streams the response.
- #streaming_completion(model:, prompt:, **options) {|String| ... } ⇒ Array<String>
  Creates a text completion for the provided prompt and parameters and streams the response.
Instance Method Details
#create_chat_completion(model:, messages:, **options) ⇒ OpenAI::TextApiResponse
Creates a chat completion.
# File 'lib/roseflow/openai/client.rb', line 59

def create_chat_completion(model:, messages:, **options)
  response = connection.post("/v1/chat/completions") do |request|
    request.body = options.merge({
      model: model.name,
      messages: messages,
    })
  end
  ChatResponse.new(response)
end
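Note the merge order in the body construction: the hash literal is the argument to `options.merge`, so the explicit `model` and `messages` keys always override same-named keys a caller might pass via `**options`. A minimal plain-Ruby sketch of that precedence (values here are illustrative):

```ruby
# Caller-supplied options, including a conflicting :model key.
options = { temperature: 0.7, model: "caller-model" }

# Same merge shape as create_chat_completion: explicit keys win.
body = options.merge({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello" }],
})

body[:model]        # "gpt-3.5-turbo" (the :model from options is discarded)
body[:temperature]  # 0.7 (other options pass through untouched)
```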
#create_completion(model:, prompt:, **options) ⇒ OpenAI::TextApiResponse
Creates a text completion for the provided prompt and parameters.
# File 'lib/roseflow/openai/client.rb', line 99

def create_completion(model:, prompt:, **options)
  response = connection.post("/v1/completions") do |request|
    request.body = options.merge({
      model: model.name,
      prompt: prompt,
    })
  end
  CompletionResponse.new(response)
end
#create_edit(model:, instruction:, **options) ⇒ OpenAI::TextApiResponse
Given a prompt and an instruction, the model will return an edited version of the prompt.
# File 'lib/roseflow/openai/client.rb', line 138

def create_edit(model:, instruction:, **options)
  response = connection.post("/v1/edits") do |request|
    request.body = options.merge({
      model: model.name,
      instruction: instruction,
    })
  end
  EditResponse.new(response)
end
#create_embedding(model:, input:) ⇒ OpenAI::EmbeddingApiResponse
Creates an embedding vector representing the input text.
# File 'lib/roseflow/openai/client.rb', line 161

def create_embedding(model:, input:)
  EmbeddingApiResponse.new(
    connection.post("/v1/embeddings") do |request|
      request.body = {
        model: model.name,
        input: input,
      }
    end
  )
end
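Unlike the other endpoints, `create_embedding` accepts no extra options; the request body carries only the two required fields. Per the OpenAI embeddings API, `input` may be a single string or an array of strings, so batch requests use the same body shape (the model name below is illustrative):

```ruby
# Single input vs. batched input: only :input changes shape.
single = { model: "text-embedding-ada-002", input: "hello world" }
batch  = { model: "text-embedding-ada-002", input: ["hello", "world"] }
```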
#create_image(prompt:, **options) ⇒ Object
# File 'lib/roseflow/openai/client.rb', line 148

def create_image(prompt:, **options)
  ImageApiResponse.new(
    connection.post("/v1/images/generations") do |request|
      request.body = options.merge(prompt: prompt)
    end
  )
end
#models ⇒ Array<OpenAI::Model>
Returns the available models from the API.
# File 'lib/roseflow/openai/client.rb', line 28

def models
  response = connection.get("/v1/models")
  body = JSON.parse(response.body)
  body.fetch("data", []).map do |model|
    OpenAI::Model.new(model, self)
  end
end
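`models` expects the standard OpenAI list envelope, `{"object": "list", "data": [...]}`; using `fetch("data", [])` returns an empty array rather than raising if the key is absent. A sketch of the parsing step against a canned payload (trimmed to the fields used here):

```ruby
require "json"

# Simulated /v1/models response body.
raw = '{"object":"list","data":[{"id":"gpt-3.5-turbo"},{"id":"text-davinci-003"}]}'

body = JSON.parse(raw)
ids = body.fetch("data", []).map { |model| model["id"] }
# ids == ["gpt-3.5-turbo", "text-davinci-003"]

# A payload without "data" degrades to an empty list instead of raising.
JSON.parse('{}').fetch("data", [])  # []
```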
#post(operation) {|String| ... } ⇒ OpenAI::Response
Posts an operation to the API.
# File 'lib/roseflow/openai/client.rb', line 41

def post(operation, &block)
  response = connection.post(operation.path) do |request|
    request.body = operation.body
    if operation.stream
      request.options.on_data = Proc.new do |chunk|
        yield chunk if block_given?
      end
    end
  end
  response unless block_given?
end
#streaming_chat_completion(model:, messages:, **options) {|String| ... } ⇒ Array<String>
Creates a chat completion and streams the response.
# File 'lib/roseflow/openai/client.rb', line 76

def streaming_chat_completion(model:, messages:, **options, &block)
  streamed = []
  connection.post("/v1/chat/completions") do |request|
    options.delete(:streaming)
    request.body = options.merge({
      model: model.name,
      messages: messages,
      stream: true,
    })
    request.options.on_data = Proc.new do |chunk|
      yield streaming_chunk(chunk) if block_given?
      streamed << chunk unless block_given?
    end
  end
  streamed unless block_given?
end
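The return value depends on whether a block is passed: with a block, each raw chunk is transformed by `streaming_chunk` (a private helper not shown on this page) and yielded as it arrives, and the method returns `nil`; without a block, the raw chunks are accumulated and returned as an array. A plain-Ruby sketch of that dispatch pattern, with the HTTP stream simulated by a chunk list:

```ruby
# Simulates the on_data dispatch used by the streaming methods:
# yield each chunk when a block is given, else collect and return them.
def stream_like(chunks, &block)
  streamed = []
  chunks.each do |chunk|
    if block
      block.call(chunk)   # streaming mode: hand chunk to the caller
    else
      streamed << chunk   # collecting mode: accumulate
    end
  end
  streamed unless block
end

parts = []
stream_like(%w[Hel lo]) { |c| parts << c }  # returns nil; parts == ["Hel", "lo"]
stream_like(%w[Hel lo])                     # ["Hel", "lo"]
```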
#streaming_completion(model:, prompt:, **options) {|String| ... } ⇒ Array<String>
Creates a text completion for the provided prompt and parameters and streams the response.
# File 'lib/roseflow/openai/client.rb', line 116

def streaming_completion(model:, prompt:, **options, &block)
  streamed = []
  connection.post("/v1/completions") do |request|
    request.body = options.merge({
      model: model.name,
      prompt: prompt,
      stream: true,
    })
    request.options.on_data = Proc.new do |chunk|
      yield streaming_chunk(chunk) if block_given?
      streamed << chunk unless block_given?
    end
  end
  streamed unless block_given?
end