Class: Roseflow::OpenAI::Client

Inherits: Object
Defined in:
lib/roseflow/openai/client.rb


Constructor Details

#initialize(config = Config.new, provider = nil) ⇒ Client

Returns a new instance of Client.



# File 'lib/roseflow/openai/client.rb', line 20

def initialize(config = Config.new, provider = nil)
  @config = config
  @provider = provider
end

Instance Method Details

#create_chat_completion(model:, messages:, **options) ⇒ OpenAI::TextApiResponse

Creates a chat completion.

Parameters:

  • model (Roseflow::OpenAI::Model)

    the model to use

  • messages (Array<Hash>)

    the messages to send (hashes with role and content keys)

  • options (Hash)

    additional options merged into the request body

Returns:

  • (OpenAI::TextApiResponse)

    the API response object

# File 'lib/roseflow/openai/client.rb', line 59

def create_chat_completion(model:, messages:, **options)
  response = connection.post("/v1/chat/completions") do |request|
    request.body = options.merge({
      model: model.name,
      messages: messages,
    })
  end
  ChatResponse.new(response)
end
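The request body above is assembled by plain hash merging: because the `model:` and `messages:` keys come second in the merge, they always win over conflicting keys in the caller's options. A minimal sketch of that behavior (the model name is illustrative):

```ruby
# Caller options are merged with the required keys; the required keys
# passed to #merge overwrite any conflicting keys in the options hash.
options = { temperature: 0.7, model: "ignored-override" }
body = options.merge({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello" }],
})
# body[:model]       => "gpt-3.5-turbo" (the merge overwrote the option)
# body[:temperature] => 0.7
```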

#create_completion(model:, prompt:, **options) ⇒ OpenAI::TextApiResponse

Creates a text completion for the provided prompt and parameters.

Parameters:

  • model (Roseflow::OpenAI::Model)

    the model to use

  • prompt (String)

    the prompt to use

  • options (Hash)

    additional options merged into the request body

Returns:

  • (OpenAI::TextApiResponse)

    the API response object

# File 'lib/roseflow/openai/client.rb', line 99

def create_completion(model:, prompt:, **options)
  response = connection.post("/v1/completions") do |request|
    request.body = options.merge({
      model: model.name,
      prompt: prompt,
    })
  end
  CompletionResponse.new(response)
end

#create_edit(model:, instruction:, **options) ⇒ OpenAI::TextApiResponse

Given a prompt and an instruction, the model will return an edited version of the prompt.

Parameters:

  • model (Roseflow::OpenAI::Model)

    the model to use

  • instruction (String)

    the instruction to use

  • options (Hash)

    additional options merged into the request body

Returns:

  • (OpenAI::TextApiResponse)

    the API response object

# File 'lib/roseflow/openai/client.rb', line 138

def create_edit(model:, instruction:, **options)
  response = connection.post("/v1/edits") do |request|
    request.body = options.merge({
      model: model.name,
      instruction: instruction,
    })
  end
  EditResponse.new(response)
end

#create_embedding(model:, input:) ⇒ OpenAI::EmbeddingApiResponse

Creates an embedding vector representing the input text.

Parameters:

  • model (Roseflow::OpenAI::Model)

    the model to use

  • input (String)

    the input text to embed

Returns:

  • (OpenAI::EmbeddingApiResponse)

    the API response object

# File 'lib/roseflow/openai/client.rb', line 161

def create_embedding(model:, input:)
  EmbeddingApiResponse.new(
    connection.post("/v1/embeddings") do |request|
      request.body = {
        model: model.name,
        input: input,
      }
    end
  )
end
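A common use for the returned embedding vectors is similarity comparison. This is a hypothetical post-processing sketch, not part of the client itself: cosine similarity between two embedding vectors represented as plain float arrays.

```ruby
# Cosine similarity of two equal-length float arrays: the dot product
# divided by the product of the vector norms. Returns 1.0 for identical
# directions, 0.0 for orthogonal vectors.
def cosine_similarity(a, b)
  dot   = a.zip(b).sum { |x, y| x * y }
  norms = Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x })
  dot / norms
end

cosine_similarity([1.0, 0.0], [1.0, 0.0]) # => 1.0
```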

#create_image(prompt:, **options) ⇒ ImageApiResponse

Creates an image for the provided prompt.

Parameters:

  • prompt (String)

    the prompt to use

  • options (Hash)

    additional options merged into the request body

Returns:

  • (ImageApiResponse)

    the API response object


# File 'lib/roseflow/openai/client.rb', line 148

def create_image(prompt:, **options)
  ImageApiResponse.new(
    connection.post("/v1/images/generations") do |request|
      request.body = options.merge(prompt: prompt)
    end
  )
end

#models ⇒ Array<OpenAI::Model>

Returns the available models from the API.

Returns:

  • (Array<OpenAI::Model>)

    the available models

# File 'lib/roseflow/openai/client.rb', line 28

def models
  response = connection.get("/v1/models")
  body = JSON.parse(response.body)
  body.fetch("data", []).map do |model|
    OpenAI::Model.new(model, self)
  end
end
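#models expects the API to return a JSON object whose "data" key holds an array of model descriptions. A sketch of that parsing step against a made-up payload (the model ids are illustrative):

```ruby
require "json"

# The same fetch-with-default pattern as in #models: an absent "data"
# key yields an empty array instead of raising.
body = JSON.parse('{"data": [{"id": "gpt-3.5-turbo"}, {"id": "text-davinci-003"}]}')
names = body.fetch("data", []).map { |model| model["id"] }
# names => ["gpt-3.5-turbo", "text-davinci-003"]
```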

#post(operation) {|String| ... } ⇒ OpenAI::Response

Posts an operation to the API.

Parameters:

  • operation (OpenAI::Operation)

    the operation to post

Yields:

  • (String)

    the streamed API response

Returns:

  • (OpenAI::Response)

    the API response object if no block is given



# File 'lib/roseflow/openai/client.rb', line 41

def post(operation, &block)
  response = connection.post(operation.path) do |request|
    request.body = operation.body
    if operation.stream
      request.options.on_data = Proc.new do |chunk|
        yield chunk if block_given?
      end
    end
  end
  response unless block_given?
end
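The chunks yielded by #post when streaming are raw server-sent-event (SSE) lines of the form "data: {json}", terminated by a "data: [DONE]" sentinel. The client's streaming_chunk helper is not shown on this page, so the parser below is an illustrative stand-in, not its actual implementation:

```ruby
require "json"

# Parse one raw SSE chunk into an array of decoded JSON payloads,
# skipping blank lines and the "[DONE]" sentinel.
def parse_sse_chunk(chunk)
  chunk.split("\n").filter_map do |line|
    next unless line.start_with?("data: ")
    payload = line.delete_prefix("data: ").strip
    next if payload.empty? || payload == "[DONE]"
    JSON.parse(payload)
  end
end

chunks = parse_sse_chunk("data: {\"id\":\"1\"}\n\ndata: [DONE]\n")
# chunks => [{"id" => "1"}]
```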

#streaming_chat_completion(model:, messages:, **options) {|String| ... } ⇒ Array<String>

Creates a chat completion and streams the response.

Parameters:

  • model (Roseflow::OpenAI::Model)

    the model to use

  • messages (Array<Hash>)

    the messages to send (hashes with role and content keys)

  • options (Hash)

    additional options merged into the request body

Yields:

  • (String)

    the streamed API response

Returns:

  • (Array<String>)

    the streamed API response if no block is given



# File 'lib/roseflow/openai/client.rb', line 76

def streaming_chat_completion(model:, messages:, **options, &block)
  streamed = []
  connection.post("/v1/chat/completions") do |request|
    options.delete(:streaming)
    request.body = options.merge({
      model: model.name,
      messages: messages,
      stream: true,
    })
    request.options.on_data = Proc.new do |chunk|
      yield streaming_chunk(chunk) if block_given?
      streamed << chunk unless block_given?
    end
  end
  streamed unless block_given?
end

#streaming_completion(model:, prompt:, **options) {|String| ... } ⇒ Array<String>

Creates a text completion for the provided prompt and parameters and streams the response.

Parameters:

  • model (Roseflow::OpenAI::Model)

    the model to use

  • prompt (String)

    the prompt to use

  • options (Hash)

    additional options merged into the request body

Yields:

  • (String)

    the streamed API response

Returns:

  • (Array<String>)

    the streamed API response if no block is given



# File 'lib/roseflow/openai/client.rb', line 116

def streaming_completion(model:, prompt:, **options, &block)
  streamed = []
  connection.post("/v1/completions") do |request|
    request.body = options.merge({
      model: model.name,
      prompt: prompt,
      stream: true,
    })
    request.options.on_data = Proc.new do |chunk|
      yield streaming_chunk(chunk) if block_given?
      streamed << chunk unless block_given?
    end
  end
  streamed unless block_given?
end