Autoflux::OpenAI
This gem implements an Autoflux agent that uses OpenAI as the backend.
Installation
Install the gem and add to the application's Gemfile by executing:
bundle add autoflux-openai
If bundler is not being used to manage dependencies, install the gem by executing:
gem install autoflux-openai
Usage
To use the agent, set the OPENAI_API_KEY environment variable so the agent can authenticate with OpenAI.
agent = Autoflux::OpenAI::Agent.new(model: "gpt-4o-mini")
res = agent.call("Hello, world!")
# => "Hello, world!" from OpenAI
The agent returns the content of the response, following the interface of the autoflux agent.
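Because the agent keeps the conversation history in its memory (see the Memory section below), repeated calls on the same agent continue the same conversation. A minimal sketch, assuming the default in-memory array:
agent = Autoflux::OpenAI::Agent.new(model: "gpt-4o-mini")
agent.call("My favorite color is green.")
res = agent.call("What is my favorite color?")
# The second call can refer back to the first, since both turns are stored in memory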
If you want to use multiple agents, you can give each agent a name.
agent = Autoflux::OpenAI::Agent.new(name: "shopping", model: "gpt-4o-mini")
res = agent.call("Hello, world!")
# => "Hello, world!" from OpenAI
Tool
You can attach tools to the agent to give it more capabilities.
uppercase = Autoflux::OpenAI::Tool.new(
  name: "uppercase",
  description: "Convert the content to uppercase",
  parameters: {
    type: "object",
    properties: {
      text: { type: "string" }
    }
  }
) do |params|
  { text: params[:text].upcase }
end
agent = Autoflux::OpenAI::Agent.new(
  model: "gpt-4o-mini",
  tools: [uppercase],
  memory: [
    { role: "system", content: "Always transform the user input and don't do anything else." }
  ]
)
res = agent.call("Hello, world!")
# => "HELLO, WORLD!" from OpenAI
Client
This gem embeds a lightweight client for the OpenAI API, which can also be used to call the API directly.
client = Autoflux::OpenAI::Client.new(api_key: "your-api-key")
res = client.call(
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello, world!" }]
)
# => { choices: [{ message: { role: "assistant", content: "Hello World!" }}] }
If your API key or endpoint is not the default, you can specify them when building the client.
client = Autoflux::OpenAI::Client.new(
  api_key: "your-api-key",
  endpoint: URI("https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/openai")
)

client.call(
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello, world!" }]
)
Memory
By default, the agent uses a Ruby array to store the conversation history. If you want to use a different memory store, you can implement the Autoflux::OpenAI::_Memory interface and pass it to the agent.
class MyMemory
  def initialize
    @store = []
  end

  # Store a message hash, e.g. { role: "user", content: "Hello" }
  def push(message)
    @store.push(message)
  end

  def <<(message)
    push(message)
  end

  # Messages handed to the model; this example keeps only the last 100
  def to_a
    @store.last(100)
  end
end
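Then pass an instance of your store to the agent via the memory: option, in place of the default array (a sketch reusing the constructor shown earlier):
agent = Autoflux::OpenAI::Agent.new(
  model: "gpt-4o-mini",
  memory: MyMemory.new
)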
Development
After checking out the repo, run bin/setup to install dependencies. Then, run rake spec to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.
To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and the created tag, and push the .gem file to rubygems.org.
Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/elct9620/autoflux-openai.
License
The gem is available as open source under the terms of the Apache License 2.0.