GenAI

✨ Generative AI toolset for Ruby ✨

GenAI makes it easy to integrate Generative AI model providers like OpenAI, Google Vertex AI, and Stability AI. Add Large Language Models, Stable Diffusion image generation, and other AI model integrations to your application!


Installation

Install the gem and add to the application's Gemfile by executing:

$ bundle add gen-ai

If bundler is not being used to manage dependencies, install the gem by executing:

$ gem install gen-ai

Usage

Require it in your code:

require 'gen_ai'

Feature support

✅ - Supported | ❌ - Not supported | 🛠️ - Work in progress

Language models capabilities

| Provider      | Embedding | Completion | Conversation | Sentiment | Summarization |
|---------------|-----------|------------|--------------|-----------|---------------|
| OpenAI        | ✅        | ✅         | ✅           | 🛠️        | 🛠️            |
| Google Palm2  |           |            |              | 🛠️        | 🛠️            |
| Google Gemini |           |            | 🛠️           | 🛠️        | 🛠️            |
| Anthropic     |           |            |              | 🛠️        | 🛠️            |

Image generation model capabilities

| Provider    | Generate | Variations | Edit | Upscale |
|-------------|----------|------------|------|---------|
| OpenAI      | ✅       | ✅         | ✅   |         |
| StabilityAI |          |            |      |         |

Language

Instantiate a language model client by passing a provider name and an API token.

model = GenAI::Language.new(:open_ai, ENV['OPEN_AI_TOKEN'])
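
Clients for the other providers in the capabilities table are created the same way. A quick sketch, assuming the provider symbols follow the naming pattern above; check the gem documentation for the exact identifiers:

# Provider symbols and token variables below are assumptions, not confirmed identifiers.
palm = GenAI::Language.new(:google_palm, ENV['GOOGLE_PALM_TOKEN'])
claude = GenAI::Language.new(:anthropic, ENV['ANTHROPIC_TOKEN'])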

Generate embedding(s) for text using the provider/model that fits your needs:

result = model.embed('Hi, how are you?')
# => #<GenAI::Result:0x0000000110be6f20...>

result.value
# =>  [-0.013577374, 0.0021624255, 0.0019274801, ... ]

result = model.embed(['Hello', 'Bonjour', 'Cześć'])
# => #<GenAI::Result:0x0000000110be6f34...>

result.values
# =>  [[-0.021834826, -0.007176527, -0.02836839, ... ], [...], [...]]
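
Embeddings are plain arrays of floats, so you can work with them directly in Ruby. A minimal sketch, assuming all you need is cosine similarity to compare two embedded texts:

# Compare two texts by the cosine similarity of their embeddings.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

embeddings = model.embed(['I love Ruby', 'Ruby is my favourite language']).values
cosine_similarity(embeddings[0], embeddings[1])
# => a score close to 1.0 for semantically similar texts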

Generate text completions using Large Language Models

result = model.complete('London is a ', temperature: 0, max_tokens: 11)
# => #<GenAI::Result:0x0000000110be6d21...>

result.value
# => "vibrant and diverse city located in the United Kingdom"


result = model.complete('London is a ', max_tokens: 12, n: 2)
# => #<GenAI::Result:0x0000000110c25c70...>

result.values
# => ["thriving, bustling city known for its rich history.", "major global city and the capital of the United Kingdom."]

Chat

Have a conversation with a Large Language Model and build your own AI chatbot.

Setting a context for the conversation is optional, but it helps the model understand the topic.

chat = GenAI::Chat.new(:open_ai, ENV['OPEN_AI_TOKEN'])
chat.start(context: "You are a chat bot named Erl")
chat.message("Hi, what's your name")
# = >#<GenAI::Result:0x0000000106ff3d20...>

result.value
# => "I am a chatbot and you can call me Erl. How can I help you?""

Provide a history of the conversation to help the model understand its context.

history = [
    {role: 'user', content: 'What is the capital of Great Britain?'},
    {role: 'assistant', content: 'London'},
]

chat = GenAI::Chat.new(:open_ai, ENV['OPEN_AI_TOKEN'])
chat.start(history: history)
result = chat.message("what about France?")
# => #<GenAI::Result:0x00000001033c3bc0...>

result.value
# => "Paris"

Image

Instantiate an image generation model client by passing a provider name and an API token.

model = GenAI::Image.new(:open_ai, ENV['OPEN_AI_TOKEN'])
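
A StabilityAI client is created the same way. A sketch, assuming the provider symbol is :stability_ai; check the gem documentation for the exact identifier:

# Provider symbol and token variable are assumptions, not confirmed identifiers.
model = GenAI::Image.new(:stability_ai, ENV['STABILITY_AI_TOKEN'])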

Generate image(s) using the provider/model that fits your needs:

result = model.generate('A painting of a dog')
# => #<GenAI::Result:0x0000000110be6f20...>

result.value
# => image binary

result.value(:base64)
# => image in base64

# Save image to file
File.open('dog.jpg', 'wb') do |f|
  f.write(result.value)
end
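
The base64 form is handy when the image never needs to touch disk, for example when inlining it into an HTML page. A small sketch; the image/jpeg content type is an assumption, so adjust it to the format you actually receive:

# Build a data URI from the base64 value for inline use in HTML.
data_uri = "data:image/jpeg;base64,#{result.value(:base64)}"
html = %(<img src="#{data_uri}" alt="A painting of a dog">)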


Get more variations of the same image

result = model.variations('./dog.jpg')
# => #<GenAI::Result:0x0000000116a1ec50...>

result.value
# => image binary

result.value(:base64)
# => image in base64

# Save image to file
File.open('dog_variation.jpg', 'wb') do |f|
  f.write(result.value)
end
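
Each call returns a single result, so collecting several variations is just a loop over the same source image. An illustrative sketch, not a gem feature:

# Request a few variations and save each one to its own file.
3.times do |i|
  variation = model.variations('./dog.jpg')
  File.binwrite("dog_variation_#{i}.jpg", variation.value)
end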


Edit existing images with an additional prompt

result = model.edit('./llama.jpg', 'A cute llama wearing a beret', mask: './mask.png')
# => #<GenAI::Result:0x0000000116a1ec50...>

result.value
# => image binary

result.value(:base64)
# => image in base64

# Save image to file
File.open('llama_edited.jpg', 'wb') do |f|
  f.write(result.value)
end


Development

After checking out the repo, run bin/setup to install dependencies. Then, run rake spec to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and the created tag, and push the .gem file to rubygems.org.

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/alchaplinsky/gen-ai. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the code of conduct.

Code of Conduct

Everyone interacting in the GenAI project's codebases, issue trackers, chat rooms, and mailing lists is expected to follow the code of conduct.