Class: LLM::Gemini
- Defined in:
- lib/llm/providers/gemini.rb,
lib/llm/providers/gemini/audio.rb,
lib/llm/providers/gemini/files.rb,
lib/llm/providers/gemini/images.rb,
lib/llm/providers/gemini/models.rb,
lib/llm/providers/gemini/error_handler.rb,
lib/llm/providers/gemini/stream_parser.rb,
lib/llm/providers/gemini/request_adapter.rb,
lib/llm/providers/gemini/response_adapter.rb
Overview
The Gemini class implements a provider for Gemini. The provider accepts multiple input types (text, images, audio, and video). Inputs under 20MB can be provided inline via the prompt, while files over 20MB must be uploaded through the Gemini Files API.
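The inline-vs-Files-API decision described above can be sketched as follows. Note that both the constant and the helper method here are hypothetical illustrations of the 20MB rule, not part of the library:

```ruby
# Hypothetical sketch of the 20MB rule: neither INLINE_LIMIT nor
# upload_strategy exists in the library itself.
INLINE_LIMIT = 20 * 1024 * 1024 # Gemini's 20MB inline input limit

def upload_strategy(size_in_bytes)
  # Files under the limit can be sent inline with the prompt;
  # larger files go through the Gemini Files API.
  size_in_bytes < INLINE_LIMIT ? :inline : :files_api
end
```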
Defined Under Namespace
Classes: Audio, Files, Images, Models
Constant Summary collapse
- HOST =
"generativelanguage.googleapis.com"
Instance Method Summary collapse
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#audio ⇒ LLM::Gemini::Audio
Provides an interface to Gemini's audio API.
-
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
-
#default_model ⇒ String
Returns the default model for chat completions.
-
#developer_role ⇒ Symbol
Returns the provider's developer role.
-
#embed(input, model: "gemini-embedding-001", **params) ⇒ LLM::Response
Provides an embedding.
-
#files ⇒ LLM::Gemini::Files
Provides an interface to Gemini's file management API.
-
#images ⇒ LLM::Gemini::Images
Provides an interface to Gemini's image generation API.
-
#initialize ⇒ Gemini
constructor
A new instance of Gemini.
-
#models ⇒ LLM::Gemini::Models
Provides an interface to Gemini's models API.
- #server_tools ⇒ Hash{String => LLM::ServerTool}
-
#system_role ⇒ Symbol
Returns the provider's system role.
-
#user_role ⇒ Symbol
Returns the provider's user role.
-
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the Google Search tool.
Methods inherited from Provider
#chat, clients, #inspect, #moderations, #persist!, #respond, #responses, #schema, #server_tool, #tool_role, #tracer, #tracer=, #vector_stores, #with
Constructor Details
#initialize ⇒ Gemini
Returns a new instance of Gemini.
# File 'lib/llm/providers/gemini.rb', line 36

def initialize(**)
  super(host: HOST, **)
end
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually "assistant" or "model".
# File 'lib/llm/providers/gemini.rb', line 165

def assistant_role
  "model"
end
#audio ⇒ LLM::Gemini::Audio
Provides an interface to Gemini's audio API
# File 'lib/llm/providers/gemini.rb', line 82

def audio
  LLM::Gemini::Audio.new(self)
end
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API
# File 'lib/llm/providers/gemini.rb', line 68

def complete(prompt, params = {})
  params, stream, tools, role, model = normalize_complete_params(params)
  req = build_complete_request(prompt, params, role, model, stream)
  res, span, tracer = execute(request: req, stream: stream, operation: "chat", model:)
  res = ResponseAdapter.adapt(res, type: :completion)
        .extend(Module.new { define_method(:__tools__) { tools } })
  tracer.on_request_finish(operation: "chat", model:, res:, span:)
  res
end
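For reference, the request built by #complete ultimately serializes to a body in the shape expected by the Gemini REST API's generateContent endpoint. A minimal sketch of that body for a single user turn (the library's request builder may add further fields such as tools or generationConfig):

```ruby
require "json"

# Minimal generateContent request body for a single user turn,
# following the field names in the Gemini REST API.
body = {
  contents: [
    {role: "user", parts: [{text: "Hello, Gemini"}]}
  ]
}
puts JSON.dump(body)
```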
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/gemini.rb', line 114

def default_model
  "gemini-2.5-flash"
end
#developer_role ⇒ Symbol
Returns the provider's developer role.
# File 'lib/llm/providers/gemini.rb', line 159

def developer_role
  :user
end
#embed(input, model: "gemini-embedding-001", **params) ⇒ LLM::Response
Provides an embedding
# File 'lib/llm/providers/gemini.rb', line 47

def embed(input, model: "gemini-embedding-001", **params)
  model = model.respond_to?(:id) ? model.id : model
  path = ["/v1beta/models/#{model}", "embedContent?key=#{@key}"].join(":")
  req = Net::HTTP::Post.new(path, headers)
  req.body = LLM.json.dump({content: {parts: [{text: input}]}})
  res, span, tracer = execute(request: req, operation: "embeddings", model:)
  res = ResponseAdapter.adapt(res, type: :embedding)
  tracer.on_request_finish(operation: "embeddings", model:, res:, span:)
  res
end
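The request path in #embed is assembled by plain string interpolation, joining the model route and the keyed embedContent action with a colon. With the default model it resolves as follows (the key shown is a placeholder):

```ruby
# How #embed assembles its request path: the model name and API key
# are interpolated into the v1beta embedContent route.
model = "gemini-embedding-001"
key = "YOUR_API_KEY" # placeholder
path = ["/v1beta/models/#{model}", "embedContent?key=#{key}"].join(":")
puts path
# => "/v1beta/models/gemini-embedding-001:embedContent?key=YOUR_API_KEY"
```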
#files ⇒ LLM::Gemini::Files
Provides an interface to Gemini's file management API
# File 'lib/llm/providers/gemini.rb', line 98

def files
  LLM::Gemini::Files.new(self)
end
#images ⇒ LLM::Gemini::Images
Provides an interface to Gemini's image generation API
# File 'lib/llm/providers/gemini.rb', line 90

def images
  LLM::Gemini::Images.new(self)
end
#models ⇒ LLM::Gemini::Models
Provides an interface to Gemini's models API
# File 'lib/llm/providers/gemini.rb', line 106

def models
  LLM::Gemini::Models.new(self)
end
#server_tools ⇒ Hash{String => LLM::ServerTool}
Returns the provider's server-side tools. Some of these tools take configuration options that are easier to set through the LLM::Provider#server_tool method.
# File 'lib/llm/providers/gemini.rb', line 125

def server_tools
  {
    google_search: server_tool(:google_search),
    code_execution: server_tool(:code_execution),
    url_context: server_tool(:url_context)
  }
end
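When these server tools are passed to #complete, they end up declared in the request body's "tools" array. A sketch of that wire shape, following the Gemini REST API (the library's own serialization may differ in detail):

```ruby
require "json"

# Sketch of how Gemini's server-side tools appear in a request
# body's "tools" array, per the Gemini REST API: each tool is an
# object keyed by its name, with an (often empty) config object.
tools = [
  {google_search: {}},
  {code_execution: {}},
  {url_context: {}}
]
puts JSON.dump(tools)
```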
#system_role ⇒ Symbol
Returns the provider's system role.
# File 'lib/llm/providers/gemini.rb', line 152

def system_role
  :user
end
#user_role ⇒ Symbol
Returns the provider's user role.
# File 'lib/llm/providers/gemini.rb', line 145

def user_role
  :user
end
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the Google Search tool.
# File 'lib/llm/providers/gemini.rb', line 138

def web_search(query:)
  ResponseAdapter.adapt(complete(query, tools: [server_tools[:google_search]]), type: :web_search)
end