Class: Agentic::LlmClient
- Inherits: Object
  - Object
  - Agentic::LlmClient
- Defined in:
- lib/agentic/llm_client.rb
Overview
Generic wrapper for LLM API clients
Instance Attribute Summary collapse
-
#client ⇒ OpenAI::Client
readonly
The underlying LLM client instance.
-
#last_response ⇒ Hash
readonly
The raw response from the most recent LLM call.
Instance Method Summary collapse
-
#complete(messages, output_schema: nil) ⇒ Hash
Sends a completion request to the LLM.
-
#initialize(config) ⇒ LlmClient
constructor
Initializes a new LlmClient.
-
#models ⇒ Array<Hash>
Fetches available models from the LLM provider.
-
#query_generation_stats(generation_id) ⇒ Hash
Queries generation stats for a given generation ID.
Constructor Details
#initialize(config) ⇒ LlmClient
Initializes a new LlmClient
# File 'lib/agentic/llm_client.rb', line 13

def initialize(config)
  @client = OpenAI::Client.new(access_token: Agentic.configuration.access_token)
  @config = config
  @last_response = nil
end
Instance Attribute Details
#client ⇒ OpenAI::Client (readonly)
Returns The underlying LLM client instance.
# File 'lib/agentic/llm_client.rb', line 9

def client
  @client
end
#last_response ⇒ Hash (readonly)
Returns the raw response from the most recent LLM call.
# File 'lib/agentic/llm_client.rb', line 9

def last_response
  @last_response
end
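Because #last_response retains the raw API hash, follow-up details such as token usage can be read from it with dig. A minimal sketch against a stubbed hash (the field names follow the OpenAI usage object; the values are illustrative, not a real response):

```ruby
# Stub shaped like the relevant slice of an OpenAI chat response (illustrative values).
last_response = {
  "id" => "gen-123",
  "usage" => {"prompt_tokens" => 12, "completion_tokens" => 34, "total_tokens" => 46}
}

# After a #complete call, the same dig pattern works on client.last_response.
total_tokens = last_response.dig("usage", "total_tokens")
puts total_tokens  # 46
```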
Instance Method Details
#complete(messages, output_schema: nil) ⇒ Hash
Sends a completion request to the LLM
# File 'lib/agentic/llm_client.rb', line 22

def complete(messages, output_schema: nil)
  parameters = {model: @config.model, messages: messages}

  if output_schema
    parameters[:response_format] = {
      type: "json_schema",
      json_schema: output_schema.to_hash
    }
  end

  @last_response = client.chat(parameters: parameters)

  if output_schema
    content = JSON.parse(@last_response.dig("choices", 0, "message", "content"))

    if (refusal = @last_response.dig("choices", 0, "message", "refusal"))
      {refusal: refusal, content: nil}
    else
      {content: content}
    end
  else
    @last_response.dig("choices", 0, "message", "content")
  end
end
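The structured-output branch above can be exercised without a live API call by stubbing a response hash in the shape the OpenAI chat endpoint returns. The hash below is illustrative, not a real response:

```ruby
require "json"

# Stubbed chat-completion response with a JSON string in the message content.
response = {
  "choices" => [
    {"message" => {"content" => '{"answer": 42}', "refusal" => nil}}
  ]
}

# Same extraction logic as #complete with an output_schema:
content = JSON.parse(response.dig("choices", 0, "message", "content"))
refusal = response.dig("choices", 0, "message", "refusal")
result = refusal ? {refusal: refusal, content: nil} : {content: content}

puts result.inspect  # {:content=>{"answer"=>42}}
```

When the model refuses, the "refusal" field is populated instead, and the caller receives {refusal: "...", content: nil}.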
#models ⇒ Array<Hash>
Fetches available models from the LLM provider
# File 'lib/agentic/llm_client.rb', line 48

def models
  client.models.list&.dig("data")
end
#query_generation_stats(generation_id) ⇒ Hash
Queries generation stats for a given generation ID
# File 'lib/agentic/llm_client.rb', line 55

def query_generation_stats(generation_id)
  client.query_generation_stats(generation_id)
end