Class: Langchain::Assistant::LLM::Adapter
- Inherits: Object
- Defined in: lib/langchain/assistant/llm/adapter.rb
Overview
TODO: Fix the message truncation when context window is exceeded
Class Method Summary

.build(llm) ⇒ Object
Class Method Details
.build(llm) ⇒ Object

Returns the adapter instance matching the class of the given LLM; raises ArgumentError if the LLM type is not supported.
# File 'lib/langchain/assistant/llm/adapter.rb', line 8

def self.build(llm)
  if llm.is_a?(Langchain::LLM::Anthropic)
    LLM::Adapters::Anthropic.new
  elsif llm.is_a?(Langchain::LLM::AwsBedrock) && llm.defaults[:chat_model].include?("anthropic")
    LLM::Adapters::AwsBedrockAnthropic.new
  elsif llm.is_a?(Langchain::LLM::GoogleGemini) || llm.is_a?(Langchain::LLM::GoogleVertexAI)
    LLM::Adapters::GoogleGemini.new
  elsif llm.is_a?(Langchain::LLM::MistralAI)
    LLM::Adapters::MistralAI.new
  elsif llm.is_a?(Langchain::LLM::Ollama)
    LLM::Adapters::Ollama.new
  elsif llm.is_a?(Langchain::LLM::OpenAI)
    LLM::Adapters::OpenAI.new
  else
    raise ArgumentError, "Unsupported LLM type: #{llm.class}"
  end
end
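A minimal usage sketch, assuming the langchain gem is loaded and an OpenAI API key is available via ENV["OPENAI_API_KEY"]; OpenAI is used purely as an illustration, and any LLM class covered by the dispatch above would work the same way:

require "langchain"

# Hypothetical setup: any supported LLM instance works; OpenAI is only an example.
llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

# .build dispatches on the LLM's class and returns the matching adapter.
adapter = Langchain::Assistant::LLM::Adapter.build(llm)
adapter.class # => Langchain::Assistant::LLM::Adapters::OpenAI

Passing an LLM class not covered by any branch raises ArgumentError ("Unsupported LLM type: ..."), per the else clause above.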