Class: Langchain::LLM::Base (Abstract)
- Inherits: Object
- Includes: DependencyHelper
- Defined in: lib/langchain/llm/base.rb
Overview
An LLM is a language model consisting of a neural network with many parameters (typically billions of weights or more), trained on large quantities of unlabeled text using self-supervised or semi-supervised learning.
Langchain.rb provides a common interface for interacting with all supported LLMs (see Direct Known Subclasses below).
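For example, every subclass exposes the same high-level methods (#chat, #complete, #embed, #summarize). A minimal sketch, assuming the OpenAI subclass (with its gem dependency installed) and an OPENAI_API_KEY environment variable, neither of which is part of this page:

# Minimal sketch of the common interface; the OpenAI subclass and the
# ENV variable are assumptions, not part of Langchain::LLM::Base itself.
llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

# The #chat call looks the same regardless of which subclass backs it.
response = llm.chat(messages: [{role: "user", content: "Hello!"}])
puts response.chat_completion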
Direct Known Subclasses
AI21, Anthropic, AwsBedrock, Cohere, GoogleGemini, GoogleVertexAI, HuggingFace, LlamaCpp, MistralAI, Ollama, OpenAI, Replicate
Instance Attribute Summary

- #client ⇒ Object
  A client for communicating with the LLM.
- #defaults ⇒ Object (readonly)
  Default LLM options.
Instance Method Summary

- #chat ⇒ Object
  Generate a chat completion for a given prompt.
- #chat_parameters(params = {}) ⇒ Object
  Returns an instance of Langchain::LLM::Parameters::Chat.
- #complete ⇒ Object
  Generate a completion for a given prompt.
- #default_dimension ⇒ Object
  Ensures backward compatibility after github.com/patterns-ai-core/langchainrb/pull/586. TODO: Delete this method later.
- #default_dimensions ⇒ Integer
  Returns the number of vector dimensions used by DEFAULTS.
- #embed ⇒ Object
  Generate an embedding for a given text.
- #summarize ⇒ Object
  Generate a summary for a given text.
Methods included from DependencyHelper
Instance Attribute Details
#client ⇒ Object
A client for communicating with the LLM.

# File 'lib/langchain/llm/base.rb', line 26

def client
  @client
end
#defaults ⇒ Object (readonly)
Default LLM options. Can be overridden by passing `default_options: {}` to the Langchain::LLM::* constructors.

# File 'lib/langchain/llm/base.rb', line 29

def defaults
  @defaults
end
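A sketch of overriding defaults via the constructor, assuming the OpenAI subclass; the option key used here (temperature) is only an illustration, since the accepted keys depend on the subclass:

# Hypothetical override of default LLM options; option keys vary per subclass.
llm = Langchain::LLM::OpenAI.new(
  api_key: ENV["OPENAI_API_KEY"],
  default_options: {temperature: 0.0}
)
llm.defaults # => typically the subclass's DEFAULTS merged with the overrides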
Instance Method Details
#chat ⇒ Object
Generate a chat completion for a given prompt. Parameters will depend on the LLM.

# File 'lib/langchain/llm/base.rb', line 48

def chat(...)
  raise NotImplementedError, "#{self.class.name} does not support chat"
end
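The base implementation simply raises; a subclass that does not override #chat advertises the missing capability through this error. A hypothetical illustration (MinimalLLM is not a real subclass, and Base is assumed to need no constructor arguments):

# Hypothetical subclass that does not implement #chat.
class MinimalLLM < Langchain::LLM::Base; end

MinimalLLM.new.chat(messages: [])
# => raises NotImplementedError, "MinimalLLM does not support chat"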
#chat_parameters(params = {}) ⇒ Object
Returns an instance of Langchain::LLM::Parameters::Chat.

# File 'lib/langchain/llm/base.rb', line 81

def chat_parameters(params = {})
  @chat_parameters ||= Langchain::LLM::Parameters::Chat.new(
    parameters: params
  )
end
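Note the ||=: the Parameters::Chat instance is built once and memoized, so the params hash only takes effect on the first call. A sketch under that assumption:

# llm is assumed to be an instance of any Langchain::LLM subclass.
# The shape of the params hash is subclass-specific; empty hashes keep this neutral.
first  = llm.chat_parameters     # builds and memoizes a Parameters::Chat
second = llm.chat_parameters({}) # returns the same memoized object
first.equal?(second) # => true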
#complete ⇒ Object
Generate a completion for a given prompt. Parameters will depend on the LLM.
# File 'lib/langchain/llm/base.rb', line 56

def complete(...)
  raise NotImplementedError, "#{self.class.name} does not support completion"
end
#default_dimension ⇒ Object
Ensures backward compatibility after github.com/patterns-ai-core/langchainrb/pull/586. TODO: Delete this method later.

# File 'lib/langchain/llm/base.rb', line 33

def default_dimension
  default_dimensions
end
#default_dimensions ⇒ Integer
Returns the number of vector dimensions used by DEFAULTS.

# File 'lib/langchain/llm/base.rb', line 40

def default_dimensions
  self.class.const_get(:DEFAULTS).dig(:dimensions)
end
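The value is read from the subclass's DEFAULTS constant, so a subclass only needs to declare :dimensions there. A hypothetical sketch (ToyLLM is not a real subclass, and Base is assumed to need no constructor arguments):

# Hypothetical subclass declaring its embedding size via DEFAULTS.
class ToyLLM < Langchain::LLM::Base
  DEFAULTS = {dimensions: 1536}.freeze
end

ToyLLM.new.default_dimensions # => 1536
ToyLLM.new.default_dimension  # => 1536 (deprecated alias)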
#embed ⇒ Object
Generate an embedding for a given text. Parameters will depend on the LLM.

# File 'lib/langchain/llm/base.rb', line 65

def embed(...)
  raise NotImplementedError, "#{self.class.name} does not support generating embeddings"
end
#summarize ⇒ Object
Generate a summary for a given text. Parameters will depend on the LLM.

# File 'lib/langchain/llm/base.rb', line 74

def summarize(...)
  raise NotImplementedError, "#{self.class.name} does not support summarization"
end
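Subclasses opt into each capability by overriding the corresponding method. A hypothetical sketch of a subclass wiring #summarize on top of its own #complete; a real subclass would call its API client instead of returning a stub:

# Hypothetical subclass providing summarization via its own completion method.
class ToySummarizerLLM < Langchain::LLM::Base
  def complete(prompt:, **)
    "stubbed completion for: #{prompt}" # a real subclass would call its API here
  end

  def summarize(text:)
    complete(prompt: "Summarize the following text:\n\n#{text}")
  end
end

ToySummarizerLLM.new.summarize(text: "Ruby is a dynamic language...")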