Class: Langchain::LLM::Base (Abstract)
- Inherits: Object
- Includes: DependencyHelper
- Defined in: lib/langchain/llm/base.rb
Overview
A LLM is a language model consisting of a neural network with many parameters (typically billions of weights or more), trained on large quantities of unlabeled text using self-supervised learning or semi-supervised learning.
Langchain.rb provides a common interface for interacting with all supported LLMs.
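For example, any adapter can be driven through the same handful of methods (a minimal sketch; the OpenAI adapter, environment variable, and prompt text are illustrative, and each call only works if the underlying provider supports that capability):

  llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

  # The same interface works across adapters that support each capability.
  llm.chat(messages: [{role: "user", content: "Hey! How are you?"}]).chat_completion
  llm.complete(prompt: "What is the meaning of life?").completion
  llm.embed(text: "foo bar").embedding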
Direct Known Subclasses
AI21, Anthropic, AwsBedrock, Cohere, GoogleGemini, GoogleVertexAI, HuggingFace, LlamaCpp, MistralAI, Ollama, OpenAI, Replicate
Instance Attribute Summary
- #client ⇒ Object: A client for communicating with the LLM.
- #defaults ⇒ Object (readonly): Default LLM options.
Instance Method Summary
- #chat ⇒ Object: Generate a chat completion for a given prompt.
- #chat_parameters(params = {}) ⇒ Object: Returns an instance of Langchain::LLM::Parameters::Chat.
- #complete ⇒ Object: Generate a completion for a given prompt.
- #default_dimension ⇒ Object: Ensures backward compatibility after github.com/patterns-ai-core/langchainrb/pull/586. TODO: Delete this method later.
- #default_dimensions ⇒ Integer: Returns the number of vector dimensions used by DEFAULTS.
- #embed ⇒ Object: Generate an embedding for a given text.
- #summarize ⇒ Object: Generate a summary for a given text.
Methods included from DependencyHelper
Instance Attribute Details
#client ⇒ Object
A client for communicating with the LLM.

# File 'lib/langchain/llm/base.rb', line 24
def client
  @client
end
#defaults ⇒ Object (readonly)
Default LLM options. Can be overridden by passing `default_options: {}` to the Langchain::LLM::* constructors.

# File 'lib/langchain/llm/base.rb', line 27
def defaults
  @defaults
end
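For example, construction with overridden defaults might look like the following (a sketch; the exact option keys accepted, such as temperature and chat_model, vary by adapter and gem version, and the concrete adapters are assumed to merge default_options into their DEFAULTS):

  llm = Langchain::LLM::OpenAI.new(
    api_key: ENV["OPENAI_API_KEY"],
    default_options: {temperature: 0.0, chat_model: "gpt-4o-mini"}
  )
  llm.defaults[:temperature] # => 0.0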
Instance Method Details
#chat ⇒ Object
Generate a chat completion for a given prompt. Parameters will depend on the LLM.

# File 'lib/langchain/llm/base.rb', line 46
def chat(...)
  raise NotImplementedError, "#{self.class.name} does not support chat"
end
#chat_parameters(params = {}) ⇒ Object
Returns an instance of Langchain::LLM::Parameters::Chat.

# File 'lib/langchain/llm/base.rb', line 79
def chat_parameters(params = {})
  @chat_parameters ||= Langchain::LLM::Parameters::Chat.new(
    parameters: params
  )
end
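Concrete adapters typically seed this memoized object in their constructors and serialize it per request. The update and to_params helpers below are assumptions drawn from how the bundled adapters use the unified parameters object; check Langchain::LLM::Parameters::Chat for the exact interface in your version:

  class MyProviderLLM < Langchain::LLM::Base
    def initialize(default_options: {})
      @defaults = {temperature: 0.0}.merge(default_options)
      # Declare which keys this provider understands and their defaults
      # (update/to_params are assumed helpers; see the note above).
      chat_parameters.update(temperature: {default: @defaults[:temperature]})
    end

    def chat(params = {})
      payload = chat_parameters.to_params(params)
      # ... send payload to the provider's chat endpoint ...
    end
  end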
#complete ⇒ Object
Generate a completion for a given prompt. Parameters will depend on the LLM.
54 55 56 |
# File 'lib/langchain/llm/base.rb', line 54 def complete(...) raise NotImplementedError, "#{self.class.name} does not support completion" end |
#default_dimension ⇒ Object
Ensures backward compatibility after github.com/patterns-ai-core/langchainrb/pull/586. TODO: Delete this method later.

# File 'lib/langchain/llm/base.rb', line 31
def default_dimension
  default_dimensions
end
#default_dimensions ⇒ Integer
Returns the number of vector dimensions used by DEFAULTS.

# File 'lib/langchain/llm/base.rb', line 38
def default_dimensions
  self.class.const_get(:DEFAULTS).dig(:dimensions)
end
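Because the lookup reads the DEFAULTS constant of the concrete class, a subclass only needs to declare a :dimensions key there (a sketch; the class and model name below are hypothetical):

  class MyEmbeddingLLM < Langchain::LLM::Base
    DEFAULTS = {
      embedding_model: "my-embedding-model", # hypothetical model name
      dimensions: 1_536
    }.freeze
  end

  MyEmbeddingLLM.new.default_dimensions # => 1536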
#embed ⇒ Object
Generate an embedding for a given text. Parameters depend on the LLM.

# File 'lib/langchain/llm/base.rb', line 63
def embed(...)
  raise NotImplementedError, "#{self.class.name} does not support generating embeddings"
end
#summarize ⇒ Object
Generate a summary for a given text. Parameters depend on the LLM.

# File 'lib/langchain/llm/base.rb', line 72
def summarize(...)
  raise NotImplementedError, "#{self.class.name} does not support summarization"
end
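Taken together, a custom adapter overrides the capabilities its provider supports and leaves the rest to raise NotImplementedError from this base class (a hedged sketch; MyHttpClient and the endpoint paths are hypothetical stand-ins for a real provider SDK):

  class MyCustomLLM < Langchain::LLM::Base
    DEFAULTS = {temperature: 0.0, dimensions: 768}.freeze

    def initialize(api_key:, default_options: {})
      @defaults = DEFAULTS.merge(default_options)
      @client = MyHttpClient.new(api_key: api_key) # hypothetical client
    end

    def chat(messages:, **params)
      @client.post("/chat", messages: messages, **@defaults.merge(params))
    end

    def embed(text:)
      @client.post("/embeddings", input: text)
    end

    # #complete and #summarize are not overridden, so calling them raises
    # NotImplementedError from Langchain::LLM::Base.
  end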