Class: Vectorsearch::Base
- Inherits: Object
- Defined in: lib/vectorsearch/base.rb
Constant Summary
- DEFAULT_METRIC =
"cosine".freeze
- DEFAULT_COHERE_DIMENSION =
1024
- DEFAULT_OPENAI_DIMENSION =
1536
- LLMS =
Currently supported LLMs. TODO: Add support for HuggingFace.
%i[openai cohere].freeze
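The LLMS whitelist is checked by the constructor via validate_llm!. That method's body is not shown on this page, so the sketch below is an assumed implementation of the guard, not the gem's actual code:

```ruby
# Assumed sketch of how the LLMS whitelist gates the constructor.
LLMS = %i[openai cohere].freeze

def validate_llm!(llm:)
  raise ArgumentError, "LLM must be one of #{LLMS}" unless LLMS.include?(llm)
end

validate_llm!(llm: :openai)          # supported: passes silently
begin
  validate_llm!(llm: :huggingface)   # not in LLMS yet
rescue ArgumentError => e
  puts e.message                     # => LLM must be one of [:openai, :cohere]
end
```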
Instance Attribute Summary
-
#client ⇒ Object
readonly
Returns the value of attribute client.
-
#index_name ⇒ Object
readonly
Returns the value of attribute index_name.
-
#llm ⇒ Object
readonly
Returns the value of attribute llm.
-
#llm_api_key ⇒ Object
readonly
Returns the value of attribute llm_api_key.
Instance Method Summary
-
#add_texts(texts:) ⇒ Object
TODO.
-
#ask(question:) ⇒ Object
NotImplementedError will be raised if the subclass does not implement the `ask()` method.
- #create_default_schema ⇒ Object
-
#generate_completion(prompt:) ⇒ String
Generate a completion for a given prompt. Currently supports OpenAI and Cohere. The LLM-related method will most likely need to be abstracted out into a separate class.
-
#generate_embedding(text:) ⇒ String
Generate an embedding for a given text. Currently supports OpenAI and Cohere. The LLM-related method will most likely need to be abstracted out into a separate class.
- #generate_prompt(question:, context:) ⇒ Object
-
#initialize(llm:, llm_api_key:) ⇒ Base
constructor
A new instance of Base.
Constructor Details
#initialize(llm:, llm_api_key:) ⇒ Base
Returns a new instance of Base.
  # File 'lib/vectorsearch/base.rb', line 20

  def initialize(llm:, llm_api_key:)
    validate_llm!(llm: llm)

    @llm = llm
    @llm_api_key = llm_api_key
  end
Instance Attribute Details
#client ⇒ Object (readonly)
Returns the value of attribute client.
  # File 'lib/vectorsearch/base.rb', line 8

  def client
    @client
  end
#index_name ⇒ Object (readonly)
Returns the value of attribute index_name.
  # File 'lib/vectorsearch/base.rb', line 8

  def index_name
    @index_name
  end
#llm ⇒ Object (readonly)
Returns the value of attribute llm.
  # File 'lib/vectorsearch/base.rb', line 8

  def llm
    @llm
  end
#llm_api_key ⇒ Object (readonly)
Returns the value of attribute llm_api_key.
  # File 'lib/vectorsearch/base.rb', line 8

  def llm_api_key
    @llm_api_key
  end
Instance Method Details
#add_texts(texts:) ⇒ Object
TODO
  # File 'lib/vectorsearch/base.rb', line 32

  def add_texts(texts:)
    raise NotImplementedError
  end
#ask(question:) ⇒ Object
NotImplementedError will be raised if the subclass does not implement the `ask()` method.
  # File 'lib/vectorsearch/base.rb', line 37

  def ask(question:)
    raise NotImplementedError
  end
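add_texts, ask, and create_default_schema all follow the same abstract-method pattern: the base class raises NotImplementedError and each vector-store subclass overrides the method. A minimal illustration of that contract (EchoStore is a hypothetical subclass invented here, not a class shipped by the gem):

```ruby
# Base raises; a concrete subclass supplies the real behavior.
class AbstractStore
  def ask(question:)
    raise NotImplementedError
  end
end

# Hypothetical subclass for illustration only.
class EchoStore < AbstractStore
  def ask(question:)
    "You asked: #{question}"
  end
end

begin
  AbstractStore.new.ask(question: "hi")
rescue NotImplementedError
  puts "AbstractStore does not implement ask"
end
puts EchoStore.new.ask(question: "hi")  # => You asked: hi
```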
#create_default_schema ⇒ Object
  # File 'lib/vectorsearch/base.rb', line 27

  def create_default_schema
    raise NotImplementedError
  end
#generate_completion(prompt:) ⇒ String
Generate a completion for a given prompt. Currently supports OpenAI and Cohere. The LLM-related method will most likely need to be abstracted out into a separate class.
  # File 'lib/vectorsearch/base.rb', line 70

  def generate_completion(prompt:)
    case llm
    when :openai
      response = openai_client.completions(
        parameters: {
          model: "text-davinci-003",
          temperature: 0.0,
          prompt: prompt
        }
      )
      response.dig("choices").first.dig("text")
    when :cohere
      response = cohere_client.generate(
        prompt: prompt,
        temperature: 0.0
      )
      response.dig("generations").first.dig("text")
    end
  end
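The two branches differ only in the shape of the response hash they dig into. The stubs below mimic those shapes (shape only — real responses carry many more fields) so the same dig chains from the source can be exercised without a network call or API key:

```ruby
# Stub hashes shaped like the OpenAI and Cohere completion responses.
openai_response = { "choices" => [{ "text" => "Paris." }] }
cohere_response = { "generations" => [{ "text" => "Paris." }] }

# The exact extraction expressions used in #generate_completion:
openai_text = openai_response.dig("choices").first.dig("text")
cohere_text = cohere_response.dig("generations").first.dig("text")

puts openai_text  # => Paris.
puts cohere_text  # => Paris.
```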
#generate_embedding(text:) ⇒ String
Generate an embedding for a given text. Currently supports OpenAI and Cohere. The LLM-related method will most likely need to be abstracted out into a separate class.
  # File 'lib/vectorsearch/base.rb', line 46

  def generate_embedding(text:)
    case llm
    when :openai
      response = openai_client.embeddings(
        parameters: {
          model: "text-embedding-ada-002",
          input: text
        }
      )
      response.dig("data").first.dig("embedding")
    when :cohere
      response = cohere_client.embed(
        texts: [text],
        model: "small"
      )
      response.dig("embeddings").first
    end
  end
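The same stub technique shows the embedding response shapes. Real vectors have 1536 (OpenAI) or 1024 (Cohere) elements, per the DEFAULT_*_DIMENSION constants above; three elements are used here for brevity:

```ruby
# Stub hashes shaped like the OpenAI and Cohere embedding responses.
openai_response = { "data" => [{ "embedding" => [0.1, 0.2, 0.3] }] }
cohere_response = { "embeddings" => [[0.4, 0.5, 0.6]] }

# OpenAI nests each vector in a hash; Cohere returns a list of vectors.
openai_vector = openai_response.dig("data").first.dig("embedding")
cohere_vector = cohere_response.dig("embeddings").first

puts openai_vector.length  # => 3
puts cohere_vector.length  # => 3
```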
#generate_prompt(question:, context:) ⇒ Object
  # File 'lib/vectorsearch/base.rb', line 90

  def generate_prompt(question:, context:)
    "Context:\n" +
      "#{context}\n" +
      "---\n" +
      "Question: #{question}\n" +
      "---\n" +
      "Answer:"
  end
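Unlike the other methods, generate_prompt is pure string assembly with no client dependency, so it can be exercised standalone. The method body below is reproduced from the source listing above; the question and context strings are made-up sample inputs:

```ruby
# Builds the retrieval-augmented prompt fed to the completion LLM.
def generate_prompt(question:, context:)
  "Context:\n" +
    "#{context}\n" +
    "---\n" +
    "Question: #{question}\n" +
    "---\n" +
    "Answer:"
end

prompt = generate_prompt(
  question: "What is the capital of France?",
  context: "France is a country in Europe. Its capital is Paris."
)
puts prompt
```

The `---` separators delimit the retrieved context from the question, and the trailing `Answer:` cue positions the LLM to complete with just the answer text.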