Class: Vectorsearch::Base

Inherits:
  Object
Defined in:
lib/vectorsearch/base.rb

Direct Known Subclasses

Milvus, Pinecone, Qdrant, Weaviate

Constant Summary

DEFAULT_METRIC = "cosine".freeze
DEFAULT_COHERE_DIMENSION = 1024
DEFAULT_OPENAI_DIMENSION = 1536

# Currently supported LLMs. TODO: Add support for HuggingFace.
LLMS = %i[openai cohere].freeze

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(llm:, llm_api_key:) ⇒ Base

Returns a new instance of Base.

Parameters:

  • llm (Symbol)

    The LLM to use, one of LLMS (:openai or :cohere)

  • llm_api_key (String)

    The API key for the LLM



# File 'lib/vectorsearch/base.rb', line 20

def initialize(llm:, llm_api_key:)
  validate_llm!(llm: llm)

  @llm = llm
  @llm_api_key = llm_api_key
end
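The constructor delegates validation to a private `validate_llm!` guard before assigning the attributes. A minimal sketch of what that check plausibly looks like, built on the LLMS constant above (the gem's actual private implementation may differ):

```ruby
# Hypothetical reconstruction of the private validate_llm! guard;
# see lib/vectorsearch/base.rb for the real implementation.
LLMS = %i[openai cohere].freeze

def validate_llm!(llm:)
  unless LLMS.include?(llm)
    raise ArgumentError, "LLM must be one of: #{LLMS.join(', ')}"
  end
end

validate_llm!(llm: :openai)   # passes silently
# validate_llm!(llm: :foo)    # would raise ArgumentError
```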

Instance Attribute Details

#client ⇒ Object (readonly)

Returns the value of attribute client.



# File 'lib/vectorsearch/base.rb', line 8

def client
  @client
end

#index_name ⇒ Object (readonly)

Returns the value of attribute index_name.



# File 'lib/vectorsearch/base.rb', line 8

def index_name
  @index_name
end

#llm ⇒ Object (readonly)

Returns the value of attribute llm.



# File 'lib/vectorsearch/base.rb', line 8

def llm
  @llm
end

#llm_api_key ⇒ Object (readonly)

Returns the value of attribute llm_api_key.



# File 'lib/vectorsearch/base.rb', line 8

def llm_api_key
  @llm_api_key
end

Instance Method Details

#add_texts(texts:) ⇒ Object

TODO

Raises:

  • (NotImplementedError)


# File 'lib/vectorsearch/base.rb', line 32

def add_texts(texts:)
  raise NotImplementedError
end

#ask(question:) ⇒ Object

NotImplementedError will be raised if the subclass does not implement the `ask()` method.

Raises:

  • (NotImplementedError)


# File 'lib/vectorsearch/base.rb', line 37

def ask(question:)
  raise NotImplementedError
end

#create_default_schemaObject

Raises:

  • (NotImplementedError)


# File 'lib/vectorsearch/base.rb', line 27

def create_default_schema
  raise NotImplementedError
end
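The three methods above are abstract hooks: each concrete store (Milvus, Pinecone, Qdrant, Weaviate) overrides them with calls to its own client. A toy in-memory subclass (hypothetical, not part of the gem) illustrates the contract:

```ruby
# Toy subclass showing the shape expected of Base implementations.
# InMemory is illustrative only; real subclasses talk to a vector DB.
class Base
  def create_default_schema
    raise NotImplementedError
  end

  def add_texts(texts:)
    raise NotImplementedError
  end

  def ask(question:)
    raise NotImplementedError
  end
end

class InMemory < Base
  def create_default_schema
    @texts = []
  end

  def add_texts(texts:)
    @texts.concat(texts)
  end

  def ask(question:)
    # Naive keyword match standing in for a real similarity search.
    @texts.find { |t| question.split.any? { |w| t.include?(w) } }
  end
end
```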

#generate_completion(prompt:) ⇒ String

Generate a completion for a given prompt. Currently supports OpenAI and Cohere. The LLM-related method will most likely need to be abstracted out into a separate class.

Parameters:

  • prompt (String)

    The prompt to generate a completion for

Returns:

  • (String)

    The completion



# File 'lib/vectorsearch/base.rb', line 70

def generate_completion(prompt:)
  case llm
  when :openai
    response = openai_client.completions(
      parameters: {
        model: "text-davinci-003",
        temperature: 0.0,
        prompt: prompt
      }
    )
    response.dig("choices").first.dig("text")
  when :cohere
    response = cohere_client.generate(
      prompt: prompt,
      temperature: 0.0
    )
    response.dig("generations").first.dig("text")
  end
end
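The OpenAI branch unpacks a response hash shaped like the one below; the values here are illustrative stand-ins, not a live API call:

```ruby
# Illustrative shape of the hash returned by openai_client.completions;
# generate_completion extracts the text of the first choice.
response = {
  "choices" => [
    { "text" => "Paris is the capital of France." }
  ]
}

completion = response.dig("choices").first.dig("text")
# completion == "Paris is the capital of France."
```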

#generate_embedding(text:) ⇒ Array<Float>

Generate an embedding for a given text. Currently supports OpenAI and Cohere. The LLM-related method will most likely need to be abstracted out into a separate class.

Parameters:

  • text (String)

    The text to generate an embedding for

Returns:

  • (Array<Float>)

    The embedding



# File 'lib/vectorsearch/base.rb', line 46

def generate_embedding(text:)
  case llm
  when :openai
    response = openai_client.embeddings(
      parameters: {
        model: "text-embedding-ada-002",
        input: text
      }
    )
    response.dig("data").first.dig("embedding")
  when :cohere
    response = cohere_client.embed(
      texts: [text],
      model: "small"
    )
    response.dig("embeddings").first
  end
end
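Similarly, the Cohere branch digs into a response shaped roughly like this (illustrative values); note that the result is an array of floats rather than a string:

```ruby
# Illustrative shape of the hash returned by cohere_client.embed;
# generate_embedding returns the embedding for the single input text.
response = {
  "embeddings" => [
    [0.12, -0.34, 0.56]
  ]
}

embedding = response.dig("embeddings").first
# embedding == [0.12, -0.34, 0.56]
```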

#generate_prompt(question:, context:) ⇒ String



# File 'lib/vectorsearch/base.rb', line 90

def generate_prompt(question:, context:)
  "Context:\n" +
  "#{context}\n" +
  "---\n" +
  "Question: #{question}\n" +
  "---\n" +
  "Answer:"
end
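Since `generate_prompt` is pure string assembly, its output is easy to preview (the question and context below are made-up inputs):

```ruby
def generate_prompt(question:, context:)
  "Context:\n" +
  "#{context}\n" +
  "---\n" +
  "Question: #{question}\n" +
  "---\n" +
  "Answer:"
end

puts generate_prompt(
  question: "What is the capital of France?",
  context: "France is a country in Europe."
)
# Context:
# France is a country in Europe.
# ---
# Question: What is the capital of France?
# ---
# Answer:
```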