Class: Langchain::LLM::Cohere
Overview
Constant Summary
DEFAULTS = {
  temperature: 0.0,
  completion_model: "command",
  chat_model: "command-r-plus",
  embedding_model: "small",
  dimensions: 1024,
  truncate: "START"
}.freeze
Instance Attribute Summary
Attributes inherited from Base
Instance Method Summary
-
#chat(params = {}) ⇒ Langchain::LLM::CohereResponse
Generate a chat completion for given messages.
-
#complete(prompt:, **params) ⇒ Langchain::LLM::CohereResponse
Generate a completion for a given prompt.
-
#embed(text:) ⇒ Langchain::LLM::CohereResponse
Generate an embedding for a given text.
-
#initialize(api_key:, default_options: {}) ⇒ Cohere
constructor
A new instance of Cohere.
-
#summarize(text:) ⇒ String
Generate a summary in English for a given text.
Methods inherited from Base
#chat_parameters, #default_dimension, #default_dimensions
Methods included from DependencyHelper
Constructor Details
#initialize(api_key:, default_options: {}) ⇒ Cohere
Returns a new instance of Cohere.
# File 'lib/langchain/llm/cohere.rb', line 23

def initialize(api_key:, default_options: {})
  depends_on "cohere-ruby", req: "cohere"

  @client = ::Cohere::Client.new(api_key: api_key)
  @defaults = DEFAULTS.merge(default_options)
  chat_parameters.update(
    model: {default: @defaults[:chat_model]},
    temperature: {default: @defaults[:temperature]},
    response_format: {default: @defaults[:response_format]}
  )
  chat_parameters.remap(
    system: :preamble,
    messages: :chat_history,
    stop: :stop_sequences,
    top_k: :k,
    top_p: :p
  )
end
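The `remap` call translates OpenAI-style parameter names into Cohere's equivalents before requests are built. A minimal, self-contained sketch of that translation using a plain hash (the real `chat_parameters` object is provided by the base class; the `REMAP` constant and `remap_params` helper below are illustrative only, not part of the library):

```ruby
# Illustrative only: OpenAI-style names on the left map to
# Cohere's parameter names on the right.
REMAP = {
  system: :preamble,
  messages: :chat_history,
  stop: :stop_sequences,
  top_k: :k,
  top_p: :p
}.freeze

# Rename any keys present in REMAP, leaving other keys untouched.
def remap_params(params)
  params.to_h { |key, value| [REMAP.fetch(key, key), value] }
end

remapped = remap_params(system: "You are terse.", top_p: 0.9, temperature: 0.0)
# remapped => {preamble: "You are terse.", p: 0.9, temperature: 0.0}
```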
Instance Method Details
#chat(params = {}) ⇒ Langchain::LLM::CohereResponse
Generate a chat completion for given messages
# File 'lib/langchain/llm/cohere.rb', line 96

def chat(params = {})
  raise ArgumentError.new("messages argument is required") if Array(params[:messages]).empty?

  parameters = chat_parameters.to_params(params)

  # Cohere API requires `message:` parameter to be sent separately from `chat_history:`.
  # We extract the last message from the messages param.
  parameters[:message] = parameters[:chat_history].pop&.dig(:message)

  response = client.chat(**parameters)

  Langchain::LLM::CohereResponse.new(response)
end
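The `pop`/`dig` step above splits the conversation: the newest message is sent as Cohere's `message:` parameter while earlier turns remain in `chat_history:`. A self-contained sketch of that extraction with plain hashes (no API client involved):

```ruby
# Messages in the remapped Cohere shape: a list of {role:, message:} hashes.
chat_history = [
  {role: "USER", message: "What is an embedding?"},
  {role: "CHATBOT", message: "A vector representation of text."},
  {role: "USER", message: "Give me an example."}
]

# Destructively remove the final turn; its text becomes the `message:` param.
message = chat_history.pop&.dig(:message)

# message      => "Give me an example."
# chat_history => only the first two turns remain
```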
#complete(prompt:, **params) ⇒ Langchain::LLM::CohereResponse
Generate a completion for a given prompt
# File 'lib/langchain/llm/cohere.rb', line 64

def complete(prompt:, **params)
  default_params = {
    prompt: prompt,
    temperature: @defaults[:temperature],
    model: @defaults[:completion_model],
    truncate: @defaults[:truncate]
  }

  if params[:stop_sequences]
    default_params[:stop_sequences] = params.delete(:stop_sequences)
  end

  default_params.merge!(params)

  response = client.generate(**default_params)
  Langchain::LLM::CohereResponse.new response, model: @defaults[:completion_model]
end
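Because `default_params.merge!(params)` runs last, any caller-supplied keyword wins over the instance defaults. A dependency-free sketch of that precedence (the `build_params` helper is hypothetical, written only to show the merge order):

```ruby
# Hypothetical defaults mirroring DEFAULTS above.
defaults = {temperature: 0.0, model: "command", truncate: "START"}

# Build request params the way #complete does: start from the defaults,
# then let caller-supplied params win on conflict via merge!.
def build_params(defaults, prompt, **params)
  request = defaults.merge(prompt: prompt)
  request.merge!(params)
  request
end

built = build_params(defaults, "Say hi", temperature: 0.7)
# built[:temperature] => 0.7 (caller overrides the default)
# built[:model]       => "command" (default survives)
```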
#embed(text:) ⇒ Langchain::LLM::CohereResponse
Generate an embedding for a given text
# File 'lib/langchain/llm/cohere.rb', line 48

def embed(text:)
  response = client.embed(
    texts: [text],
    model: @defaults[:embedding_model]
  )

  Langchain::LLM::CohereResponse.new response, model: @defaults[:embedding_model]
end
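The response wraps an embedding vector (1024 dimensions per the `dimensions:` entry in DEFAULTS). Once extracted from the response object, such vectors are typically compared with cosine similarity. A dependency-free sketch, using short toy vectors in place of real 1024-dimension embeddings:

```ruby
# Cosine similarity between two equal-length numeric vectors.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

# Toy 3-dimensional stand-ins for real embeddings.
cosine_similarity([1.0, 0.0, 0.0], [1.0, 0.0, 0.0]) # => 1.0 (identical direction)
cosine_similarity([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]) # => 0.0 (orthogonal)
```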
#summarize(text:) ⇒ String
Generate a summary in English for a given text
More parameters are available to extend this method with; see github.com/andreibondarev/cohere-ruby/blob/0.9.4/lib/cohere/client.rb#L107-L115
# File 'lib/langchain/llm/cohere.rb', line 116

def summarize(text:)
  response = client.summarize(text: text)
  response.dig("summary")
end