Class: Langchain::Assistant::LLM::Adapters::Ollama
Defined in: lib/langchain/assistant/llm/adapters/ollama.rb
Instance Method Summary
- #allowed_tool_choices ⇒ Object
  Get the allowed assistant.tool_choice values for Ollama.
- #available_tool_names(tools) ⇒ Object
  Get the names of the tools available to the Ollama LLM.
- #build_chat_params(messages:, instructions:, tools:, tool_choice:, parallel_tool_calls:) ⇒ Hash
  Build the chat parameters for the Ollama LLM.
- #build_message(role:, content: nil, image_url: nil, tool_calls: [], tool_call_id: nil) ⇒ Messages::OllamaMessage
  Build an Ollama message.
- #extract_tool_call_args(tool_call:) ⇒ Array
  Extract the tool call information from the Ollama tool call hash.
- #support_system_message? ⇒ Boolean
- #tool_role ⇒ Object
Instance Method Details
#allowed_tool_choices ⇒ Object
Get the allowed assistant.tool_choice values for Ollama
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 73

def allowed_tool_choices
  ["auto", "none"]
end
#available_tool_names(tools) ⇒ Object
Get the names of the tools available to the Ollama LLM
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 68

def available_tool_names(tools)
  build_tools(tools).map { |tool| tool.dig(:function, :name) }
end
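As a rough illustration of the mapping above (a standalone sketch, not the adapter itself: it assumes `build_tools` has already produced OpenAI-style function schemas, which is how the dig path `(:function, :name)` reads):

```ruby
# Hypothetical output of build_tools: an array of function-call schemas.
built_tools = [
  {type: "function", function: {name: "calculator__evaluate"}},
  {type: "function", function: {name: "weather__fetch"}}
]

# available_tool_names pulls each function's name out of its schema.
names = built_tools.map { |tool| tool.dig(:function, :name) }
p names  # => ["calculator__evaluate", "weather__fetch"]
```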
#build_chat_params(messages:, instructions:, tools:, tool_choice:, parallel_tool_calls:) ⇒ Hash
Build the chat parameters for the Ollama LLM
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 16

def build_chat_params(
  messages:,
  instructions:,
  tools:,
  tool_choice:,
  parallel_tool_calls:
)
  Langchain.logger.warn "WARNING: `parallel_tool_calls:` is not supported by Ollama currently"
  Langchain.logger.warn "WARNING: `tool_choice:` is not supported by Ollama currently"

  params = {messages: messages}
  if tools.any?
    params[:tools] = build_tools(tools)
  end
  params
end
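The shape of the returned hash can be sketched without the gem (a simplified stand-in with the Langchain logging and `build_tools` call stubbed out; `build_chat_params_sketch` is a hypothetical name for illustration):

```ruby
# Sketch: messages are always included; :tools only when any are given.
# tool_choice and parallel_tool_calls are accepted but ignored, matching
# the adapter's warnings that Ollama does not support them.
def build_chat_params_sketch(messages:, tools: [])
  params = {messages: messages}
  params[:tools] = tools unless tools.empty?
  params
end

msgs = [{role: "user", content: "What is 1 + 2?"}]
p build_chat_params_sketch(messages: msgs).keys  # => [:messages]
```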
#build_message(role:, content: nil, image_url: nil, tool_calls: [], tool_call_id: nil) ⇒ Messages::OllamaMessage
Build an Ollama message
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 41

def build_message(role:, content: nil, image_url: nil, tool_calls: [], tool_call_id: nil)
  Langchain.logger.warn "WARNING: Image URL is not supported by Ollama currently" if image_url

  Messages::OllamaMessage.new(role: role, content: content, tool_calls: tool_calls, tool_call_id: tool_call_id)
end
#extract_tool_call_args(tool_call:) ⇒ Array
Extract the tool call information from the Ollama tool call hash
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 51

def extract_tool_call_args(tool_call:)
  tool_call_id = tool_call.dig("id")

  function_name = tool_call.dig("function", "name")
  tool_name, method_name = function_name.split("__")

  tool_arguments = tool_call.dig("function", "arguments")
  tool_arguments = if tool_arguments.is_a?(Hash)
    Langchain::Utils::HashTransformer.symbolize_keys(tool_arguments)
  else
    JSON.parse(tool_arguments, symbolize_names: true)
  end

  [tool_call_id, tool_name, method_name, tool_arguments]
end
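A standalone sketch of this extraction (not the gem's code: the input hash shape is a typical tool call, the `"tool__method"` double-underscore convention comes from the method above, and a shallow `transform_keys` stands in for the gem's `HashTransformer.symbolize_keys`):

```ruby
require "json"

# Split a tool-call hash into [id, tool name, method name, symbolized args].
# Arguments may arrive as a Hash or as a JSON string; both are normalized
# to symbol keys.
def extract_tool_call_args_sketch(tool_call)
  tool_call_id = tool_call.dig("id")

  function_name = tool_call.dig("function", "name")
  tool_name, method_name = function_name.split("__")

  args = tool_call.dig("function", "arguments")
  args = args.is_a?(Hash) ? args.transform_keys(&:to_sym) : JSON.parse(args, symbolize_names: true)

  [tool_call_id, tool_name, method_name, args]
end

call = {
  "id" => "call_1",
  "function" => {
    "name" => "calculator__evaluate",
    "arguments" => '{"expression": "1 + 2"}'
  }
}
p extract_tool_call_args_sketch(call)
```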
#support_system_message? ⇒ Boolean
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 81

def support_system_message?
  Messages::OllamaMessage::ROLES.include?("system")
end
#tool_role ⇒ Object
# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 77

def tool_role
  Messages::OllamaMessage::TOOL_ROLE
end