Class: Langchain::Assistant::LLM::Adapters::Ollama

Inherits:
  Base < Object
Defined in:
lib/langchain/assistant/llm/adapters/ollama.rb


Instance Method Details

#allowed_tool_choices ⇒ Object

Get the allowed assistant.tool_choice values for Ollama



# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 71

def allowed_tool_choices
  ["auto", "none"]
end
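A hedged usage sketch: an assistant might validate a requested `tool_choice` against these allowed values before sending a request. The `validate_tool_choice!` helper below is hypothetical, not part of the adapter:

```ruby
# Hypothetical validation helper mirroring how the allowed_tool_choices
# values (["auto", "none"]) could gate an assistant's tool_choice setting.
ALLOWED_TOOL_CHOICES = ["auto", "none"].freeze

def validate_tool_choice!(tool_choice)
  unless ALLOWED_TOOL_CHOICES.include?(tool_choice)
    raise ArgumentError,
      "Invalid tool_choice: #{tool_choice.inspect}; allowed: #{ALLOWED_TOOL_CHOICES.join(", ")}"
  end
  tool_choice
end
```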

#available_tool_names(tools) ⇒ Object

Get the names of the available tools for the Ollama LLM



# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 66

def available_tool_names(tools)
  build_tools(tools).map { |tool| tool.dig(:function, :name) }
end
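To illustrate, a standalone sketch of the name extraction. The tool hashes below are illustrative stand-ins for the output of `build_tools(tools)`, which follows an OpenAI-style `{type:, function: {name:}}` shape:

```ruby
# Illustrative stand-ins for built tool definitions; the real hashes are
# produced by build_tools(tools) from each tool's function schemas.
built_tools = [
  {type: "function", function: {name: "calculator__execute"}},
  {type: "function", function: {name: "weather__current_weather"}}
]

# Same extraction as available_tool_names: dig out each function's name.
tool_names = built_tools.map { |tool| tool.dig(:function, :name) }
```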

#build_chat_params(messages:, instructions:, tools:, tool_choice:, parallel_tool_calls:) ⇒ Hash

Build the chat parameters for the Ollama LLM

Parameters:

  • messages (Array)

    The messages

  • instructions (String)

    The system instructions

  • tools (Array)

    The tools to use

  • tool_choice (String)

    The tool choice

  • parallel_tool_calls (Boolean)

    Whether to make parallel tool calls

Returns:

  • (Hash)

    The chat parameters



# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 16

def build_chat_params(
  messages:,
  instructions:,
  tools:,
  tool_choice:,
  parallel_tool_calls:
)
  Langchain.logger.warn "WARNING: `parallel_tool_calls:` is not supported by Ollama currently"
  Langchain.logger.warn "WARNING: `tool_choice:` is not supported by Ollama currently"

  params = {messages: messages}
  if tools.any?
    params[:tools] = build_tools(tools)
  end
  params
end

#build_message(role:, content: nil, image_url: nil, tool_calls: [], tool_call_id: nil) ⇒ Messages::OllamaMessage

Build an Ollama message

Parameters:

  • role (String)

    The role of the message

  • content (String) (defaults to: nil)

    The content of the message

  • image_url (String) (defaults to: nil)

    The image URL

  • tool_calls (Array) (defaults to: [])

    The tool calls

  • tool_call_id (String) (defaults to: nil)

    The tool call ID

Returns:

  • (Messages::OllamaMessage)

    The Ollama message


# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 41

def build_message(role:, content: nil, image_url: nil, tool_calls: [], tool_call_id: nil)
  Messages::OllamaMessage.new(role: role, content: content, image_url: image_url, tool_calls: tool_calls, tool_call_id: tool_call_id)
end

#extract_tool_call_args(tool_call:) ⇒ Array

Extract the tool call information from the Ollama tool call hash

Parameters:

  • tool_call (Hash)

    The tool call hash

Returns:

  • (Array)

    The tool call information



# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 49

def extract_tool_call_args(tool_call:)
  tool_call_id = tool_call.dig("id")

  function_name = tool_call.dig("function", "name")
  tool_name, method_name = function_name.split("__")

  tool_arguments = tool_call.dig("function", "arguments")
  tool_arguments = if tool_arguments.is_a?(Hash)
    Langchain::Utils::HashTransformer.symbolize_keys(tool_arguments)
  else
    JSON.parse(tool_arguments, symbolize_names: true)
  end

  [tool_call_id, tool_name, method_name, tool_arguments]
end
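A self-contained sketch of the same parsing logic. The sample tool call hash is illustrative, and keys are symbolized shallowly here for brevity, whereas the adapter uses `Langchain::Utils::HashTransformer.symbolize_keys`:

```ruby
require "json"

# Reproduces the extraction: split the "tool__method" function name and
# normalize the arguments, which Ollama may return as a Hash or a JSON
# string.
def extract_tool_call_args_sketch(tool_call:)
  tool_call_id = tool_call.dig("id")
  tool_name, method_name = tool_call.dig("function", "name").split("__")

  args = tool_call.dig("function", "arguments")
  args = if args.is_a?(Hash)
    args.transform_keys(&:to_sym) # shallow; the adapter symbolizes deeply
  else
    JSON.parse(args, symbolize_names: true)
  end

  [tool_call_id, tool_name, method_name, args]
end

sample = {
  "id" => "call_1",
  "function" => {"name" => "calculator__execute", "arguments" => '{"input": "2+2"}'}
}
id, tool, meth, args = extract_tool_call_args_sketch(tool_call: sample)
```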

#support_system_message? ⇒ Boolean

Whether the Ollama message roles include a system role

Returns:

  • (Boolean)


# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 79

def support_system_message?
  Messages::OllamaMessage::ROLES.include?("system")
end

#tool_role ⇒ Object

Get the role used for tool messages



# File 'lib/langchain/assistant/llm/adapters/ollama.rb', line 75

def tool_role
  Messages::OllamaMessage::TOOL_ROLE
end