Method: Langchain::LLM::LlamaCpp#complete
- Defined in:
- lib/langchain/llm/llama_cpp.rb
#complete(prompt:, n_predict: 128) ⇒ String
Returns (String) — the completion generated for the prompt.
# File 'lib/langchain/llm/llama_cpp.rb', line 51

def complete(prompt:, n_predict: 128)
  # contexts do not appear to be stateful when it comes to completion, so re-use the same one
  context = completion_context
  ::LLaMACpp.generate(context, prompt, n_predict: n_predict)
end
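A minimal usage sketch. It assumes the llama_cpp gem is installed and that a local GGUF model file is available; the `model_path` value and the `n_gpu_layers` option shown here are assumptions for illustration, not part of this method's signature. Running it requires a real model file, so treat it as a sketch rather than a verified script.

```ruby
require "langchain"

# Assumed: LLAMA_MODEL_PATH points at a local GGUF model file.
llm = Langchain::LLM::LlamaCpp.new(
  model_path: ENV["LLAMA_MODEL_PATH"],
  n_gpu_layers: 1 # assumption: offload one layer to the GPU if available
)

# n_predict caps the number of tokens generated (defaults to 128).
text = llm.complete(prompt: "Once upon a time", n_predict: 64)
puts text
```

Because the same completion context is reused across calls, successive `complete` calls do not carry conversational state from one prompt to the next.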