Ollama Chat Model node
The Ollama Chat Model node allows you to use local Llama 2 models with conversational agents.
On this page, you'll find the node parameters for the Ollama Chat Model node, and links to more resources.
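Under the hood, the node connects to a locally running Ollama server over its REST API. As a rough illustration (not part of the node itself), the sketch below shows the kind of chat request involved, assuming Ollama is listening on its default address http://localhost:11434 and a llama2 model has already been pulled:

```typescript
// Minimal sketch of an Ollama chat request, assuming a local Ollama server
// on the default port (http://localhost:11434) and that the "llama2" model
// has already been pulled with `ollama pull llama2`.
async function chatWithLocalModel(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama2",
      messages: [{ role: "user", content: prompt }],
      stream: false, // return one JSON response instead of a token stream
    }),
  });

  const data = await response.json();
  // The assistant's reply is in data.message.content
  return data.message.content;
}

chatWithLocalModel("Why is the sky blue?").then(console.log);
```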
Credentials
You can find authentication information for this node here.
Parameter resolution in sub-nodes
Sub-nodes behave differently to other nodes when processing multiple items using an expression.
Most nodes, including root nodes, take any number of items as input, process these items, and output the results. You can use expressions to refer to input items, and the node resolves the expression for each item in turn. For example, given an input of five name values, an expression referencing the name resolves to each name in turn. In sub-nodes, the expression always resolves to the first item: with the same five name values, it returns the first name every time. The sketch below illustrates the difference.
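The following sketch is a hypothetical illustration of the two behaviours; the item shape and mapping helpers are not n8n internals, just a way to show how the same expression resolves in each case:

```typescript
// Hypothetical illustration only; these structures are not n8n internals.
type Item = { json: { name: string } };

const items: Item[] = [
  { json: { name: "Alice" } },
  { json: { name: "Bob" } },
  { json: { name: "Carol" } },
];

// Most nodes: an expression like {{ $json.name }} is evaluated once per item.
const perItemResolution = items.map((item) => item.json.name);
// => ["Alice", "Bob", "Carol"]

// Sub-nodes: the expression always resolves against the first input item.
const subNodeResolution = items.map(() => items[0].json.name);
// => ["Alice", "Alice", "Alice"]

console.log(perItemResolution, subNodeResolution);
```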
