From 2b8c9923eb4f9c3728e80c96735d0b12022610d6 Mon Sep 17 00:00:00 2001
From: Christian Tzolov
Date: Sat, 6 Apr 2024 12:28:10 +0200
Subject: [PATCH] Add docs for Ollama's template request parameter

---
 .../main/antora/modules/ROOT/pages/api/chat/ollama-chat.adoc | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/ollama-chat.adoc b/spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/ollama-chat.adoc
index 944d15b36e4..7d8b8c2c20b 100644
--- a/spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/ollama-chat.adoc
+++ b/spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/ollama-chat.adoc
@@ -64,7 +64,8 @@ Here are the advanced request parameter for the Ollama chat client:
 | spring.ai.ollama.chat.enabled | Enable Ollama chat client. | true
 | spring.ai.ollama.chat.options.model | The name of the https://github.com/ollama/ollama?tab=readme-ov-file#model-library[supported models] to use. | mistral
 | spring.ai.ollama.chat.options.format | The format to return a response in. Currently the only accepted value is `json` | -
-| spring.ai.ollama.chat.options.keep_alive | controls how long the model will stay loaded into memory following the request | 5m
+| spring.ai.ollama.chat.options.keep_alive | Controls how long the model will stay loaded into memory following the request | 5m
+| spring.ai.ollama.chat.options.template | The prompt template to use (overrides what is defined in the Modelfile) | -
 |====
 
 The `options` properties are based on the link:https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values[Ollama Valid Parameters and Values] and link:https://github.com/jmorganca/ollama/blob/main/api/types.go[Ollama Types]. The default values are based on: link:https://github.com/ollama/ollama/blob/b538dc3858014f94b099730a592751a5454cab0a/api/types.go#L364[Ollama type defaults].
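
For reference, the property documented by this patch is set the same way as the other `spring.ai.ollama.chat.options.*` entries in the table, i.e. via Spring Boot configuration properties. A minimal sketch (the model name and template value below are illustrative, not taken from the patch; the template syntax shown is Ollama's Go-template style):

```properties
# application.properties — hedged example of the documented Ollama chat options
spring.ai.ollama.chat.enabled=true
spring.ai.ollama.chat.options.model=mistral
# Keep the model loaded for 10 minutes after each request
spring.ai.ollama.chat.options.keep_alive=10m
# Override the prompt template defined in the model's Modelfile
spring.ai.ollama.chat.options.template={{ .System }} USER: {{ .Prompt }}
```

Spring Boot's relaxed binding also accepts equivalent forms such as `keep-alive` for `keep_alive`; the snake_case names above mirror the table in the patched documentation.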