# Namespace: prompt
## Functions
### asOllamaCompletionPromptTemplate

▸ **asOllamaCompletionPromptTemplate**<`SOURCE_PROMPT`>(`promptTemplate`): `TextGenerationPromptTemplate`<`SOURCE_PROMPT`, `OllamaCompletionPrompt`>

#### Type parameters

| Name |
| :--- |
| `SOURCE_PROMPT` |

#### Parameters

| Name | Type |
| :--- | :--- |
| `promptTemplate` | `TextGenerationPromptTemplate`<`SOURCE_PROMPT`, `string`> |

#### Returns

`TextGenerationPromptTemplate`<`SOURCE_PROMPT`, `OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:13
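A minimal sketch of lifting a custom string-producing prompt template into the Ollama completion prompt format. The source template (`myTextTemplate`) and its instruction format are hypothetical, and the example assumes that the `TextGenerationPromptTemplate` type is exported from the `modelfusion` package as referenced in the signature above.

```ts
import { ollama, type TextGenerationPromptTemplate } from "modelfusion";

// Hypothetical source template: formats an instruction object as a plain string prompt.
const myTextTemplate: TextGenerationPromptTemplate<{ instruction: string }, string> = {
  format: ({ instruction }) =>
    `### Instruction:\n${instruction}\n\n### Response:\n`,
  stopSequences: ["### Instruction:"],
};

// Lift the string-producing template into an Ollama completion prompt template.
const myOllamaTemplate =
  ollama.prompt.asOllamaCompletionPromptTemplate(myTextTemplate);
```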
### asOllamaCompletionTextPromptTemplateProvider

▸ **asOllamaCompletionTextPromptTemplateProvider**(`promptTemplateProvider`): `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Parameters

| Name | Type |
| :--- | :--- |
| `promptTemplateProvider` | `TextGenerationPromptTemplateProvider`<`string`> |

#### Returns

`TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:24
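A short sketch of lifting an entire string-based template provider (rather than a single template) into the Ollama completion prompt format. The `myStringProvider` value is hypothetical and only declared, standing in for any `TextGenerationPromptTemplateProvider<string>`; the example assumes that type is exported from the package.

```ts
import { ollama, type TextGenerationPromptTemplateProvider } from "modelfusion";

// Hypothetical: a provider whose templates produce plain string prompts,
// e.g. a custom instruct format for a fine-tuned model.
declare const myStringProvider: TextGenerationPromptTemplateProvider<string>;

// Lift every template in the provider so it produces OllamaCompletionPrompt values
// (assumed to be usable as the promptTemplate of an Ollama CompletionTextGenerator).
const myOllamaProvider =
  ollama.prompt.asOllamaCompletionTextPromptTemplateProvider(myStringProvider);
```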
## Variables
### Alpaca

• `Const` **Alpaca**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:80
### ChatML

• `Const` **ChatML**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:74
### Llama2

• `Const` **Llama2**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:76
### Mistral

• `Const` **Mistral**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

Formats text, instruction, and chat prompts as Mistral instruct prompts.

Note that Mistral does not support system prompts natively. This provider emulates them by sending the system prompt as a leading `[INST]` block.

Text prompt:

`<s>[INST] ${ instruction } [/INST]`

Instruction prompt when a system prompt is set:

`<s>[INST] ${ system prompt } [/INST] </s>[INST] ${ instruction } [/INST] ${ response prefix }`

Instruction prompt when there is no system prompt:

`<s>[INST] ${ instruction } [/INST] ${ response prefix }`

Chat prompt when a system prompt is set:

`<s>[INST] ${ system prompt } [/INST] </s> [INST] ${ user msg 1 } [/INST] ${ model response 1 } [INST] ${ user msg 2 } [/INST] ${ model response 2 } [INST] ${ user msg 3 } [/INST]`

Chat prompt when there is no system prompt:

`<s>[INST] ${ user msg 1 } [/INST] ${ model response 1 } </s>[INST] ${ user msg 2 } [/INST] ${ model response 2 } [INST] ${ user msg 3 } [/INST]`

#### See

https://docs.mistral.ai/models/#chat-template

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:71
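A usage sketch: passing this provider as the prompt template of an Ollama completion model so that the `.with*Prompt()` helpers produce the Mistral instruct format shown above. The object-style `generateText` call and the `CompletionTextGenerator` settings used here (`model`, `promptTemplate`, `maxGenerationTokens`) are assumptions about the surrounding ModelFusion API, not part of this page.

```ts
import { generateText, ollama } from "modelfusion";

// Sketch: the Mistral template provider makes withTextPrompt() wrap the text
// prompt in the Mistral instruct format (<s>[INST] ... [/INST]) shown above.
const text = await generateText({
  model: ollama
    .CompletionTextGenerator({
      model: "mistral", // assumes a locally pulled "mistral" Ollama model
      promptTemplate: ollama.prompt.Mistral,
      maxGenerationTokens: 120,
    })
    .withTextPrompt(),
  prompt: "Write a one-sentence summary of what a prompt template does.",
});
```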
### NeuralChat

• `Const` **NeuralChat**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:78
### Synthia

• `Const` **Synthia**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:82
### Text

• `Const` **Text**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:37
### Vicuna

• `Const` **Vicuna**: `TextGenerationPromptTemplateProvider`<`OllamaCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/ollama/OllamaCompletionPrompt.ts:84