# Namespace: prompt
## Functions
### asLlamaCppPromptTemplate

▸ **asLlamaCppPromptTemplate**<`SOURCE_PROMPT`>(`promptTemplate`): `TextGenerationPromptTemplate`<`SOURCE_PROMPT`, `LlamaCppCompletionPrompt`>

#### Type parameters

| Name |
| :--- |
| `SOURCE_PROMPT` |

#### Parameters

| Name | Type |
| :--- | :--- |
| `promptTemplate` | `TextGenerationPromptTemplate`<`SOURCE_PROMPT`, `string`> |

#### Returns

`TextGenerationPromptTemplate`<`SOURCE_PROMPT`, `LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:14
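The conversion this function performs can be sketched with minimal stand-in types. The interfaces below are simplified assumptions for illustration, not the actual ModelFusion definitions:

```typescript
// Simplified stand-ins for the ModelFusion types (assumptions, not the real interfaces).
interface TextGenerationPromptTemplate<SOURCE_PROMPT, TARGET_PROMPT> {
  format: (prompt: SOURCE_PROMPT) => TARGET_PROMPT;
  stopSequences: string[];
}

interface LlamaCppCompletionPrompt {
  text: string;
}

// Wraps a string-producing template so that it yields the
// { text } object shape used for llama.cpp completion prompts.
function asLlamaCppPromptTemplate<SOURCE_PROMPT>(
  promptTemplate: TextGenerationPromptTemplate<SOURCE_PROMPT, string>
): TextGenerationPromptTemplate<SOURCE_PROMPT, LlamaCppCompletionPrompt> {
  return {
    format: (prompt) => ({ text: promptTemplate.format(prompt) }),
    stopSequences: promptTemplate.stopSequences,
  };
}

// Usage: a trivial source template that wraps an instruction string.
const instructTemplate = asLlamaCppPromptTemplate<string>({
  format: (instruction) => `### Instruction:\n${instruction}\n### Response:\n`,
  stopSequences: ["###"],
});

console.log(instructTemplate.format("Say hello.").text);
```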
### asLlamaCppTextPromptTemplateProvider

▸ **asLlamaCppTextPromptTemplateProvider**(`promptTemplateProvider`): `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Parameters

| Name | Type |
| :--- | :--- |
| `promptTemplateProvider` | `TextGenerationPromptTemplateProvider`<`string`> |

#### Returns

`TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:25
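The provider-level conversion can be sketched as lifting every template a string provider returns into the llama.cpp completion prompt shape. The interfaces below are illustrative assumptions, not the real ModelFusion definitions, and only a `text()` accessor is modeled:

```typescript
// Simplified stand-ins (assumptions for illustration only).
interface PromptTemplate<TARGET_PROMPT> {
  format: (prompt: string) => TARGET_PROMPT;
  stopSequences: string[];
}

interface PromptTemplateProvider<TARGET_PROMPT> {
  text(): PromptTemplate<TARGET_PROMPT>;
}

interface LlamaCppCompletionPrompt {
  text: string;
}

// Lifts each template from a string-based provider into the
// { text } object shape used for llama.cpp completion prompts.
function asLlamaCppTextPromptTemplateProvider(
  provider: PromptTemplateProvider<string>
): PromptTemplateProvider<LlamaCppCompletionPrompt> {
  const lift = (
    template: PromptTemplate<string>
  ): PromptTemplate<LlamaCppCompletionPrompt> => ({
    format: (prompt) => ({ text: template.format(prompt) }),
    stopSequences: template.stopSequences,
  });
  return { text: () => lift(provider.text()) };
}

// Usage: a provider whose text template passes the prompt through unchanged.
const lifted = asLlamaCppTextPromptTemplateProvider({
  text: () => ({ format: (prompt) => prompt, stopSequences: [] }),
});

console.log(lifted.text().format("Hello").text);
```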
## Variables
### Alpaca

• `Const` **Alpaca**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:78

### BakLLaVA1

• `Const` **BakLLaVA1**: `__module` = `LlamaCppBakLLaVA1Prompt`

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:81

### ChatML

• `Const` **ChatML**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:74

### Llama2

• `Const` **Llama2**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:75
### Mistral

• `Const` **Mistral**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

Formats text, instruction, or chat prompts as a Mistral instruct prompt.

Note that Mistral does not support system prompts. We emulate them.

Text prompt:

```
<s>[INST] ${ instruction } [/INST]
```

Instruction prompt when a system prompt is set:

```
<s>[INST] ${ system prompt } [/INST] </s>[INST] ${ instruction } [/INST] ${ response prefix }
```

Instruction prompt when there is no system prompt:

```
<s>[INST] ${ instruction } [/INST] ${ response prefix }
```

Chat prompt when a system prompt is set:

```
<s>[INST] ${ system prompt } [/INST] </s>[INST] ${ user msg 1 } [/INST] ${ model response 1 } [INST] ${ user msg 2 } [/INST] ${ model response 2 } [INST] ${ user msg 3 } [/INST]
```

Chat prompt when there is no system prompt:

```
<s>[INST] ${ user msg 1 } [/INST] ${ model response 1 } </s>[INST] ${ user msg 2 } [/INST] ${ model response 2 } [INST] ${ user msg 3 } [/INST]
```

**`See`**

https://docs.mistral.ai/models/#chat-template

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:72
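The chat layout documented above can be reproduced with a small self-contained formatter. This is an illustrative sketch of the documented template, not the ModelFusion implementation; the `ChatMessage` type and exact whitespace handling are assumptions:

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

// Renders a chat as a Mistral instruct prompt following the layout above.
// A system prompt, if present, is emulated as a leading [INST] block,
// because Mistral has no native system role. The `</s>` token closes the
// first completed block, as shown in the documented templates.
function formatMistralChatPrompt(
  messages: ChatMessage[],
  system?: string
): string {
  let text = "<s>";
  let closedFirstBlock = false;

  if (system != null) {
    text += `[INST] ${system} [/INST] </s>`;
    closedFirstBlock = true;
  }

  for (const message of messages) {
    if (message.role === "user") {
      text += `[INST] ${message.content} [/INST] `;
    } else {
      text += `${message.content} `;
      if (!closedFirstBlock) {
        text += "</s>";
        closedFirstBlock = true;
      }
    }
  }

  return text.trimEnd();
}

console.log(
  formatMistralChatPrompt([
    { role: "user", content: "Hi" },
    { role: "assistant", content: "Hello!" },
    { role: "user", content: "How are you?" },
  ])
);
```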
### NeuralChat

• `Const` **NeuralChat**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:76

### Synthia

• `Const` **Synthia**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:79

### Text

• `Const` **Text**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:38

### Vicuna

• `Const` **Vicuna**: `TextGenerationPromptTemplateProvider`<`LlamaCppCompletionPrompt`>

#### Defined in

packages/modelfusion/src/model-provider/llamacpp/LlamaCppPrompt.ts:80