Class: PromptTemplateFullTextModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Type parameters
| Name | Type |
| --- | --- |
| PROMPT | PROMPT |
| MODEL_PROMPT | MODEL_PROMPT |
| SETTINGS | extends TextGenerationModelSettings |
| MODEL | extends TextStreamingModel<MODEL_PROMPT, SETTINGS> & ToolCallGenerationModel<MODEL_PROMPT, SETTINGS> & ToolCallsGenerationModel<MODEL_PROMPT, SETTINGS> |
Hierarchy
- PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
  ↳ PromptTemplateFullTextModel
Implements
- TextStreamingModel<PROMPT, SETTINGS>
- ToolCallGenerationModel<PROMPT, SETTINGS>
- ToolCallsGenerationModel<PROMPT, SETTINGS>
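Conceptually, this class wraps an underlying model that expects MODEL_PROMPT and exposes it under the outer PROMPT type by running every prompt through a template. The following is a minimal, self-contained sketch of that wrapper pattern, not the actual ModelFusion implementation; the interface and class names here are simplified stand-ins for illustration.

```typescript
// Simplified stand-in for the library's text generation model interface.
interface SimpleTextModel<PROMPT> {
  generateText(prompt: PROMPT): Promise<string>;
}

// A template maps the outer prompt type to the model's native prompt type.
interface SimplePromptTemplate<PROMPT, MODEL_PROMPT> {
  format(prompt: PROMPT): MODEL_PROMPT;
}

// The wrapper accepts PROMPT, applies the template, and delegates to the
// inner model, which only understands MODEL_PROMPT.
class SimplePromptTemplateModel<PROMPT, MODEL_PROMPT>
  implements SimpleTextModel<PROMPT>
{
  constructor(
    private readonly model: SimpleTextModel<MODEL_PROMPT>,
    private readonly template: SimplePromptTemplate<PROMPT, MODEL_PROMPT>
  ) {}

  generateText(prompt: PROMPT): Promise<string> {
    return this.model.generateText(this.template.format(prompt));
  }
}

// Usage: wrap a string-prompt model so callers pass { instruction: string }.
const innerModel: SimpleTextModel<string> = {
  generateText: async (p) => `echo: ${p}`,
};
const wrapped = new SimplePromptTemplateModel<{ instruction: string }, string>(
  innerModel,
  { format: (p) => p.instruction }
);
```

Because the wrapper delegates rather than reimplements, accessors such as contextWindowSize below simply forward to the inner model.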
Accessors
contextWindowSize
• get contextWindowSize(): MODEL["contextWindowSize"]
Returns
MODEL["contextWindowSize"]
Implementation of
TextStreamingModel.contextWindowSize
Inherited from
PromptTemplateTextStreamingModel.contextWindowSize
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:49
countPromptTokens
• get countPromptTokens(): MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>
Optional. Implement if you have a tokenizer and want to count the number of tokens in a prompt.
Returns
MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>
Implementation of
TextStreamingModel.countPromptTokens
Inherited from
PromptTemplateTextStreamingModel.countPromptTokens
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:53
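The conditional return type above preserves the optionality of the wrapped model's tokenizer at the type level: if the inner model has no countPromptTokens, the accessor's type is undefined; otherwise it is a function that takes the outer PROMPT type. A hedged sketch of this pattern (the type alias and interfaces here are illustrative, not from the library):

```typescript
// Mirrors the documented conditional type:
// MODEL["countPromptTokens"] extends undefined
//   ? undefined
//   : (prompt: PROMPT) => PromiseLike<number>
type CountPromptTokens<
  MODEL extends { countPromptTokens?: unknown },
  PROMPT
> = MODEL["countPromptTokens"] extends undefined
  ? undefined
  : (prompt: PROMPT) => PromiseLike<number>;

interface ModelWithTokenizer {
  countPromptTokens: (prompt: string) => PromiseLike<number>;
}
interface ModelWithoutTokenizer {
  countPromptTokens?: undefined;
}

// Compile-time checks: WithCounter resolves to a function type taking the
// outer prompt type; WithoutCounter resolves to undefined.
type WithCounter = CountPromptTokens<ModelWithTokenizer, { text: string }>;
type WithoutCounter = CountPromptTokens<ModelWithoutTokenizer, { text: string }>;

// Runtime illustration: a model whose token count is just the prompt length.
const withTokenizer: ModelWithTokenizer = {
  countPromptTokens: async (prompt) => prompt.length,
};
```

This design lets callers statically distinguish models that support token counting from those that do not, instead of checking for undefined at runtime everywhere.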
modelInformation
• get modelInformation(): ModelInformation