Class: PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Type parameters
Name | Type |
---|---|
PROMPT | PROMPT |
MODEL_PROMPT | MODEL_PROMPT |
SETTINGS | extends TextGenerationModelSettings |
MODEL | extends TextGenerationModel<MODEL_PROMPT, SETTINGS> |
Hierarchy
- PromptTemplateTextGenerationModel
Implements
- TextGenerationModel<PROMPT, SETTINGS>
Accessors
contextWindowSize
• get contextWindowSize(): MODEL["contextWindowSize"]
Returns
MODEL["contextWindowSize"]
Implementation of
TextGenerationModel.contextWindowSize
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:49
countPromptTokens
• get countPromptTokens(): MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>
Optional. Implement if you have a tokenizer and want to count the number of tokens in a prompt.
Returns
MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>
Implementation of
TextGenerationModel.countPromptTokens
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:53
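A minimal usage sketch, assuming the wrapped model exposes a tokenizer (so `countPromptTokens` is defined) and that `PROMPT` is a plain string; `model` is an illustrative instance of this class:

```ts
// Assumption: `model` is a PromptTemplateTextGenerationModel whose underlying
// model implements countPromptTokens, and PROMPT is `string`.
if (model.countPromptTokens != null) {
  const tokenCount = await model.countPromptTokens("Write a haiku about autumn.");
  console.log(`Prompt uses ${tokenCount} tokens.`);
}
```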
modelInformation
• get modelInformation(): ModelInformation
Returns
ModelInformation
Implementation of
TextGenerationModel.modelInformation
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:37
settings
• get settings(): SETTINGS
Returns
SETTINGS
Implementation of
TextGenerationModel.settings
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:41
settingsForEvent
• get settingsForEvent(): Partial<SETTINGS>
Returns settings that should be recorded in observability events. Security-related settings (e.g. API keys) should not be included here.
Returns
Partial<SETTINGS>
Implementation of
TextGenerationModel.settingsForEvent
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:83
tokenizer
• get tokenizer(): MODEL["tokenizer"]
Returns
MODEL["tokenizer"]
Implementation of
TextGenerationModel.tokenizer
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:45
Constructors
constructor
• new PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>(«destructured»): PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Type parameters
Name | Type |
---|---|
PROMPT | PROMPT |
MODEL_PROMPT | MODEL_PROMPT |
SETTINGS | extends TextGenerationModelSettings |
MODEL | extends TextGenerationModel<MODEL_PROMPT, SETTINGS> |
Parameters
Name | Type |
---|---|
«destructured» | Object |
› model | MODEL |
› promptTemplate | TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT> |
Returns
PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:26
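A minimal construction sketch, assuming the outer `PROMPT` is an instruction string, the wrapped model (`baseModel`, a placeholder here) also accepts string prompts, and the prompt template follows the `TextGenerationPromptTemplate` shape (a `format` function plus `stopSequences`):

```ts
// `baseModel` is a placeholder for any TextGenerationModel that takes string prompts.
const instructionModel = new PromptTemplateTextGenerationModel({
  model: baseModel,
  promptTemplate: {
    // Map the outer PROMPT (an instruction string) into the wrapped model's prompt format.
    format: (instruction: string) =>
      `### Instruction:\n${instruction}\n\n### Response:\n`,
    stopSequences: ["### Instruction:"],
  },
});
```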
Methods
asObjectGenerationModel
▸ asObjectGenerationModel<INPUT_PROMPT>(promptTemplate): ObjectFromTextGenerationModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Type parameters
Name |
---|
INPUT_PROMPT |
Parameters
Name | Type |
---|---|
promptTemplate | ObjectFromTextPromptTemplate<INPUT_PROMPT, PROMPT> |
Returns
ObjectFromTextGenerationModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:105
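A hedged sketch of the adapter, where `myObjectPromptTemplate` is a placeholder for an `ObjectFromTextPromptTemplate<INPUT_PROMPT, PROMPT>` you already have, and `instructionModel` carries over from the constructor example:

```ts
// `myObjectPromptTemplate` is assumed to implement ObjectFromTextPromptTemplate.
const objectModel = instructionModel.asObjectGenerationModel(myObjectPromptTemplate);
// objectModel can now be used wherever an ObjectFromTextGenerationModel is expected.
```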
asToolCallGenerationModel
▸ asToolCallGenerationModel<INPUT_PROMPT>(promptTemplate): TextGenerationToolCallModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Type parameters
Name |
---|
INPUT_PROMPT |
Parameters
Name | Type |
---|---|
promptTemplate | ToolCallPromptTemplate<INPUT_PROMPT, PROMPT> |
Returns
TextGenerationToolCallModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:87
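An analogous hedged sketch for single tool calls; `myToolCallPromptTemplate` is a placeholder for a `ToolCallPromptTemplate<INPUT_PROMPT, PROMPT>`:

```ts
// `myToolCallPromptTemplate` is assumed to implement ToolCallPromptTemplate.
const toolCallModel = instructionModel.asToolCallGenerationModel(myToolCallPromptTemplate);
```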
asToolCallsOrTextGenerationModel
▸ asToolCallsOrTextGenerationModel<INPUT_PROMPT>(promptTemplate): TextGenerationToolCallsModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Type parameters
Name |
---|
INPUT_PROMPT |
Parameters
Name | Type |
---|---|
promptTemplate | ToolCallsPromptTemplate<INPUT_PROMPT, PROMPT> |
Returns
TextGenerationToolCallsModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:96
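And the same pattern for the tool-calls-or-text variant; `myToolCallsPromptTemplate` stands in for a `ToolCallsPromptTemplate<INPUT_PROMPT, PROMPT>`:

```ts
// `myToolCallsPromptTemplate` is assumed to implement ToolCallsPromptTemplate.
const toolCallsOrTextModel = instructionModel.asToolCallsOrTextGenerationModel(
  myToolCallsPromptTemplate
);
```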
doGenerateTexts
▸ doGenerateTexts(prompt, options?): PromiseLike<{ rawResponse: unknown; textGenerationResults: TextGenerationResult[]; usage?: { completionTokens: number; promptTokens: number; totalTokens: number } }>
Parameters
Name | Type |
---|---|
prompt | PROMPT |
options? | FunctionCallOptions |
Returns
PromiseLike<{ rawResponse: unknown; textGenerationResults: TextGenerationResult[]; usage?: { completionTokens: number; promptTokens: number; totalTokens: number } }>
Implementation of
TextGenerationModel.doGenerateTexts
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:74
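A hedged low-level sketch; in application code you would typically go through the library's higher-level text generation function instead. It reuses the illustrative `instructionModel` from the constructor example and assumes each `TextGenerationResult` exposes the generated `text`:

```ts
const { textGenerationResults, usage } = await instructionModel.doGenerateTexts(
  "Write a haiku about autumn."
);
console.log(textGenerationResults[0].text);
if (usage != null) {
  console.log(`Total tokens: ${usage.totalTokens}`);
}
```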
restoreGeneratedTexts
▸ restoreGeneratedTexts(rawResponse): Object
Parameters
Name | Type |
---|---|
rawResponse | unknown |
Returns
Object
Name | Type |
---|---|
rawResponse | unknown |
textGenerationResults | TextGenerationResult[] |
usage? | { completionTokens: number; promptTokens: number; totalTokens: number } |
usage.completionTokens | number |
usage.promptTokens | number |
usage.totalTokens | number |
Implementation of
TextGenerationModel.restoreGeneratedTexts
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:79
withJsonOutput
▸ withJsonOutput(schema): this
When possible, limit the output generation to the specified JSON schema, or supersets of it (e.g. JSON in general).
Parameters
Name | Type |
---|---|
schema | Schema<unknown> & JsonSchemaProducer |
Returns
this
Implementation of
TextGenerationModel.withJsonOutput
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:114
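A hedged sketch that restricts output to a Zod-derived schema; it assumes a helper such as `zodSchema` that produces a `Schema<unknown> & JsonSchemaProducer` from a Zod schema, and reuses the illustrative `instructionModel`:

```ts
import { z } from "zod";

// Assumption: zodSchema(...) yields a Schema that also implements JsonSchemaProducer.
const jsonModel = instructionModel.withJsonOutput(
  zodSchema(z.object({ title: z.string(), tags: z.array(z.string()) }))
);
```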
withSettings
▸ withSettings(additionalSettings): this
The withSettings method creates a new model with the same configuration as the original model, but with the specified settings changed.
Parameters
Name | Type |
---|---|
additionalSettings | Partial<SETTINGS> |
Returns
this
Example

```ts
const model = new OpenAICompletionModel({
  model: "gpt-3.5-turbo-instruct",
  maxGenerationTokens: 500,
});
const modelWithMoreTokens = model.withSettings({
  maxGenerationTokens: 1000,
});
```
Implementation of
TextGenerationModel.withSettings
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:121
Properties
model
• Readonly model: MODEL
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:23
promptTemplate
• Readonly promptTemplate: TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT>
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:24