Class: PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Type parameters
Name | Type |
---|---|
PROMPT | PROMPT |
MODEL_PROMPT | MODEL_PROMPT |
SETTINGS | extends TextGenerationModelSettings |
MODEL | extends TextStreamingModel<MODEL_PROMPT, SETTINGS> |
Hierarchy
- PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
  ↳ PromptTemplateTextStreamingModel
Implements
- TextStreamingModel<PROMPT, SETTINGS>
Accessors
contextWindowSize
• get contextWindowSize(): MODEL["contextWindowSize"]
Returns
MODEL["contextWindowSize"]
Implementation of
TextStreamingModel.contextWindowSize
Inherited from
PromptTemplateTextGenerationModel.contextWindowSize
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:49
countPromptTokens
• get countPromptTokens(): MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>
Optional. Implement if you have a tokenizer and want to count the number of tokens in a prompt.
Returns
MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>
Implementation of
TextStreamingModel.countPromptTokens
Inherited from
PromptTemplateTextGenerationModel.countPromptTokens
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:53
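The conditional type above means the wrapper only exposes a token-counting function when the underlying model provides one, and that it maps its own PROMPT through the template before counting. A minimal self-contained sketch of this delegation pattern (the class and model shapes here are illustrative, not the modelfusion source):

```typescript
// Illustrative sketch: a template wrapper that forwards token counting
// to the underlying model, formatting the prompt first.
type InstructionPrompt = { instruction: string };

interface InnerModel {
  // The inner model counts tokens on its own prompt format (a string here).
  countPromptTokens?: (prompt: string) => PromiseLike<number>;
}

class TemplateWrapper {
  constructor(
    readonly model: InnerModel,
    readonly format: (prompt: InstructionPrompt) => string
  ) {}

  // Mirrors: MODEL["countPromptTokens"] extends undefined
  //   ? undefined : (prompt: PROMPT) => PromiseLike<number>
  get countPromptTokens():
    | undefined
    | ((prompt: InstructionPrompt) => PromiseLike<number>) {
    const inner = this.model.countPromptTokens;
    if (inner === undefined) return undefined;
    // Map the wrapper's PROMPT into the model's prompt before counting.
    return (prompt: InstructionPrompt) => inner(this.format(prompt));
  }
}

const wrapper = new TemplateWrapper(
  { countPromptTokens: async (p) => p.split(/\s+/).length }, // toy "tokenizer"
  (p) => `### Instruction:\n${p.instruction}`
);
```

When the inner model omits countPromptTokens, the getter yields undefined rather than a function that would fail at runtime, which is what the conditional type encodes.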
modelInformation
• get modelInformation(): ModelInformation
Returns
ModelInformation
Implementation of
TextStreamingModel.modelInformation
Inherited from
PromptTemplateTextGenerationModel.modelInformation
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:37
settings
• get settings(): SETTINGS
Returns
SETTINGS
Implementation of
TextStreamingModel.settings
Inherited from
PromptTemplateTextGenerationModel.settings
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:41
settingsForEvent
• get settingsForEvent(): Partial<SETTINGS>
Returns settings that should be recorded in observability events. Security-related settings (e.g., API keys) should not be included here.
Returns
Partial<SETTINGS>
Implementation of
TextStreamingModel.settingsForEvent
Inherited from
PromptTemplateTextGenerationModel.settingsForEvent
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:83
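A hedged sketch of what an implementation of this contract typically does (the settings shape and field names below are invented for illustration, not taken from modelfusion):

```typescript
// Illustrative only: return the subset of settings that is safe to log,
// stripping security-related fields such as API keys.
interface MySettings {
  apiKey?: string;
  temperature?: number;
  maxGenerationTokens?: number;
}

function settingsForEvent(settings: MySettings): Partial<MySettings> {
  // Rest-destructuring copies everything except the excluded secret field.
  const { apiKey, ...recordable } = settings;
  return recordable;
}

const recorded = settingsForEvent({
  apiKey: "sk-secret",
  temperature: 0.7,
  maxGenerationTokens: 500,
});
```

Observability hooks receive only the `recorded` object, so secrets never enter event logs.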
tokenizer
• get tokenizer(): MODEL["tokenizer"]
Returns
MODEL["tokenizer"]
Implementation of
TextStreamingModel.tokenizer
Inherited from
PromptTemplateTextGenerationModel.tokenizer
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:45
Constructors
constructor
• new PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>(options): PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Type parameters
Name | Type |
---|---|
PROMPT | PROMPT |
MODEL_PROMPT | MODEL_PROMPT |
SETTINGS | extends TextGenerationModelSettings |
MODEL | extends TextStreamingModel<MODEL_PROMPT, SETTINGS> |
Parameters
Name | Type |
---|---|
options | Object |
options.model | MODEL |
options.promptTemplate | TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT> |
Returns
PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>
Overrides
PromptTemplateTextGenerationModel.constructor
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextStreamingModel.ts:27
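The constructor takes a single options object with the wrapped model and the prompt template that maps PROMPT into MODEL_PROMPT. The following self-contained sketch mirrors only that `{ model, promptTemplate }` shape; the stand-in classes and the `format`/`stopSequences` fields are assumptions for illustration, not the real modelfusion types:

```typescript
// Stand-in for TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT>.
type PromptTemplate<PROMPT, MODEL_PROMPT> = {
  format: (prompt: PROMPT) => MODEL_PROMPT;
  stopSequences: string[];
};

// Stand-in for a text generation model that takes string prompts.
class FakeModel {
  generate(prompt: string): string {
    return `echo: ${prompt}`;
  }
}

// Mirrors the documented constructor options: { model, promptTemplate }.
class TemplatedModel<PROMPT> {
  readonly model: FakeModel;
  readonly promptTemplate: PromptTemplate<PROMPT, string>;

  constructor(options: {
    model: FakeModel;
    promptTemplate: PromptTemplate<PROMPT, string>;
  }) {
    this.model = options.model;
    this.promptTemplate = options.promptTemplate;
  }

  generate(prompt: PROMPT): string {
    // Apply the template, then delegate to the wrapped model.
    return this.model.generate(this.promptTemplate.format(prompt));
  }
}

const templated = new TemplatedModel<{ instruction: string }>({
  model: new FakeModel(),
  promptTemplate: {
    format: (p) => `Instruction: ${p.instruction}`,
    stopSequences: [],
  },
});
```

Callers keep working with their own prompt type; the template-to-model mapping happens once, inside the wrapper.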
Methods
asObjectGenerationModel
▸ asObjectGenerationModel<INPUT_PROMPT>(promptTemplate): ObjectFromTextStreamingModel<INPUT_PROMPT, PROMPT, PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Type parameters
Name |
---|
INPUT_PROMPT |
Parameters
Name | Type |
---|---|
promptTemplate | ObjectFromTextPromptTemplate<INPUT_PROMPT, PROMPT> |
Returns
ObjectFromTextStreamingModel<INPUT_PROMPT, PROMPT, PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Overrides
PromptTemplateTextGenerationModel.asObjectGenerationModel
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextStreamingModel.ts:43
asToolCallGenerationModel
▸ asToolCallGenerationModel<INPUT_PROMPT>(promptTemplate): TextGenerationToolCallModel<INPUT_PROMPT, PROMPT, PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Type parameters
Name |
---|
INPUT_PROMPT |
Parameters
Name | Type |
---|---|
promptTemplate | ToolCallPromptTemplate<INPUT_PROMPT, PROMPT> |
Returns
TextGenerationToolCallModel<INPUT_PROMPT, PROMPT, PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Inherited from
PromptTemplateTextGenerationModel.asToolCallGenerationModel
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:87
asToolCallsOrTextGenerationModel
▸ asToolCallsOrTextGenerationModel<INPUT_PROMPT>(promptTemplate): TextGenerationToolCallsModel<INPUT_PROMPT, PROMPT, PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Type parameters
Name |
---|
INPUT_PROMPT |
Parameters
Name | Type |
---|---|
promptTemplate | ToolCallsPromptTemplate<INPUT_PROMPT, PROMPT> |
Returns
TextGenerationToolCallsModel<INPUT_PROMPT, PROMPT, PromptTemplateTextStreamingModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>
Inherited from
PromptTemplateTextGenerationModel.asToolCallsOrTextGenerationModel
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:96
doGenerateTexts
▸ doGenerateTexts(prompt, options?): PromiseLike<{ rawResponse: unknown; textGenerationResults: TextGenerationResult[]; usage?: { completionTokens: number; promptTokens: number; totalTokens: number } }>
Parameters
Name | Type |
---|---|
prompt | PROMPT |
options? | FunctionCallOptions |
Returns
PromiseLike<{ rawResponse: unknown; textGenerationResults: TextGenerationResult[]; usage?: { completionTokens: number; promptTokens: number; totalTokens: number } }>
Implementation of
TextStreamingModel.doGenerateTexts
Inherited from
PromptTemplateTextGenerationModel.doGenerateTexts
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:74
doStreamText
▸ doStreamText(prompt, options?): PromiseLike<AsyncIterable<Delta<unknown>>>
Parameters
Name | Type |
---|---|
prompt | PROMPT |
options? | FunctionCallOptions |
Returns
PromiseLike<AsyncIterable<Delta<unknown>>>
Implementation of
TextStreamingModel.doStreamText
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextStreamingModel.ts:34
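Together, doStreamText and extractTextDelta form the streaming contract: the stream yields deltas of an unknown shape, and extractTextDelta pulls the text fragment (or undefined) out of each one. The sketch below is self-contained; the delta shape and function bodies are invented for illustration, only the signatures mirror the documented API:

```typescript
// Invented delta shape for illustration; the real deltas are model-specific.
type DemoDelta = { type: "text"; text: string } | { type: "done" };

// Stands in for doStreamText: an async iterable of deltas.
async function* doStreamText(): AsyncIterable<DemoDelta> {
  yield { type: "text", text: "Hello, " };
  yield { type: "text", text: "world!" };
  yield { type: "done" };
}

// Mirrors extractTextDelta(delta: unknown): undefined | string.
function extractTextDelta(delta: DemoDelta): string | undefined {
  return delta.type === "text" ? delta.text : undefined;
}

// Typical consumer: accumulate text, skipping deltas that carry none.
async function collectText(): Promise<string> {
  let text = "";
  for await (const delta of doStreamText()) {
    const chunk = extractTextDelta(delta);
    if (chunk !== undefined) text += chunk;
  }
  return text;
}
```

Returning undefined for non-text deltas lets consumers filter with a single check instead of knowing every delta variant.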
extractTextDelta
▸ extractTextDelta(delta): undefined | string
Parameters
Name | Type |
---|---|
delta | unknown |
Returns
undefined | string
Implementation of
TextStreamingModel.extractTextDelta
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextStreamingModel.ts:39
restoreGeneratedTexts
▸ restoreGeneratedTexts(rawResponse): Object
Parameters
Name | Type |
---|---|
rawResponse | unknown |
Returns
Object
Name | Type |
---|---|
rawResponse | unknown |
textGenerationResults | TextGenerationResult[] |
usage? | { completionTokens: number; promptTokens: number; totalTokens: number } |
usage.completionTokens | number |
usage.promptTokens | number |
usage.totalTokens | number |
Implementation of
TextStreamingModel.restoreGeneratedTexts
Inherited from
PromptTemplateTextGenerationModel.restoreGeneratedTexts
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:79
withJsonOutput
▸ withJsonOutput(schema): this
When possible, limits output generation to the specified JSON schema, or to supersets of it (e.g., JSON in general).
Parameters
Name | Type |
---|---|
schema | Schema<unknown> & JsonSchemaProducer |
Returns
this
Implementation of
TextStreamingModel.withJsonOutput
Overrides
PromptTemplateTextGenerationModel.withJsonOutput
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextStreamingModel.ts:52
withSettings
▸ withSettings(additionalSettings): this
The withSettings method creates a new model with the same configuration as the original model, but with the specified settings changed.
Parameters
Name | Type |
---|---|
additionalSettings | Partial<SETTINGS> |
Returns
this
Example
```ts
const model = new OpenAICompletionModel({
  model: "gpt-3.5-turbo-instruct",
  maxGenerationTokens: 500,
});
const modelWithMoreTokens = model.withSettings({
  maxGenerationTokens: 1000,
});
```
Implementation of
TextStreamingModel.withSettings
Overrides
PromptTemplateTextGenerationModel.withSettings
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextStreamingModel.ts:59
Properties
model
• Readonly model: MODEL
Inherited from
PromptTemplateTextGenerationModel.model
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:23
promptTemplate
• Readonly promptTemplate: TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT>
Inherited from
PromptTemplateTextGenerationModel.promptTemplate
Defined in
packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:24