
Class: PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>

Type parameters

PROMPT
MODEL_PROMPT
SETTINGS extends TextGenerationModelSettings
MODEL extends TextGenerationModel<MODEL_PROMPT, SETTINGS>

Hierarchy

PromptTemplateTextGenerationModel

Implements

TextGenerationModel<PROMPT, SETTINGS>

Accessors

contextWindowSize

get contextWindowSize(): MODEL["contextWindowSize"]

Returns

MODEL["contextWindowSize"]

Implementation of

TextGenerationModel.contextWindowSize

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:49


countPromptTokens

get countPromptTokens(): MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>

Optional. Implement if you have a tokenizer and want to count the number of tokens in a prompt.

Returns

MODEL["countPromptTokens"] extends undefined ? undefined : (prompt: PROMPT) => PromiseLike<number>

Implementation of

TextGenerationModel.countPromptTokens

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:53
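
Usage sketch (hedged): templatedModel below is a hypothetical PromptTemplateTextGenerationModel instance whose wrapped model provides a tokenizer; the getter is either undefined or a token-counting function.

// Hypothetical instance; not defined on this page.
const countTokens = templatedModel.countPromptTokens;

if (countTokens != null) {
  // The argument is a template-level PROMPT value (a plain string here for illustration).
  const tokenCount = await countTokens("Write a haiku about the sea.");
  console.log(`prompt tokens: ${tokenCount}`);
}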


modelInformation

get modelInformation(): ModelInformation

Returns

ModelInformation

Implementation of

TextGenerationModel.modelInformation

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:37


settings

get settings(): SETTINGS

Returns

SETTINGS

Implementation of

TextGenerationModel.settings

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:41


settingsForEvent

get settingsForEvent(): Partial<SETTINGS>

Returns settings that should be recorded in observability events. Security-related settings (e.g. API keys) should not be included here.

Returns

Partial<SETTINGS>

Implementation of

TextGenerationModel.settingsForEvent

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:83
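
For example, a logging observer can read this getter instead of settings so that secrets stay out of recorded events (illustrative sketch; templatedModel is a hypothetical instance):

// Only non-sensitive settings are exposed via settingsForEvent.
const eventPayload = {
  model: templatedModel.modelInformation,
  settings: templatedModel.settingsForEvent, // Partial<SETTINGS>, without API keys
};
console.log(JSON.stringify(eventPayload));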


tokenizer

get tokenizer(): MODEL["tokenizer"]

Returns

MODEL["tokenizer"]

Implementation of

TextGenerationModel.tokenizer

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:45

Constructors

constructor

new PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>(«destructured»): PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>

Type parameters

PROMPT
MODEL_PROMPT
SETTINGS extends TextGenerationModelSettings
MODEL extends TextGenerationModel<MODEL_PROMPT, SETTINGS>

Parameters

«destructured»: Object
  model: MODEL
  promptTemplate: TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT>

Returns

PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:26
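
A minimal construction sketch, assuming a base model (baseModel, not defined on this page) that accepts plain string prompts, and an instruction-style object as the template-level PROMPT; the format/stopSequences shape follows the TextGenerationPromptTemplate interface:

import { PromptTemplateTextGenerationModel } from "modelfusion";

// Assumption: baseModel is some TextGenerationModel whose MODEL_PROMPT is string.
declare const baseModel: any;

// Illustrative template-level prompt type.
type InstructionPrompt = { instruction: string };

const templatedModel = new PromptTemplateTextGenerationModel({
  model: baseModel,
  promptTemplate: {
    // Map the template-level prompt to the prompt format the base model expects.
    format: (prompt: InstructionPrompt) =>
      `### Instruction:\n${prompt.instruction}\n### Response:\n`,
    // Stop sequences contributed by the template.
    stopSequences: ["### Instruction:"],
  },
});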

Methods

asObjectGenerationModel

asObjectGenerationModel<INPUT_PROMPT>(promptTemplate): ObjectFromTextGenerationModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>

Type parameters

INPUT_PROMPT

Parameters

promptTemplate: ObjectFromTextPromptTemplate<INPUT_PROMPT, PROMPT>

Returns

ObjectFromTextGenerationModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:105


asToolCallGenerationModel

asToolCallGenerationModel<INPUT_PROMPT>(promptTemplate): TextGenerationToolCallModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>

Type parameters

INPUT_PROMPT

Parameters

promptTemplate: ToolCallPromptTemplate<INPUT_PROMPT, PROMPT>

Returns

TextGenerationToolCallModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:87


asToolCallsOrTextGenerationModel

asToolCallsOrTextGenerationModel<INPUT_PROMPT>(promptTemplate): TextGenerationToolCallsModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>

Type parameters

INPUT_PROMPT

Parameters

promptTemplate: ToolCallsPromptTemplate<INPUT_PROMPT, PROMPT>

Returns

TextGenerationToolCallsModel<INPUT_PROMPT, PROMPT, PromptTemplateTextGenerationModel<PROMPT, MODEL_PROMPT, SETTINGS, MODEL>>

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:96


doGenerateTexts

doGenerateTexts(prompt, options?): PromiseLike<{ rawResponse: unknown ; textGenerationResults: TextGenerationResult[] ; usage?: { completionTokens: number ; promptTokens: number ; totalTokens: number } }>

Parameters

prompt: PROMPT
options?: FunctionCallOptions

Returns

PromiseLike<{ rawResponse: unknown ; textGenerationResults: TextGenerationResult[] ; usage?: { completionTokens: number ; promptTokens: number ; totalTokens: number } }>

Implementation of

TextGenerationModel.doGenerateTexts

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:74
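
Call sketch (hedged): doGenerateTexts applies the prompt template and delegates to the wrapped model; templatedModel and the instruction-style prompt are the illustrative names from the construction sketch above.

const { textGenerationResults, usage } = await templatedModel.doGenerateTexts({
  instruction: "Summarize the text.", // a PROMPT value
});

console.log(textGenerationResults); // TextGenerationResult[]
console.log(usage?.totalTokens); // undefined when the provider reports no usage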


restoreGeneratedTexts

restoreGeneratedTexts(rawResponse): Object

Parameters

rawResponse: unknown

Returns

Object

rawResponse: unknown
textGenerationResults: TextGenerationResult[]
usage?: { completionTokens: number; promptTokens: number; totalTokens: number }

Implementation of

TextGenerationModel.restoreGeneratedTexts

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:79


withJsonOutput

withJsonOutput(schema): this

When possible, limit the output generation to the specified JSON schema, or supersets of it (e.g. JSON in general).

Parameters

schema: Schema<unknown> & JsonSchemaProducer

Returns

this

Implementation of

TextGenerationModel.withJsonOutput

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:114
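
A hedged sketch that builds the schema argument with modelfusion's zodSchema helper (assumed here) and the zod library; templatedModel is a hypothetical instance:

import { zodSchema } from "modelfusion";
import { z } from "zod";

// Ask the model to constrain its output to this schema where the provider supports it.
const jsonModel = templatedModel.withJsonOutput(
  zodSchema(z.object({ title: z.string(), tags: z.array(z.string()) })),
);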


withSettings

withSettings(additionalSettings): this

The withSettings method creates a new model with the same configuration as the original model, but with the specified settings changed.

Parameters

additionalSettings: Partial<SETTINGS>

Returns

this

Example

const model = new OpenAICompletionModel({
  model: "gpt-3.5-turbo-instruct",
  maxGenerationTokens: 500,
});

const modelWithMoreTokens = model.withSettings({
  maxGenerationTokens: 1000,
});

Implementation of

TextGenerationModel.withSettings

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:121
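
The same method is available on the wrapper itself; a sketch (templatedModel as in the construction sketch, maxGenerationTokens as in the example above):

// Produces a new model of the same type with the overridden settings applied,
// per the withSettings contract described above; the prompt template is kept.
const shorterModel = templatedModel.withSettings({ maxGenerationTokens: 256 });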

Properties

model

Readonly model: MODEL

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:23


promptTemplate

Readonly promptTemplate: TextGenerationPromptTemplate<PROMPT, MODEL_PROMPT>

Defined in

packages/modelfusion/src/model-function/generate-text/PromptTemplateTextGenerationModel.ts:24