Interface defining the parameters used to configure a Hugging Face model for text generation.

interface HFInput {
    model: string;
    apiKey?: string;
    endpointUrl?: string;
    frequencyPenalty?: number;
    includeCredentials?: string | boolean;
    maxTokens?: number;
    stopSequences?: string[];
    temperature?: number;
    topK?: number;
    topP?: number;
}
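
For orientation, here is a minimal sketch of a configuration object that satisfies this interface; the model name and API key value are placeholders, and how the object is ultimately consumed depends on the implementing class.

// A minimal sketch of an HFInput configuration object.
// "gpt2" and the API key value are placeholders, not recommendations.
const hfConfig: HFInput = {
    model: "gpt2",
    apiKey: "hf_xxx",
    maxTokens: 64,
    temperature: 0.7,
    topP: 0.95,
    stopSequences: ["\n\n"],
};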

Properties

model: string

Name of the model to use.

apiKey?: string

API key to use.

endpointUrl?: string

Custom inference endpoint URL to use.

frequencyPenalty?: number

Penalizes repeated tokens according to their frequency.

includeCredentials?: string | boolean

Credentials to use for the request. If this is a string, it is passed through as-is; if it is a boolean, true maps to "include" and false sends no credentials at all.
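
The mapping described above can be illustrated with a small helper; this is a sketch of the documented behavior, not code from the library.

// Illustrative sketch of the documented includeCredentials behavior (not library code).
function resolveCredentials(includeCredentials?: string | boolean): string | undefined {
    if (typeof includeCredentials === "string") {
        return includeCredentials; // strings are passed straight on
    }
    return includeCredentials === true ? "include" : undefined; // true -> "include", false/undefined -> none
}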

maxTokens?: number

Maximum number of tokens to generate in the completion.

stopSequences?: string[]

The model will stop generating text when one of the strings in the list is generated.

temperature?: number

Sampling temperature to use.

topK?: number

Integer defining how many of the highest-probability tokens are considered at each sampling step.

topP?: number

Total probability mass of tokens to consider at each step.
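
To show how these options relate to the request that reaches the model, the sketch below maps them onto snake_case generation parameters; the exact parameter names expected by a given Hugging Face endpoint are an assumption here and should be verified.

// Hypothetical mapping from HFInput fields to generation parameters.
// The snake_case names are assumptions about the target endpoint, not guaranteed.
function toGenerationParameters(input: HFInput): Record<string, unknown> {
    return {
        max_new_tokens: input.maxTokens,
        temperature: input.temperature,
        top_k: input.topK,
        top_p: input.topP,
        repetition_penalty: input.frequencyPenalty, // assumed equivalent; verify for your endpoint
        stop_sequences: input.stopSequences,
    };
}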
