The GradientLLMParams interface defines the input parameters for the GradientLLM class.

Properties

adapterId?: string

Gradient Adapter ID for custom fine-tuned models.

cache?: boolean | BaseCache<Generation[]>
callbackManager?: CallbackManager

Deprecated

Use callbacks instead

callbacks?: Callbacks
concurrency?: number

Deprecated

Use maxConcurrency instead

gradientAccessKey?: string

Gradient AI access token. Provide the token here if you do not wish it to be pulled automatically from the environment.

inferenceParameters?: Record<string, unknown>

Parameters accepted by the Gradient npm package.

maxConcurrency?: number

The maximum number of concurrent calls that can be made. Defaults to Infinity, which means no limit.

maxRetries?: number

The maximum number of retries that can be made for a single call, with an exponential backoff between each attempt. Defaults to 6.
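To illustrate the behavior this parameter controls, here is a minimal, self-contained sketch of retrying a call with exponential backoff. This is not LangChain's actual implementation; the function name and delay base are made up for illustration.

```typescript
// Illustrative only: retry `fn` up to `maxRetries` additional times after the
// first attempt, doubling the delay between attempts (100ms, 200ms, 400ms, ...).
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 6,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break; // retries exhausted
      // Exponential backoff before the next attempt.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

With `maxRetries: 6`, a call that keeps failing is attempted 7 times in total (the initial call plus 6 retries) before the last error is rethrown.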

metadata?: Record<string, unknown>
modelSlug?: string

Gradient AI Model Slug.

onFailedAttempt?: FailedAttemptHandler

Custom handler for failed attempts. It receives the originally thrown error object as input and should itself throw an error if the input error is not retryable.
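A hedged sketch of such a handler is shown below. The `status` property is an assumption made for illustration; real error objects from the underlying client may be shaped differently.

```typescript
// Hypothetical handler: treat HTTP 4xx-style errors as non-retryable by
// rethrowing them, and let all other errors be retried.
function onFailedAttempt(error: Error & { status?: number }): void {
  if (error.status !== undefined && error.status >= 400 && error.status < 500) {
    throw error; // non-retryable: abort further attempts
  }
  // Returning normally allows the retry loop to continue.
}
```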

tags?: string[]
verbose?: boolean
workspaceId?: string

Gradient workspace ID. Provide the workspace ID here if you do not wish it to be pulled automatically from the environment.
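Putting the properties together, a params object might look like the sketch below. The interface is a local stand-in mirroring the fields documented above (in real code you would pass the object straight to the `GradientLLM` constructor), and the model slug and credential strings are placeholders, not verified values.

```typescript
// Local stand-in for GradientLLMParams, for illustration only.
interface GradientLLMParamsSketch {
  adapterId?: string;
  gradientAccessKey?: string;
  workspaceId?: string;
  modelSlug?: string;
  inferenceParameters?: Record<string, unknown>;
  maxConcurrency?: number;
  maxRetries?: number;
}

const params: GradientLLMParamsSketch = {
  modelSlug: "llama2-7b-chat",            // hypothetical slug for illustration
  gradientAccessKey: "YOUR_ACCESS_TOKEN", // omit to pull from env instead
  workspaceId: "YOUR_WORKSPACE_ID",       // omit to pull from env instead
  inferenceParameters: { temperature: 0.7 }, // forwarded to the Gradient npm package
  maxRetries: 6,                          // the documented default
  maxConcurrency: 2,                      // cap parallel calls (default is Infinity)
};
```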

Generated using TypeDoc