The GradientLLM class provides access to models hosted on Gradient AI inference endpoints. It requires a Gradient AI access token, which is loaded automatically from the environment if not supplied explicitly.

Hierarchy

  • LLM<BaseLLMCallOptions>
    • GradientLLM

Constructors

  • Parameters: an options object whose entries correspond to the properties listed under Properties below

    Returns GradientLLM

Properties

baseModel: any

    The underlying Gradient SDK model handle (untyped).

modelSlug: string = "llama2-7b-chat"

    Slug of the Gradient model to run; defaults to "llama2-7b-chat".

gradientAccessKey?: string

    Gradient AI access token; loaded from the environment if omitted.

inferenceParameters?: Record<string, unknown>

    Additional parameters forwarded to the inference call.

workspaceId?: string

    Gradient AI workspace ID; loaded from the environment if omitted.
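
The constructor takes these properties as a single options object. A minimal sketch of how the defaulting behaves (the interface and `withDefaults` helper below are illustrative, not the library source, and `maxGeneratedTokenCount` is assumed as an example inference parameter):

```typescript
// Illustrative only: mirrors the documented properties, not the library source.
interface GradientLLMParams {
  modelSlug?: string;
  gradientAccessKey?: string;
  inferenceParameters?: Record<string, unknown>;
  workspaceId?: string;
}

// Apply the documented default for modelSlug. In the real class, the
// access key and workspace ID are also autoloaded from the environment
// when omitted.
function withDefaults(fields: GradientLLMParams = {}): GradientLLMParams {
  return { modelSlug: "llama2-7b-chat", ...fields };
}

const params = withDefaults({
  inferenceParameters: { maxGeneratedTokenCount: 20 }, // hypothetical parameter
});
console.log(params.modelSlug); // "llama2-7b-chat"
```

Passing `modelSlug` explicitly overrides the default, while omitted credentials fall back to the environment as described above.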

Methods

  • Returns Promise<void>

Generated using TypeDoc