Class: Vellum::RegisteredPromptsClient

Inherits:
Object
Defined in:
lib/vellum_ai/registered_prompts/client.rb

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(request_client:) ⇒ RegisteredPromptsClient

Parameters:

  • request_client (RequestClient)

# File 'lib/vellum_ai/registered_prompts/client.rb', line 16

def initialize(request_client:)
  # @type [RequestClient]
  @request_client = request_client
end
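
A minimal construction sketch. The RequestClient setup shown here is an assumption for illustration, not a documented constructor signature; only the RegisteredPromptsClient call reflects the code above:

# Hypothetical setup; exact RequestClient arguments are assumed
request_client = Vellum::RequestClient.new(api_key: ENV["VELLUM_API_KEY"])
client = Vellum::RegisteredPromptsClient.new(request_client: request_client)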

Instance Attribute Details

#request_client ⇒ Object (readonly)

Returns the value of attribute request_client.



# File 'lib/vellum_ai/registered_prompts/client.rb', line 12

def request_client
  @request_client
end

Instance Method Details

#register_prompt(label:, name:, prompt:, model:, parameters:, provider: nil, meta: nil, request_options: nil) ⇒ RegisterPromptResponse

Registers a prompt within Vellum and creates associated Vellum entities. Intended to be used by integration partners, not directly by Vellum users.

Under the hood, this endpoint creates a new sandbox, a new model version, and a new deployment.

Parameters:

  • label (String)

    A human-friendly label for corresponding entities created in Vellum.

  • name (String)

    A uniquely-identifying name for corresponding entities created in Vellum.

  • prompt (Hash)

    Information about how to execute the prompt template. Request of type RegisterPromptPromptInfoRequest, as a Hash

    • :prompt_block_data (Hash)

      • :version (Integer)

      • :blocks (Array<PromptTemplateBlockRequest>)

    • :input_variables (Array<RegisteredPromptInputVariableRequest>)

  • provider (PROVIDER_ENUM) (defaults to: nil)

    The initial LLM provider to use for this prompt

    • `ANTHROPIC` - Anthropic

    • `AWS_BEDROCK` - AWS Bedrock

    • `AZURE_OPENAI` - Azure OpenAI

    • `COHERE` - Cohere

    • `GOOGLE` - Google

    • `HOSTED` - Hosted

    • `MOSAICML` - MosaicML

    • `OPENAI` - OpenAI

    • `FIREWORKS_AI` - Fireworks AI

    • `HUGGINGFACE` - HuggingFace

    • `MYSTIC` - Mystic

    • `PYQ` - Pyq

    • `REPLICATE` - Replicate

  • model (String)

    The initial model to use for this prompt

  • parameters (Hash)

    The initial model parameters to use for this prompt. Request of type RegisterPromptModelParametersRequest, as a Hash

    • :temperature (Float)

    • :max_tokens (Integer)

    • :stop (Array<String>)

    • :top_p (Float)

    • :top_k (Integer)

    • :frequency_penalty (Float)

    • :presence_penalty (Float)

    • :logit_bias (Hash{String => String})

    • :custom_parameters (Hash{String => String})

  • meta (Hash{String => String}) (defaults to: nil)

    Optionally include additional metadata to store along with the prompt.

  • request_options (RequestOptions) (defaults to: nil)

Returns:

  • (RegisterPromptResponse)

# File 'lib/vellum_ai/registered_prompts/client.rb', line 61

def register_prompt(label:, name:, prompt:, model:, parameters:, provider: nil, meta: nil, request_options: nil)
  response = @request_client.conn.post do |req|
    # Apply per-request overrides from RequestOptions, when provided
    req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
    req.headers["X_API_KEY"] = request_options.api_key unless request_options&.api_key.nil?
    req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
    # Merge any extra body parameters with the endpoint's own fields,
    # dropping nil values before serialization
    req.body = {
      **(request_options&.additional_body_parameters || {}),
      label: label,
      name: name,
      prompt: prompt,
      provider: provider,
      model: model,
      parameters: parameters,
      meta: meta
    }.compact
    req.url "#{@request_client.default_environment[:Default]}/v1/registered-prompts/register"
  end
  RegisterPromptResponse.from_json(json_object: response.body)
end
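
A usage sketch for this endpoint, assuming a client built as in the constructor example above. Every field value below is illustrative, and the hash shapes follow the parameter documentation rather than a verified payload:

# Illustrative call; values are assumptions, shapes mirror the docs above
response = client.register_prompt(
  label: "Support Reply Drafting",
  name: "support-reply-drafting",
  prompt: {
    prompt_block_data: {
      version: 1,
      blocks: [] # Array<PromptTemplateBlockRequest>; block contents omitted
    },
    input_variables: [] # Array<RegisteredPromptInputVariableRequest>; left empty here
  },
  provider: "OPENAI", # one of the PROVIDER_ENUM values listed above
  model: "gpt-3.5-turbo",
  parameters: {
    temperature: 0.7,
    max_tokens: 256,
    stop: [],
    top_p: 1.0,
    frequency_penalty: 0.0,
    presence_penalty: 0.0,
    logit_bias: {},
    custom_parameters: {}
  },
  meta: { "source" => "partner-integration" }
)
# response is a RegisterPromptResponse parsed from the JSON response body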