Prompts

Prompts are a fundamental part of the system, allowing you to maintain and version-control your prompts. On this page, we'll explore how to retrieve prompts and their specific versions programmatically: how to fetch prompts by their unique identifiers, and how version resolution works.

The prompt model

The prompt model contains all the information about your prompts, including their content, versions, and model configurations. Each prompt can have multiple versions and can be tagged for different environments or purposes.

Properties

  • slug (string)

    Unique identifier for the prompt.

  • text (string)

    The actual content of the prompt version.

  • model (object)

    Configuration for the LLM model.

    {
      provider: 'anthropic' | 'open-ai' | 'mistral' | 'gemini';
      model: string;
      version: string | 'latest';
      parameters: {
        temperature: number;
        topP: number;
        frequencyPenalty?: number;
        presencePenalty?: number;
        topK?: number;
        maxLength: number;
        responseFormat: ResponseFormat;
        jsonObject?: Record<string, any>;
      };
    }

  • version (string)

    Version identifier for the prompt (e.g., "1.0").

  • tag (string)

    Optional tag for the prompt version (e.g., "production", "staging"). The "latest" tag is always available. The "production" tag is linked to the version you deployed in the app.
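
Putting the properties together, a prompt roughly corresponds to the TypeScript shape below. This is a sketch for orientation only: ModelConfig stands for the model object documented above, systemText is inferred from the response examples further down, and the optionality of each field is assumed rather than documented.

// Rough sketch of the prompt model; ModelConfig refers to the model object documented above.
interface Prompt {
  slug: string;        // unique identifier for the prompt
  text: string;        // content of the prompt version
  systemText?: string; // appears in the response examples; assumed optional
  model: ModelConfig;  // provider, model, version and parameters, as shown above
  version: string;     // e.g. "1.0"
  tag?: string;        // e.g. "production", "staging", "latest"
}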

Supported Models

The system supports various LLM providers and their respective models:

Anthropic

  • 3.5-sonnet
  • 3-sonnet
  • 3-haiku

OpenAI

  • gpt-4o
  • gpt-4o-mini
  • gpt-3.5-turbo
  • o1-preview
  • o1-mini

Mistral

  • mistral-large
  • mistral-8x7B
  • mistral-7b

Gemini

  • gemini-1.5-flash
  • gemini-1.5-flash-8b
  • gemini-1.5-pro
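
If you want to validate provider and model names client-side, the lists above can be captured in a constant like the following. This is purely illustrative; the API remains the source of truth for which models are supported.

// Illustrative map of the providers and models listed above.
const SUPPORTED_MODELS = {
  anthropic: ['3.5-sonnet', '3-sonnet', '3-haiku'],
  'open-ai': ['gpt-4o', 'gpt-4o-mini', 'gpt-3.5-turbo', 'o1-preview', 'o1-mini'],
  mistral: ['mistral-large', 'mistral-8x7B', 'mistral-7b'],
  gemini: ['gemini-1.5-flash', 'gemini-1.5-flash-8b', 'gemini-1.5-pro'],
} as const;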

GET /prompts/:slug

Retrieve a prompt

This endpoint allows you to retrieve a specific prompt by its slug. You can optionally specify a version or tag to get a particular variant of the prompt.

Optional parameters

  • version (string)

    Specific version of the prompt to retrieve.

  • tag (string)

    Tag identifier to fetch a specific prompt version.

Version resolution

When retrieving a prompt, the system follows this hierarchy (sketched in code after the list):

  1. Uses the specified version if provided (ignoring the tag if both are provided)
  2. Uses the specified tag if provided
  3. Falls back to the "production" tagged version if neither a version nor a tag is provided, or if the specified one is not found
  4. Returns 404 if no matching version is found and the production version is also missing
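
Conceptually, the hierarchy behaves like the helper below. This is purely illustrative (the server-side implementation is not documented) and reuses the Prompt shape sketched earlier.

// Illustrative only: mirrors the documented resolution order, not the actual server code.
function resolvePrompt(
  candidates: Prompt[], // every stored version of the prompt addressed by the slug
  version?: string,
  tag?: string,
): Prompt {
  // 1. An explicit version wins; any tag passed alongside it is ignored (with a warning).
  if (version) {
    const bySpecifiedVersion = candidates.find((p) => p.version === version);
    if (bySpecifiedVersion) return bySpecifiedVersion;
  } else if (tag) {
    // 2. Otherwise, try the requested tag.
    const byTag = candidates.find((p) => p.tag === tag);
    if (byTag) return byTag;
  }

  // 3. Fall back to the "production" tagged version.
  const production = candidates.find((p) => p.tag === 'production');
  if (production) return production;

  // 4. Nothing matched and production is missing: the API responds with 404.
  throw new Error('No prompt found at the specified version or tag. The production version is also missing.');
}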

Workspace context

All requests must include workspace authentication. The endpoint will only return prompts that belong to the authenticated workspace's scope.

Request

GET
/prompts/customer-support
curl -G https://api.getbasalt.ai/prompts/customer-support \
  -H "Authorization: Bearer {token}" \
  -d version=1.0 \
  -d tag=staging

Response 200

{
  "prompt": {
    "text": "You are a helpful customer support agent. Help the customer with their inquiry: {{inquiry}}",
    "systemText": "You are a helpful customer support agent.",
    "model": {
      "provider": "anthropic",
      "model": "3.5-sonnet",
      "version": "latest",
      "parameters": {
        "temperature": 0.7,
        "topP": 0.95,
        "maxLength": 1000,
        "responseFormat": "text"
      }
    }
  },
  "warning": "Both version and tag were provided. Ignoring the tag."
}

Response 404 (Prompt not found)

{
  "error": "No prompt with slug \"customer-support\" exists."
}

Response 404 (Version not found)

{
  "error": "No prompt found at the specified version or tag. The production version is also missing. Please check on the app."
}

Response 404 (No production version)

{
  "error": "No production version found. Please check on the app."
}
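
For reference, here is a minimal TypeScript sketch of calling this endpoint and handling the 404 responses above. The helper name, the environment variable holding the token, and the error handling style are assumptions rather than part of an official SDK, and the return type reuses the Prompt shape sketched earlier.

// Minimal sketch: fetch a prompt by slug with optional version/tag query parameters.
async function getPrompt(
  slug: string,
  params: { version?: string; tag?: string } = {},
): Promise<{ prompt: Prompt; warning?: string }> {
  const query = new URLSearchParams();
  if (params.version) query.set('version', params.version);
  if (params.tag) query.set('tag', params.tag);

  const response = await fetch(`https://api.getbasalt.ai/prompts/${slug}?${query}`, {
    headers: { Authorization: `Bearer ${process.env.BASALT_API_TOKEN}` }, // token variable name is an assumption
  });

  if (response.status === 404) {
    // e.g. 'No prompt with slug "customer-support" exists.'
    const { error } = await response.json();
    throw new Error(error);
  }

  return response.json(); // { prompt, warning? }
}

// Example: retrieve version 1.0 of the customer-support prompt.
const { prompt } = await getPrompt('customer-support', { version: '1.0' });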

Using versions and tags

When working with prompts, you can use either versions or tags to get specific variants of your prompts. Versions are typically used for tracking specific iterations (like "1.0.0"), while tags are used for environment management (like "production" or "staging").

Version and tag priority

  • If both version and tag are provided, version takes precedence
  • If neither is provided, the system attempts to find the "production" tagged version
  • The tag filtering system is being enhanced to support more complex scenarios

Request examples

GET
/prompts/welcome-email
curl -G https://api.getbasalt.ai/prompts/welcome-email \
  -H "Authorization: Bearer {token}" \
  -d version=1.0.0
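
The same prompt can instead be fetched by tag, for example with the hypothetical getPrompt helper sketched earlier (the "staging" tag is assumed here for illustration):

// Fetch by tag rather than version (the tag name is illustrative).
const staging = await getPrompt('welcome-email', { tag: 'staging' });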

Response warnings

The API may return warning messages in certain scenarios while still successfully returning the prompt content. These warnings help you understand when fallbacks or parameter precedence rules have been applied.

Possible warnings

  • When both version and tag are provided, the version takes precedence and the tag is ignored
  • When a specified version is not found, the system falls back to the production version
  • When a specified tag is not found, the system falls back to the production version

The warning field in the response will be omitted if no warning conditions are triggered.
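
If you are consuming the API from code, it is worth surfacing that field, for instance with the getPrompt helper sketched earlier:

const { prompt, warning } = await getPrompt('customer-support', {
  version: '1.0',
  tag: 'staging', // providing both triggers the "Ignoring the tag" warning
});

if (warning) {
  console.warn(`Prompt API warning: ${warning}`);
}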

Response examples

GET
/prompts/customer-support
{
  "prompt": {
    "text": "You are a helpful customer support agent...",
    "model": {
      "provider": "anthropic",
      "model": "3.5-sonnet",
      "version": "latest",
      "parameters": {
        "temperature": 0.7,
        "topP": 0.95,
        "maxLength": 1000,
        "responseFormat": "text"
      }
    }
  },
  "warning": "Both version and tag were provided. Ignoring the tag."
}
