Using Prompts in Python
The Basalt Python SDK gives you a simple, high-level interface to work with prompts from your Python code. You can list prompts, fetch specific versions or tags, inspect metadata, and control which prompt version is used in each environment. Typical use cases include:
- Connecting your app to centrally managed prompts
- Rolling out new prompt versions safely via tags
- Injecting runtime data into prompts with variables
- Keeping model configuration close to prompt content
Initialization
Initialize the client once and reuse it across your application.
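A minimal initialization sketch, assuming the package exposes a `Basalt` client class that accepts an `api_key` argument (the import path and constructor signature here are assumptions; check the Python SDK reference for the exact form):

```python
import os

from basalt import Basalt  # assumed import path and client class name

# Create one client at startup and share it across your application.
basalt = Basalt(api_key=os.environ["BASALT_API_KEY"])
```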
Core Methods
All operations are available in both synchronous and asynchronous forms (a usage sketch follows the list):
- `basalt.prompts.list_sync()` / `basalt.prompts.list_async()`: List all prompts accessible to your API key, with their basic metadata (slug, description, latest version, tags).
- `basalt.prompts.get_sync(...)` / `basalt.prompts.get(...)`: Retrieve a specific prompt's rendered text and model configuration, using either a `tag` (e.g. `production`) or a specific `version`.
- `basalt.prompts.describe_sync(slug)`: Get detailed metadata for a single prompt, including all available versions and tags, without fetching the full prompt text. Async variants may also be available; refer to the Python SDK reference.
- `basalt.prompt.publish_sync(...)` / `basalt.prompt.publish(...)`: Publish a version to a tag (for example, mark version `1.3.0` as `production`).
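A sketch of how these methods fit together; the slug, keyword arguments, and return shapes below are illustrative assumptions rather than exact signatures:

```python
# List prompts and print basic metadata.
for p in basalt.prompts.list_sync():
    print(p.slug, p.description)

# Inspect available versions and tags without fetching the full prompt text.
info = basalt.prompts.describe_sync("welcome-message")  # hypothetical slug

# Fetch the version currently tagged "production" ...
prompt = basalt.prompts.get_sync("welcome-message", tag="production")

# ... or pin an exact version.
prompt = basalt.prompts.get_sync("welcome-message", version="1.2.0")

# Promote version 1.3.0 to the production tag.
basalt.prompt.publish_sync("welcome-message", version="1.3.0", tag="production")
```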
Prompt Object
When you retrieve a prompt, you receive a structured object containing:
- `slug`: Unique identifier for the prompt
- `version`: The concrete version number being used (e.g. `"1.2.0"`)
- `text`: The final prompt text (after variable substitution, if provided)
- `description`: Human-readable description of the prompt's purpose
- `model`: Model configuration object:
  - `provider`: e.g. `"openai"`, `"anthropic"`
  - `model`: e.g. `"gpt-4.1"`, `"claude-3-opus"`
  - `parameters`: e.g. `temperature`, `max_tokens`, `top_p`
- `tags`: List of tags currently pointing to this version (e.g. `["latest", "staging"]`)
You can pass `prompt.text` and `prompt.model` directly into your LLM client of choice.
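For example, a sketch of wiring the resolved prompt into the OpenAI client, assuming `parameters` is a plain dict; OpenAI is used purely as one illustrative downstream client:

```python
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model=prompt.model.model,  # e.g. "gpt-4.1"
    temperature=prompt.model.parameters.get("temperature", 1.0),
    messages=[{"role": "user", "content": prompt.text}],
)
print(response.choices[0].message.content)
```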
Variable Substitution
Prompts can contain variables using Jinja2 syntax (for example, `{{ user_name }}`). When calling `get_sync` / `get`, pass a `variables` dictionary:
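A short sketch, using a hypothetical slug and variable names:

```python
# The prompt template is assumed to contain placeholders such as
# {{ user_name }} and {{ plan }}.
prompt = basalt.prompts.get_sync(
    "onboarding-email",  # hypothetical slug
    tag="production",
    variables={"user_name": "Ada", "plan": "Pro"},
)
print(prompt.text)  # placeholders replaced with the provided values
```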
If you omit `variables`, the prompt is returned with raw placeholders, which is useful for debugging or previewing templates.
Caching
The SDK caches prompt resolutions for a short time to reduce latency and API calls. No configuration is required for most apps.
Error Handling
The SDK raises specific exceptions so you can react appropriately:
- `NotFoundError`: The prompt, tag, or version doesn't exist.
- `UnauthorizedError`: Invalid or missing API key.
- `NetworkError`: Network connectivity issues when calling the Basalt API.
- `BasaltAPIError`: Other API-level errors (validation, server issues, etc.).
Wrap calls in try/except blocks and handle these errors based on your application's needs (e.g. fall back to a default prompt, log and return a safe message, etc.).
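A defensive-fetching sketch; the exception import path and the client calls are assumptions based on the names above:

```python
import logging

# Assumed import path; adjust to match the SDK's actual exception module.
from basalt.exceptions import BasaltAPIError, NetworkError, NotFoundError

DEFAULT_WELCOME_PROMPT = "Welcome! How can I help you today?"  # local fallback

try:
    prompt = basalt.prompts.get_sync("welcome-message", tag="production")
    text = prompt.text
except NotFoundError:
    # Prompt, tag, or version missing: fall back to the bundled default.
    text = DEFAULT_WELCOME_PROMPT
except (NetworkError, BasaltAPIError) as exc:
    logging.warning("Basalt unavailable, using default prompt: %s", exc)
    text = DEFAULT_WELCOME_PROMPT
```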