The TypeScript SDK is not yet available for v1. If you’re using TypeScript/JavaScript, we recommend using the v0 docs: v0/quickstart.
Prerequisites: Before you begin, make sure you have an account on Basalt.

Quickstart (Python)

1. Install the SDK

pip install basalt-sdk
2. Set your API key

export BASALT_API_KEY="your_api_key_here"
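If you want your script to fail fast with a clear message when the key is missing, you can check for it before initializing the SDK. This is an optional sketch, not part of the Basalt SDK; the `require_api_key` helper name is our own:

```python
import os

def require_api_key(env=os.environ):
    """Return the Basalt API key from the environment, or fail fast."""
    key = env.get("BASALT_API_KEY")
    if not key:
        raise RuntimeError(
            "BASALT_API_KEY is not set; export it before running the quickstart."
        )
    return key
```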
3. Initialize the SDK

import os
from basalt import Basalt

basalt = Basalt(api_key=os.environ["BASALT_API_KEY"])
4. Start observing + get a prompt + call your LLM

This is the minimal “golden path”:
  1. start_observe creates a trace
  2. get_sync fetches the prompt
  3. your code calls the LLM
  4. the trace appears in Basalt
import os
from basalt import Basalt
from basalt.observability import ObserveKind, observe, start_observe

basalt = Basalt(api_key=os.environ["BASALT_API_KEY"])

@observe(name="LLM call", kind=ObserveKind.GENERATION)
def call_llm(prompt_text: str) -> str:
    # Replace this with your OpenAI/Anthropic/etc. client call.
    return "..."

@start_observe(feature_slug="quickstart", name="First trace")
def run():
    prompt = basalt.prompts.get_sync(
        slug="my-prompt-slug",
        tag="production",
        variables={"customer_message": "Hello!"},
    )
    output = call_llm(prompt.text)
    observe.set_output({"output": output})
    return output

if __name__ == "__main__":
    print(run())
    basalt.shutdown()
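To make the script produce real completions, replace the stub body of call_llm with your provider's client call. Here is a hedged sketch assuming the official OpenAI Python client; the model name and client usage are illustrative assumptions, not part of the Basalt SDK, and the stub return is kept as a fallback so the script still runs without a provider key:

```python
def call_llm(prompt_text: str) -> str:
    # Hypothetical swap-in: the official OpenAI client (assumes `openai`
    # is installed and OPENAI_API_KEY is set). Uncomment to use:
    #
    #   from openai import OpenAI
    #   client = OpenAI()
    #   response = client.chat.completions.create(
    #       model="gpt-4o-mini",  # illustrative model name
    #       messages=[{"role": "user", "content": prompt_text}],
    #   )
    #   return response.choices[0].message.content
    #
    # Stub fallback so the quickstart runs without a provider key.
    return "..."
```

Keep the @observe(kind=ObserveKind.GENERATION) decorator on this function, as in the script above, so the LLM call is recorded as its own span.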

What you get

After running the script, open the Basalt dashboard to see a trace with your root span (start_observe) and your LLM span (observe). Example trace in Basalt.