# Documentation Index
Fetch the complete documentation index at: https://docs.getbasalt.ai/llms.txt
Use this file to discover all available pages before exploring further.
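To enumerate the available pages programmatically, the index can be fetched and its links extracted. A minimal sketch, assuming the llms.txt file lists pages as Markdown links (the URL is the one above; the helper names are ours, not part of any SDK):

```python
import re
import urllib.request

INDEX_URL = "https://docs.getbasalt.ai/llms.txt"

def extract_links(llms_txt: str) -> list[tuple[str, str]]:
    # llms.txt conventionally lists pages as Markdown links: - [Title](url)
    return re.findall(r"\[([^\]]+)\]\((\S+?)\)", llms_txt)

def fetch_index(url: str = INDEX_URL) -> list[tuple[str, str]]:
    # Download the index and return (title, url) pairs.
    with urllib.request.urlopen(url) as resp:
        return extract_links(resp.read().decode("utf-8"))
```

For example, `extract_links("- [Quickstart](https://docs.getbasalt.ai/quickstart): intro")` yields `[("Quickstart", "https://docs.getbasalt.ai/quickstart")]`.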
The TypeScript SDK is not yet available for v1. If you’re using TypeScript/JavaScript, we recommend using the v0 docs: v0/quickstart.
# Quickstart (Python)
## Set your API key

```shell
export BASALT_API_KEY="your_api_key_here"
```
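Since the SDK reads the key from the environment at runtime, it can help to fail fast with an actionable message when the variable is missing. A small sketch (the helper name is ours, not part of the SDK):

```python
import os

def require_api_key(name: str = "BASALT_API_KEY") -> str:
    # Fail fast with a clear error instead of a KeyError deep inside the SDK.
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f'{name} is not set; run: export {name}="your_api_key_here"')
    return key
```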
## Initialize the SDK

```python
import os

from basalt import Basalt

basalt = Basalt(api_key=os.environ["BASALT_API_KEY"])
```
## Start observing + get a prompt + call your LLM

This is the minimal “golden path”: 1) start_observe creates a trace, 2) get_sync fetches the prompt, 3) your LLM call runs, 4) you see the trace in Basalt.
```python
import os

from basalt import Basalt
from basalt.observability import ObserveKind, observe, start_observe

basalt = Basalt(api_key=os.environ["BASALT_API_KEY"])

@observe(name="LLM call", kind=ObserveKind.GENERATION)
def call_llm(prompt_text: str) -> str:
    # Replace this with your OpenAI/Anthropic/etc. client call.
    return "..."

@start_observe(feature_slug="quickstart", name="First trace")
def run():
    prompt = basalt.prompts.get_sync(
        slug="my-prompt-slug",
        tag="production",
        variables={"customer_message": "Hello!"},
    )
    output = call_llm(prompt.text)
    observe.set_output({"output": output})
    return output

if __name__ == "__main__":
    print(run())
    basalt.shutdown()
```
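In a real application, the body of call_llm is where your provider call goes. As a provider-agnostic sketch, that call can be injected as a plain callable (the factory name and the stub client are ours; in practice the callable would wrap e.g. an OpenAI or Anthropic chat completion):

```python
from typing import Callable

def make_call_llm(client: Callable[[str], str]) -> Callable[[str], str]:
    # 'client' sends the resolved prompt text to your provider and
    # returns the completion text.
    def call_llm(prompt_text: str) -> str:
        return client(prompt_text)
    return call_llm

# Stub client standing in for a real provider call:
call_llm = make_call_llm(lambda text: f"(model output for: {text!r})")
```

In the quickstart above, the resulting function would still be wrapped with `@observe(name="LLM call", kind=ObserveKind.GENERATION)` so the generation span is recorded.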
## What you get
After running the script, open the Basalt dashboard to see a trace with your root span (start_observe) and your LLM span (observe).
Next: Prompts • Observability • API Reference