
Overview

Basalt’s observability system gives you end-to-end visibility into your AI workloads—from HTTP handlers and background jobs down to prompts, LLM calls, tools, and evaluators. It is built on OpenTelemetry and centered on two primitives:
  • start_observe – creates root spans that represent entire requests or workflows
  • observe – creates child spans for nested operations (LLM calls, RAG, tools, etc.)

Major v1 changes

  • Unified observe / start_observe API for tracing, logging, and context
  • Full OpenTelemetry support with automatic context propagation (sync and async)
  • Auto-instrumentation for LLMs, vector DBs, and popular frameworks
  • First-class identity, experiments, and evaluators attached to traces
  • Consistent APIs for sync and async functions (same decorators / context managers)

Root spans with start_observe

Every trace starts with a root span created by start_observe. Use this at the entry points of your system (HTTP handlers, workers, CLI commands).
from basalt.observability import start_observe, observe

@start_observe(
    feature_slug="request-processing",
    name="process_request",
    identity={
        "user": {"id": "user_123", "name": "Alice"},
        "organization": {"id": "org_abc"},
    },
    metadata={"environment": "production", "version": "2.0"},
)
def process():
    sub_task()
    return "done"

@observe(name="sub_task")
def sub_task():
    pass
All spans created under this root automatically share identity, experiment, and context.
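To make the sharing behavior concrete, here is a minimal sketch (not Basalt's actual internals) of how a root decorator can stash shared context in a `contextvars.ContextVar` so that child spans created deeper in the call stack inherit it automatically. The names `start_span`, `child_span`, and the `spans` list are hypothetical, purely for illustration.

```python
import contextvars
import functools

# Hypothetical sketch: the root decorator stores shared context in a
# ContextVar; child decorators read it back when their span is created.
_root_ctx: contextvars.ContextVar[dict] = contextvars.ContextVar("root_ctx", default={})

def start_span(name, **shared):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            token = _root_ctx.set({"root": name, **shared})
            try:
                return fn(*args, **kwargs)
            finally:
                _root_ctx.reset(token)
        return inner
    return wrap

def child_span(name):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            inherited = _root_ctx.get()  # the child sees the root's context
            spans.append({"name": name, **inherited})
            return fn(*args, **kwargs)
        return inner
    return wrap

spans = []

@start_span("process_request", user_id="user_123")
def process():
    sub_task()
    return "done"

@child_span("sub_task")
def sub_task():
    pass

process()
print(spans)  # the child span carries the root's shared context
```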

Nested spans with observe

Use observe to create child spans that describe meaningful units of work:
  • LLM generations
  • Retrieval / RAG
  • Tool and function execution
  • Generic business logic
from basalt.observability import observe, ObserveKind

@observe(name="Generate Answer", kind=ObserveKind.GENERATION)
def generate_answer(prompt: str) -> str:
    # Call your LLM here
    ...

@observe(name="Search Documents", kind=ObserveKind.RETRIEVAL)
def search_documents(query: str):
    # Call your vector DB or search backend
    ...
Kinds (ObserveKind.GENERATION, RETRIEVAL, TOOL, etc.) make traces easier to explore and filter in the Basalt UI.

Enriching spans

You can attach additional information to the currently active span using static helpers:
  • observe.set_identity(...) – set or update user/org identity
  • observe.metadata(...) / observe.update_metadata(...) – add metadata
  • observe.set_input(...) / observe.set_output(...) – capture inputs/outputs
from basalt.observability import observe

observe.set_identity({
    "user": {"id": "user-123"},
    "organization": {"id": "org-456"},
})

observe.metadata(environment="production", feature_flag="new-search")
observe.set_input({"query": "reset my password"})
observe.set_output({"status": "ok"})
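These helpers always target whichever span is active in the current execution context. As an illustration only (not the SDK's implementation), a context-local "current span" can be modeled with a `ContextVar`; the `set_metadata` and `set_output` names below are hypothetical stand-ins for the helpers above.

```python
import contextvars

# Hypothetical sketch: enrichment helpers write to whichever span is
# "current" in the execution context.
_current_span: contextvars.ContextVar[dict] = contextvars.ContextVar("current_span")

def set_metadata(**kv):
    _current_span.get().setdefault("metadata", {}).update(kv)

def set_output(value):
    _current_span.get()["output"] = value

span = {"name": "handle_request"}
token = _current_span.set(span)
try:
    set_metadata(environment="production", feature_flag="new-search")
    set_output({"status": "ok"})
finally:
    _current_span.reset(token)

print(span)
```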

Async monitoring

The same decorators work for async functions. For advanced use, explicit async variants async_start_observe and async_observe are also available.
from basalt.observability import start_observe, observe

@start_observe(feature_slug="async-api", name="Handle Async Request")
async def handle_request_async(data):
    return await process_async(data)

@observe(name="Process Async")
async def process_async(data):
    ...
Basalt automatically propagates trace context across async boundaries, so all spans end up in the same trace.
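The mechanism behind this is Python's `contextvars`, which OpenTelemetry context propagation is built on: values set in a coroutine are visible in everything it awaits. A minimal demonstration (the `_trace_id` variable here is illustrative, not a Basalt API):

```python
import asyncio
import contextvars

# Context variables survive awaits, so nested async work sees the same
# trace context as the root coroutine that set it.
_trace_id: contextvars.ContextVar[str] = contextvars.ContextVar("trace_id")

seen = []

async def process_async(data):
    seen.append(_trace_id.get())  # same trace id as the root coroutine
    return data.upper()

async def handle_request_async(data):
    _trace_id.set("trace-001")
    return await process_async(data)

result = asyncio.run(handle_request_async("hello"))
print(result, seen)
```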

Client Initialization

Basic initialization

The simplest way to initialize Basalt:
from basalt import Basalt
import os

basalt = Basalt(api_key=os.environ["BASALT_API_KEY"])

With observability metadata

Attach global metadata that will be added to all traces:
from basalt import Basalt

basalt = Basalt(
    api_key="your-api-key",
    observability_metadata={
        "service.name": "my-app",
        "service.version": "1.2.3",
        "deployment.environment": "production",
        "deployment.region": "us-west-2",
    },
)

With telemetry configuration

For advanced configuration of OpenTelemetry behavior and auto-instrumentation:
from basalt import Basalt, TelemetryConfig

telemetry = TelemetryConfig(
    service_name="my-service",
    environment="production",
    enable_instrumentation=True,
    trace_content=True,
    enabled_providers=["openai", "anthropic"],  # Only enable specific providers
)

basalt = Basalt(
    api_key="your-api-key",
    telemetry_config=telemetry,
)

With selective auto-instrumentation

Enable or disable auto-instrumentation providers:
from basalt import Basalt

basalt = Basalt(
    api_key="your-api-key",
    enabled_instruments=["openai", "anthropic", "chromadb"],
)

# Or disable specific providers
basalt = Basalt(
    api_key="your-api-key",
    disabled_instruments=["langchain"],  # Don't instrument LangChain
)
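The resolution logic can be thought of as set arithmetic: an allow-list (if given) narrows the supported set, and the deny-list is always subtracted. This is an illustrative sketch under those assumptions, not the SDK's actual resolver; `SUPPORTED` and `resolve_providers` are hypothetical names.

```python
# Hypothetical sketch: resolve which providers get instrumented.
# Assumes an allow-list wins when provided, and disabled entries
# are always skipped.
SUPPORTED = {"openai", "anthropic", "chromadb", "langchain", "pinecone"}

def resolve_providers(enabled=None, disabled=None):
    selected = set(enabled) if enabled is not None else set(SUPPORTED)
    return sorted((selected & SUPPORTED) - set(disabled or []))

print(resolve_providers(enabled=["openai", "anthropic", "chromadb"]))
print(resolve_providers(disabled=["langchain"]))
```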

Shutdown

Always call shutdown() before your application exits to flush pending telemetry:
basalt.shutdown()
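A common pattern is to register the shutdown with `atexit`, or wrap your work in `try`/`finally`, so the flush runs even on early exit paths. The `FakeClient` below is a stand-in for illustration; with the real client you would register `basalt.shutdown` the same way.

```python
import atexit

# Stand-in client for illustration: shutdown() models flushing
# pending telemetry before the process exits.
class FakeClient:
    def __init__(self):
        self.flushed = False
    def shutdown(self):
        self.flushed = True

client = FakeClient()
atexit.register(client.shutdown)  # runs at interpreter exit

# Or, for a request/job scope, guarantee the flush explicitly:
try:
    pass  # ... do work ...
finally:
    client.shutdown()

print(client.flushed)
```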

TelemetryConfig Reference

from basalt import TelemetryConfig

config = TelemetryConfig(
    enabled=True,                              # Enable/disable telemetry entirely
    service_name="my-service",                 # OpenTelemetry service name
    service_version="1.2.3",                   # Service version
    environment="production",                  # deployment.environment
    enable_instrumentation=True,               # Enable auto-instrumentation
    trace_content=True,                        # Include trace content in exports
    enabled_providers=["openai", "anthropic"], # Specific providers to instrument
    disabled_providers=[],                     # Providers to skip
    exporter=None,                             # Custom SpanExporter (advanced)
    extra_resource_attributes={                # Additional OTEL resource attributes
        "custom_key": "custom_value"
    },
    sample_rate=0.0,                           # Evaluator sampling rate (0.0-1.0)
)
Supported providers for enabled_providers/disabled_providers:
  • LLMs: openai, anthropic, google_generativeai, bedrock, vertexai, ollama, mistralai, together, replicate
  • Vector DBs: chromadb, pinecone, qdrant
  • Frameworks: langchain, llamaindex, haystack
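The `sample_rate` value is a fraction in [0.0, 1.0]; a reasonable mental model (illustrative only, not the SDK's sampler) is a per-trace coin flip, where 0.0 evaluates nothing and 1.0 evaluates every trace. `should_evaluate` is a hypothetical name.

```python
import random

# Hypothetical sketch: apply a sample rate as a per-trace coin flip.
def should_evaluate(sample_rate: float) -> bool:
    return random.random() < sample_rate

print(should_evaluate(0.0))  # always False: never evaluates
print(should_evaluate(1.0))  # always True: evaluates every trace
hits = sum(should_evaluate(0.1) for _ in range(10_000))
print(hits)  # roughly 1,000 of 10,000 traces
```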

Environment Variables

You can also configure Basalt using environment variables:
  • BASALT_API_KEY – API authentication key (e.g. sk-...)
  • BASALT_TELEMETRY_ENABLED – enable or disable telemetry (true or false)
  • BASALT_SERVICE_NAME – service name for traces (e.g. my-app)
  • BASALT_ENVIRONMENT – deployment environment (e.g. production)
  • BASALT_LOG_LEVEL – log level for Basalt loggers (DEBUG, INFO, WARNING)
  • BASALT_ENABLED_INSTRUMENTS – comma-separated list of providers to enable (e.g. openai,anthropic)
  • BASALT_DISABLED_INSTRUMENTS – comma-separated list of providers to disable (e.g. langchain)
  • BASALT_OTEL_EXPORTER_OTLP_ENDPOINT – custom OTLP endpoint (e.g. http://localhost:4317)
  • BASALT_SAMPLE_RATE – global evaluator sample rate (e.g. 0.1)
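As a sketch of how such variables are conventionally interpreted (booleans from "true"/"false", lists from comma-separated values), assuming this matches the SDK's parsing:

```python
import os

# Illustrative parsing of the environment variables above.
os.environ["BASALT_TELEMETRY_ENABLED"] = "true"
os.environ["BASALT_ENABLED_INSTRUMENTS"] = "openai,anthropic"

telemetry_enabled = os.environ.get("BASALT_TELEMETRY_ENABLED", "true").lower() == "true"
enabled = [
    p.strip()
    for p in os.environ.get("BASALT_ENABLED_INSTRUMENTS", "").split(",")
    if p.strip()
]

print(telemetry_enabled, enabled)
```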
Use the Concepts, Patterns, and Workflows pages for deeper guidance, and the API Reference for the full Python surface area.