Quickstart

Start analyzing with the DBNL platform immediately

Explore the Product with a Read Only SaaS Account

You can start clicking around the product right away in a pre-provisioned Read Only SaaS account. This organization contains pre-populated Projects that update daily.

Go to app.dbnl.com

Deploy a Local Sandbox with Example Data

This guide walks you through using the DBNL Sandbox and SDK Log Ingestion with the Python SDK to create your first project, submit log data to it, and start analyzing. See a 3 min walkthrough in our overview video.

1

Get and install the latest DBNL SDK and Sandbox.

If you already have the SDK installed and a DBNL deployment running and accessible, just log in and skip to step 2.

pip install --upgrade dbnl
dbnl sandbox start
dbnl sandbox logs # See spinup progress

Log into the sandbox at http://localhost:8080 using:

  • Username: admin

  • Password: password
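Spinup can take a little while. If the login page isn't loading, you can confirm that something is answering on the port before retrying; a minimal sketch (this only checks HTTP reachability, not sandbox health, and assumes the default port 8080):

```python
import urllib.error
import urllib.request


def sandbox_reachable(url: str = "http://localhost:8080", timeout: float = 3.0) -> bool:
    """Return True if an HTTP server answers at `url` with any status code."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server answered, just with a non-2xx status (e.g. a redirect to login).
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: the sandbox is still spinning up (or the port is wrong).
        return False


print(sandbox_reachable())
```

If this returns False, check `dbnl sandbox logs` for spinup progress.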

For more deployment options, including more detailed instructions for deploying and administering the Sandbox, see Deployment.

2

Create a Model Connection

Every DBNL Project requires a Model Connection to create LLM-as-judge metrics and perform analysis.

  1. Click on the "Model Connections" tab on the left panel of http://localhost:8080

  2. Click "+ Add Model Connection"

  3. Create a Model Connection with the name: quickstart_model

3

Create a project and upload example data using the SDK

This example uses real LLM conversation logs from an "Outing Agent" application. The data is publicly available in S3.

import dbnl
import pandas as pd
import pyarrow.parquet as pq
from datetime import UTC, datetime, timedelta

# Make sure your version matches these docs
print("dbnl version:", dbnl.__version__)

# Login to DBNL
dbnl.login(
    api_url="http://localhost:8080/api",
    api_token="", # found at http://localhost:8080/tokens
)

# Create a new project
project = dbnl.get_or_create_project(
    name="Quickstart Demo",
    schedule="daily",  # How often DBNL analyzes new data
    default_llm_model_name="quickstart_model" # From step (2) above
)

# Load 14 days of real LLM conversation logs from public S3 bucket
base_path = "s3://dbnl-demo-public/outing_agent_log_data"
s3_files = [f"{base_path}/day_{i:02d}.parquet" for i in range(1, 15)]
day_dfs = [pq.read_table(f).to_pandas(types_mapper=pd.ArrowDtype, ignore_metadata=True) for f in s3_files]

# Adjust timestamps to current time so data appears recent
delta = datetime.now(tz=UTC) - day_dfs[-1]["timestamp"].max()
delta = timedelta(days=round(delta / timedelta(days=1)))
for df in day_dfs:
    df["timestamp"] = df["timestamp"] + delta

# Upload the data; DBNL needs at least 7 days of data to establish behavioral baselines
print("Uploading data...")
print(f"See status at: http://localhost:8080/ns/{project.namespace_id}/projects/{project.id}/status")
for idx, day_df in enumerate(day_dfs):
    print(f"{idx + 1} / {len(day_dfs)} publishing log data")
    data_start_t = min(day_df['timestamp']).replace(hour=0, minute=0, second=0, microsecond=0)
    data_end_t = data_start_t + timedelta(days=1)
    try:
        dbnl.log(
            project_id=project.id,
            data_start_time=data_start_t,
            data_end_time=data_end_t,
            data=day_df,
        )
    except Exception as e:
        if "Data already exists" in str(e):
            continue
        raise
    
print("You can now explore your data in DBNL!")
print(f"http://localhost:8080/ns/{project.namespace_id}/projects/{project.id}")

After uploading, the data pipeline will run automatically. Depending on the latency of your Model Connection, it may take several minutes to complete all steps (Ingest → Enrich → Analyze → Publish). Check the Status page to monitor progress.

4

Discover, investigate, and track behavioral signals

After the data processing completes (check the Status page):

  1. Go back to the DBNL project at http://localhost:8080

  2. Discover your first behavioral signals by clicking on "Insights"

  3. Investigate these insights by clicking on the "Explorer" or "Logs" button

  4. Track interesting patterns by clicking "Add Segment to Dashboard"


No Insights appearing? The system needs at least 7 days of data to establish behavioral baselines. If you just uploaded data, check the Status page to ensure all pipeline steps (Ingest → Enrich → Analyze → Publish) completed successfully.

Next Steps
