Quickstart

Start analyzing with the DBNL platform immediately

This guide walks you through using the DBNL Sandbox and SDK Log Ingestion via the Python SDK to create your first project, submit log data to it, and start analyzing. You can also watch this Quickstart as a 3-minute video here [link].

1

Install the latest DBNL SDK and start the Sandbox.

If you already have the SDK installed and a DBNL deployment running and accessible, just log in and skip to step 2.

pip install --upgrade dbnl
dbnl sandbox start

Log in to the sandbox at http://localhost:8080 with the following credentials:

  • Username: admin

  • Password: password

For more deployment options, including more detailed instructions for deploying and administering the Sandbox, see Deployment.

2

Create a Project

  1. Click the "+ New Project" button on the Namespace landing page

  2. Give the project a name like DBNL quickstart project

  3. Add or create a default Model Connection for the Project.

  4. Choose SDK Ingestion as the Data Connection, then create and save an API Token for later (you will paste it into the example in step 3).

Check out the Setup docs [link] for more details about Projects [link], LLM Connections [link], Data Connections [link], and more.
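
If you would rather not paste the API Token directly into the example script in step 3, one option (a minimal sketch, not something DBNL requires) is to export it as an environment variable and read it at runtime; the DBNL_API_TOKEN name below simply matches the variable used in the example code.

import os

# Set the token in your shell first, e.g.:  export DBNL_API_TOKEN="<your token>"
# Then read it here instead of hard-coding it in the script.
DBNL_API_TOKEN = os.environ.get("DBNL_API_TOKEN", "")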

3

Upload example data

You can replace the data below with your real production AI logs; just make sure they are formatted according to the DBNL Semantic Convention [link]. For more information, see SDK Ingestion [link].

DBNL_API_URL = "http://localhost:8080/api"  # the Sandbox API endpoint
DBNL_API_TOKEN = ""  # paste the API Token you saved in step 2

import random
from datetime import UTC, datetime, timedelta

import dbnl
import pandas as pd

# Login to dbnl.
dbnl.login(api_url=DBNL_API_URL, api_token=DBNL_API_TOKEN)
# Use current time as reference point.
now = datetime.now(tz=UTC)
# Get or create a new project.
project = dbnl.get_or_create_project(
    name=f"quickstart-{now.isoformat()}",
    schedule="daily",
)
# Backfill the first 9 days of data (pd.date_range includes both endpoints).
now_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = now_date - timedelta(days=17)
end_date = now_date - timedelta(days=9)
for dt in pd.date_range(start_date, end_date):
    dbnl.report_run_with_results(
        project=project,
        data_start_time=dt,
        data_end_time=dt + timedelta(days=1),
        column_data=pd.DataFrame([
            {
                "timestamp": dt + timedelta(minutes=30 * i),
                "input": f"Is {i} an even or odd number?",
                "output": random.choice(["even", "odd"]),
            }
            for i in range(20)
        ]).astype({
            "timestamp": "datetime64[us, UTC]",
            "input": "string",
            "output": "category",
        }),
    )
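
If you want to substitute your own production logs for the generated data above, a minimal sketch (assuming a hypothetical CSV named my_ai_logs.csv with source columns ts, question, and answer) is to map and cast your columns onto the timestamp, input, and output columns used in this example before reporting the run:

# A sketch only: the file name and source column names are hypothetical.
logs = pd.read_csv("my_ai_logs.csv")
column_data = pd.DataFrame({
    "timestamp": pd.to_datetime(logs["ts"], utc=True).astype("datetime64[us, UTC]"),
    "input": logs["question"].astype("string"),
    "output": logs["answer"].astype("string"),
})
dbnl.report_run_with_results(
    project=project,
    # Using the observed min/max timestamps as the window here; pass your
    # actual reporting window boundaries if you have them.
    data_start_time=column_data["timestamp"].min().to_pydatetime(),
    data_end_time=column_data["timestamp"].max().to_pydatetime(),
    column_data=column_data,
)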

You should see data rolling into the platform. Depending on the latency of your LLM Connection [link], it may take several minutes to ingest, enrich, analyze, and report your first insights. You can check the status of these jobs by clicking the "Status" button in the lower left of the project panel.

4

Discover, investigate, and track behavioral signals

  1. Go back to the DBNL project at http://localhost:8080

  2. Discover your first behavioral signals by clicking on "Insights"

  3. Investigate these insights by clicking on the "Explorer" or "Logs" button

  4. Track segments by clicking "Add Segment to Dashboard"

5

Upload more data to surface new signals and watch tracked signals evolve over time

# Backfill another 8 days of data.
now_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = now_date - timedelta(days=8)
end_date = now_date - timedelta(days=1)
for dt in pd.date_range(start_date, end_date):
    dbnl.report_run_with_results(
        project=project,
        data_start_time=dt,
        data_end_time=dt + timedelta(days=1),
        column_data=pd.DataFrame([
            {
                "timestamp": dt + timedelta(minutes=30 * i),
                "input": f"Is {i} an even or odd number?",
                "output": random.choice(["even", "odd"]),
            }
            for i in range(20)
        ]).astype({
            "timestamp": "datetime64[us, UTC]",
            "input": "string",
            "output": "category",
        }),
    )
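
If you want the change in tracked signals to be more pronounced, one purely illustrative tweak (not part of the official quickstart data) is to bias the generated outputs toward "even" for the most recent days, for example by defining a helper like the one below and using biased_output(dt) in place of random.choice(["even", "odd"]) in the loop above:

# Illustrative only: skew recent days toward "even" so a distribution shift
# is easier to spot once the new runs are analyzed.
def biased_output(dt):
    if dt >= now_date - timedelta(days=3):
        return random.choices(["even", "odd"], weights=[0.9, 0.1])[0]
    return random.choice(["even", "odd"])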

Go back to the DBNL project at http://localhost:8080 and repeat step 4 to see how your insights and tracked segments have evolved.

Next Steps

  • Create a Project with your own data using OTEL Trace, SQL, or SDK ingestion. [link]

  • Learn more about the Adaptive Analytics Workflow [link]

  • Deploy the full DBNL platform [link]
