Quickstart
Start analyzing with the DBNL platform immediately
Determine how you’d like to explore DBNL
We’ve made it easy to get started exploring DBNL in a variety of ways:
Hosted Demo Account. Start here to explore the DBNL product with pre-populated data in a hosted environment. You won't have to deploy anything, but you also won't see how data is ingested into the product.
Local Sandbox with Example Data. Start here to install the DBNL SDK and Sandbox locally to create your first project, submit log data to it, and start analyzing. This is a good fit for technical users who want to roll up their sleeves but don't have project data to work with.
Advanced Data Collection Examples. After completing the Sandbox demo, you can explore how to instrument an agentic system and augment and upload the collected data via this example in our GitHub.
POC Environment with Your Data. If you would like to start building a POC project using your own data via SQL Integration Ingestion or OTEL Trace Ingestion, start with the full Project Setup docs. Getting started will take longer, but you'll cover more of the fundamentals and build a more robust foundation for future development.
Explore the Product with a Read Only SaaS Account
You can start clicking around the product right away in a pre-provisioned Read Only SaaS account. This organization has pre-populated Projects from our Examples Repo that update daily, so there is always fresh data to explore.
Go to app.dbnl.com
Username: [email protected]
Password: dbnldemo1!
Deploy a Local Sandbox with Example Data
This guide walks you through using the DBNL Sandbox and the Python SDK's log ingestion to create your first project, submit log data to it, and start analyzing. See a 3 min walkthrough in our overview video.
For more detailed walkthroughs see the Tutorials.
Get and install the latest DBNL SDK and Sandbox.
The Sandbox runs inside a Docker container and spins up a k3d cluster within it. For more information and full requirements check out the Sandbox Deployment docs.
pip install --upgrade dbnl
dbnl sandbox start
dbnl sandbox logs  # See spinup progress
Log into the sandbox at http://localhost:8080 using:
Username: admin
Password: password
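If you want to sanity-check that the Sandbox is up before logging in, a short Python snippet works as well. This is a minimal sketch that only assumes the UI is served at http://localhost:8080 as above; it is not part of the official setup.
import urllib.request

# Probe the Sandbox UI; URLError (an OSError subclass) means it isn't up yet.
try:
    with urllib.request.urlopen("http://localhost:8080", timeout=5) as resp:
        print("Sandbox is up, HTTP status:", resp.status)
except OSError as err:
    print("Sandbox not reachable yet, check `dbnl sandbox logs`:", err)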
Create a Model Connection
Every DBNL Project requires a Model Connection to create LLM-as-judge metrics and perform analysis.
Click on the "Model Connections" tab on the left panel of http://localhost:8080
Click "+ Add Model Connection"
Create a Model Connection with the name quickstart_model. After selecting a provider, you will be prompted to enter an API key and model name; this model will be used for Metric generation and Insight generation as part of the Data Pipeline. We suggest cutting a new key with a budget and using a mid-weight model like GPT-OSS-20B.
Create a project and upload example data using the SDK
This example uses real LLM conversation logs from an "Outing Agent" application. The data is publicly available in S3.
You can grab the code from the Quickstart Example in the dbnlAI/examples GitHub repository.
import dbnl
import pandas as pd
import pyarrow.parquet as pq
from datetime import UTC, datetime, timedelta
# Make sure your version matches these docs
print("dbnl version:", dbnl.__version__)
# Login to DBNL
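# If you'd rather not paste the token inline, you could read it from an
# environment variable instead (DBNL_API_TOKEN here is a hypothetical name,
# not necessarily one the SDK reads itself):
#   import os; api_token = os.environ["DBNL_API_TOKEN"]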
dbnl.login(
    api_url="http://localhost:8080/api",
    api_token="",  # found at http://localhost:8080/tokens
)
# Create a new project
project = dbnl.get_or_create_project(
    name="Quickstart Demo",
    schedule="daily",  # How often DBNL analyzes new data
    default_llm_model_name="quickstart_model",  # From step (2) above
)
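# Re-running this script is safe at this point: as the name suggests,
# get_or_create_project returns the existing project on subsequent runs
# rather than creating a duplicate.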
# Load 14 days of real LLM conversation logs from public S3 bucket
base_path = "s3://dbnl-demo-public/outing_agent_log_data"
s3_files = [f"{base_path}/day_{i:02d}.parquet" for i in range(1, 15)]
day_dfs = [pq.read_table(f).to_pandas(types_mapper=pd.ArrowDtype, ignore_metadata=True) for f in s3_files]
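# (Optional) Uncomment to inspect the example logs before uploading; the
# exact columns depend on the published demo files, so treat the output
# as illustrative.
# print(day_dfs[0].dtypes)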
# Adjust timestamps to current time so data appears recent
delta = datetime.now(tz=UTC) - day_dfs[-1]["timestamp"].max()
delta = timedelta(days=round(delta / timedelta(days=1)))
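# Rounding the shift to whole days keeps each log's original time-of-day
# intact while moving the most recent day to roughly today.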
for df in day_dfs:
    df["timestamp"] = df["timestamp"] + delta
# Upload the data; DBNL needs at least 7 days to establish behavioral baselines
print("Uploading data...")
print(f"See status at: http://localhost:8080/ns/{project.namespace_id}/projects/{project.id}/status")
for idx, day_df in enumerate(day_dfs):
    print(f"{idx + 1} / {len(day_dfs)} publishing log data")
    data_start_t = min(day_df["timestamp"]).replace(hour=0, minute=0, second=0, microsecond=0)
    data_end_t = data_start_t + timedelta(days=1)
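    # dbnl.log publishes one day's window; if that window was already
    # published (e.g. on a re-run), skip it and continue with the rest.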
    try:
        dbnl.log(
            project_id=project.id,
            data_start_time=data_start_t,
            data_end_time=data_end_t,
            data=day_df,
        )
    except Exception as e:
        if "Data already exists" in str(e):
            continue
        raise
print("You can now explore your data in DBNL!")
print(f"http://localhost:8080/ns/{project.namespace_id}/projects/{project.id}")Discover, investigate, and track behavioral signals
See a 3 min walkthrough in our overview video.
After the data processing completes (check the Status page):
Go back to the DBNL project at http://localhost:8080
Discover your first behavioral signals by clicking on "Insights"
Investigate these insights by clicking on the "Explorer" or "Logs" button
Track interesting patterns by clicking "Add Segment to Dashboard"
Next Steps
Create a Project with your own data using OTEL Trace, SQL, or SDK ingestion with the Data Connections guides.
Learn more about the Adaptive Analytics Workflow.
Deploy the full DBNL platform with the Deployment options.
Need help? Contact [email protected] or visit distributional.com/contact