# Quickstart

### Explore the Product with a Read Only SaaS Account

You can start clicking around the product right away in a pre-provisioned Read Only SaaS account. This organization contains pre-populated Projects that update daily, so there is plenty to explore.

Go to [app.dbnl.com](https://app.dbnl.com)

* Username: `demo-user@distributional.com`
* Password: `dbnldemo1!`

### Deploy a Local Sandbox with Example Data

This guide walks you through using the DBNL [Sandbox](https://docs.dbnl.com/v0.27.x/platform/deployment/sandbox) and [SDK Log Ingestion](https://docs.dbnl.com/v0.27.x/configuration/data-connections/sdk-log-ingestion) with the [Python SDK](https://github.com/dbnlAI/docs/blob/main/reference/python-sdk.md) to create your first project, submit log data to it, and start analyzing. See a 3-minute walkthrough in our [overview video](https://www.youtube.com/watch?v=DfL-FcB5W6Q).

{% hint style="warning" %}
If you would like to create your first project using [SQL Integration Ingestion](https://docs.dbnl.com/v0.27.x/configuration/data-connections/sql-integration-ingestion) or [OTEL Trace Ingestion](https://docs.dbnl.com/v0.27.x/configuration/data-connections/otel-trace-ingestion) instead of [SDK Log Ingestion](https://docs.dbnl.com/v0.27.x/configuration/data-connections/sdk-log-ingestion), skip over to the full [Project Setup](https://docs.dbnl.com/v0.27.x/workflow/projects#creating-a-project) docs. The approach and example data below are the fastest way to explore the product locally from scratch.
{% endhint %}

{% stepper %}
{% step %}
**Get and install the latest DBNL SDK and Sandbox.**

{% hint style="info" %}
If you already have the SDK installed and a DBNL deployment running and accessible, just log in and skip to step 2.
{% endhint %}

```bash
pip install --upgrade dbnl
dbnl sandbox start
dbnl sandbox logs # See spinup progress
```

Log into the sandbox at <http://localhost:8080> using

* Username: `admin`
* Password: `password`

{% hint style="info" %}
For more deployment options see [Deployment](https://docs.dbnl.com/v0.27.x/platform/deployment), including more detailed instructions for deploying and administering the [Sandbox](https://docs.dbnl.com/v0.27.x/platform/deployment/sandbox).
{% endhint %}
{% endstep %}

{% step %}
**Create a Model Connection**

Every DBNL Project requires a [Model Connection](https://docs.dbnl.com/v0.27.x/configuration/model-connections) to create LLM-as-judge metrics and perform analysis.

1. Click on the "Model Connections" tab on the left panel of <http://localhost:8080>
2. Click "+ Add Model Connection"
3. [Create a Model Connection](https://docs.dbnl.com/v0.27.x/configuration/model-connections#creating-a-model-connection) with the name: `quickstart_model`

{% endstep %}

{% step %}
**Create a project and upload example data using the SDK**

This example uses real LLM conversation logs from an "Outing Agent" application. The data is publicly available in S3.

{% file src="https://content.gitbook.com/content/8N8zzLtIch6ZiTSwWtXD/blobs/15uLydkadeeNOHv1zwAX/DBNL-quickstart.ipynb" %}

```python
import dbnl
import pandas as pd
import pyarrow.parquet as pq
from datetime import UTC, datetime, timedelta

# Make sure your version matches these docs
print("dbnl version:", dbnl.__version__)

# Log into DBNL
dbnl.login(
    api_url="http://localhost:8080/api",
    api_token="",  # found at http://localhost:8080/tokens
)

# Create a new project
project = dbnl.get_or_create_project(
    name="Quickstart Demo",
    schedule="daily",  # How often DBNL analyzes new data
    default_llm_model_name="quickstart_model",  # From step (2) above
)

# Load 14 days of real LLM conversation logs from a public S3 bucket
base_path = "s3://dbnl-demo-public/outing_agent_log_data"
s3_files = [f"{base_path}/day_{i:02d}.parquet" for i in range(1, 15)]
day_dfs = [
    pq.read_table(f).to_pandas(types_mapper=pd.ArrowDtype, ignore_metadata=True)
    for f in s3_files
]

# Adjust timestamps to the current time so the data appears recent
delta = datetime.now(tz=UTC) - day_dfs[-1]["timestamp"].max()
delta = timedelta(days=round(delta / timedelta(days=1)))
for df in day_dfs:
    df["timestamp"] = df["timestamp"] + delta

# Upload the data; DBNL needs at least 7 days of data to establish behavioral baselines
print("Uploading data...")
print(f"See status at: http://localhost:8080/ns/{project.namespace_id}/projects/{project.id}/status")
for idx, day_df in enumerate(day_dfs):
    print(f"{idx + 1} / {len(day_dfs)} publishing log data")
    data_start_t = min(day_df["timestamp"]).replace(hour=0, minute=0, second=0, microsecond=0)
    data_end_t = data_start_t + timedelta(days=1)
    try:
        dbnl.log(
            project_id=project.id,
            data_start_time=data_start_t,
            data_end_time=data_end_t,
            data=day_df,
        )
    except Exception as e:
        if "Data already exists" in str(e):
            continue
        raise

print("You can now explore your data in DBNL!")
print(f"http://localhost:8080/ns/{project.namespace_id}/projects/{project.id}")
```

{% hint style="info" %}
After uploading, the data pipeline will run automatically. Depending on the latency of your [Model Connection](https://docs.dbnl.com/v0.27.x/configuration/model-connections), it may take several minutes to complete all steps (Ingest → Enrich → Analyze → Publish). Check the Status page to monitor progress.
{% endhint %}
{% endstep %}

{% step %}
**Discover, investigate, and track behavioral signals**

After the data processing completes (check the Status page):

1. Go back to the DBNL project at <http://localhost:8080>
2. Discover your first behavioral signals by clicking on "Insights"
3. Investigate these insights by clicking on the "Explorer" or "Logs" button
4. Track interesting patterns by clicking "Add Segment to Dashboard"

{% hint style="info" %}
**No Insights appearing?** The system needs at least 7 days of data to establish behavioral baselines. If you just uploaded data, check the Status page to ensure all pipeline steps (Ingest → Enrich → Analyze → Publish) completed successfully.
{% endhint %}
{% endstep %}
{% endstepper %}

## Next Steps

* Create a Project with your own data using OTEL Trace, SQL, or SDK ingestion with the [Data Connections](https://docs.dbnl.com/v0.27.x/configuration/data-connections) guides.
* Learn more about the [Adaptive Analytics Workflow](https://docs.dbnl.com/v0.27.x/workflow/adaptive-analytics-workflow).
* Deploy the full DBNL platform with the [Deployment](https://docs.dbnl.com/v0.27.x/platform/deployment) options.
* Need help? Contact <support@distributional.com> or visit [distributional.com/contact](https://distributional.com/contact)
