SDK Functions

convert_otlp_traces_data

dbnl.convert_otlp_traces_data(data: pd.Series[Any],
	format: Literal['otlp_json', 'otlp_proto'] | None = None
) → pd.Series[Any]

Converts a Series of OTLP TracesData to DBNL spans.

  • Parameters:

    • **data** – Series of OTLP TracesData

    • format – OTLP TracesData format (otlp_json or otlp_proto) or None to infer from data

  • Returns: Series of spans data
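
Example (a minimal sketch; the OTLP JSON payload below is illustrative):

import dbnl
import pandas as pd

# Series of OTLP TracesData serialized as JSON strings (illustrative payload)
otlp_data = pd.Series(['{"resourceSpans": []}'])

# format=None lets the SDK infer the encoding; pass "otlp_json" to be explicit
spans = dbnl.convert_otlp_traces_data(data=otlp_data, format="otlp_json")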

create_llm_model

dbnl.create_llm_model(*,
	name: str,
	description: str | None = None,
	type: Literal['completion', 'embedding'] | None = 'completion',
	provider: str,
	model: str,
	params: dict[str, Any] = {}
) → LLMModel

Create an LLM Model.

  • Parameters:

    • **name** – Model name

    • description – Model description, defaults to None

    • type – Model type (e.g. completion or embedding), defaults to “completion”

    • provider – Model provider (e.g. openai, bedrock, etc.)

    • model – Model (e.g. gpt-4, gpt-3.5-turbo, etc.)

    • params – Model provider parameters (e.g. api key), defaults to {}

  • Returns: LLM Model
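
Example (a minimal sketch; the name, provider, model, and params values are illustrative):

import dbnl

dbnl.login()

llm_model = dbnl.create_llm_model(
    name="support-bot-llm",
    description="Completion model used by the support bot",
    type="completion",
    provider="openai",
    model="gpt-4",
    params={"api_key": "sk-..."},
)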

create_metric

dbnl.create_metric(*,
	project: Project,
	name: str,
	expression_template: str,
	description: str | None = None,
	greater_is_better: bool | None = None
) → Metric

Create a new Metric

  • Parameters:

    • **project** – The Project to create the Metric for

    • name – Name for the Metric

    • expression_template – Expression template string e.g. token_count({RUN}.question)

    • description – Optional description of what computation the metric is performing

    • greater_is_better – Flag indicating whether greater values are semantically ‘better’ than lesser values

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLInputValidationError – Input does not conform to expected format

  • Returns: Created Metric
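
Example (a minimal sketch using the expression template shown above; the project and metric names are illustrative):

import dbnl

dbnl.login()

proj = dbnl.get_or_create_project(name="test")

metric = dbnl.create_metric(
    project=proj,
    name="question_token_count",
    expression_template="token_count({RUN}.question)",
    description="Counts the tokens in the question column",
    greater_is_better=False,
)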

create_project

dbnl.create_project(*,
	name: str,
	description: str | None = None,
	schedule: Literal['daily', 'hourly'] | None = 'daily',
	default_llm_model_id: str | None = None,
	default_llm_model_name: str | None = None,
	template: Literal['default'] | None = 'default'
) → Project

Create a new Project

  • Parameters:

    • **name** – Name for the Project

    • description – Description for the Project, defaults to None. Description is limited to 255 characters.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLAPIValidationError – dbnl API failed to validate the request

    • DBNLConflictingProjectError – Project with the same name already exists

  • Returns: Project

Examples:

import dbnl

dbnl.login()

proj_1 = dbnl.create_project(name="test_p1")

# DBNLConflictingProjectError: A Project with name test_p1 already exists.
proj_2 = dbnl.create_project(name="test_p1")

delete_llm_model

dbnl.delete_llm_model(*,
	llm_model_id: str
) → None

Delete an LLM Model by id.
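
Example (a minimal sketch; assumes the LLMModel object exposes an id attribute and that the named model exists):

import dbnl

dbnl.login()

llm_model = dbnl.get_llm_model_by_name(name="support-bot-llm")
dbnl.delete_llm_model(llm_model_id=llm_model.id)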

delete_metric

dbnl.delete_metric(*,
	metric_id: str
) → None

Delete a Metric by ID

  • Parameters:

    • metric_id – ID of the metric to delete

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLAPIValidationError – dbnl API failed to validate the request

  • Returns: None
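
Example (a minimal sketch; assumes the Metric object exposes an id attribute):

import dbnl

dbnl.login()

proj = dbnl.get_or_create_project(name="test")
metric = dbnl.create_metric(
    project=proj,
    name="question_token_count",
    expression_template="token_count({RUN}.question)",
)
dbnl.delete_metric(metric_id=metric.id)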

get_llm_model

dbnl.get_llm_model(*,
	llm_model_id: str
) → LLMModel

Get an LLM Model by id.

  • Parameters:

    • llm_model_id – Model id

  • Returns: LLM Model if found
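
Example (a minimal sketch; the id value is illustrative):

import dbnl

dbnl.login()

llm_model = dbnl.get_llm_model(llm_model_id="llm_model_123")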

get_llm_model_by_name

dbnl.get_llm_model_by_name(*,
	name: str
) → LLMModel

Get an LLM Model by name
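
Example (a minimal sketch; the model name is illustrative):

import dbnl

dbnl.login()

llm_model = dbnl.get_llm_model_by_name(name="support-bot-llm")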

get_or_create_llm_model

dbnl.get_or_create_llm_model(*,
	name: str,
	description: str | None = None,
	type: Literal['completion', 'embedding'] | None = 'completion',
	provider: str,
	model: str,
	params: dict[str, Any] = {}
) → LLMModel

Get an LLM Model by name, or create it if it does not exist.

  • Parameters:

    • **name** – Model name

    • description – Model description, defaults to None

    • type – Model type (e.g. completion or embedding), defaults to “completion”

    • provider – Model provider (e.g. openai, bedrock, etc.)

    • model – Model (e.g. gpt-4, gpt-3.5-turbo, etc.)

    • params – Model provider parameters (e.g. api key), defaults to {}

  • Returns: Model
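
Example (a minimal sketch; the name, provider, and model values are illustrative, and a second call with the same name returns the existing model):

import dbnl

dbnl.login()

llm_model = dbnl.get_or_create_llm_model(
    name="support-bot-llm",
    provider="openai",
    model="gpt-4",
)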

get_or_create_project

dbnl.get_or_create_project(*,
	name: str,
	description: str | None = None,
	schedule: Literal['daily', 'hourly'] | None = 'daily',
	default_llm_model_id: str | None = None,
	default_llm_model_name: str | None = None,
	template: Literal['default'] | None = 'default'
) → Project

Get the Project with the specified name or create a new one if it does not exist

  • Parameters:

    • **name** – Name for the Project

    • description – Description for the Project, defaults to None

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLAPIValidationError – dbnl API failed to validate the request

  • Returns: Newly created or matching existing Project

Examples:

import dbnl

dbnl.login()

proj_1 = dbnl.create_project(name="test_p1")
proj_2 = dbnl.get_or_create_project(name="test_p1")

# Calling get_or_create_project will yield same Project object
assert proj_1.id == proj_2.id

get_project

dbnl.get_project(*,
	name: str
) → Project

Retrieve a Project by name.

  • Parameters:

    • name – The name for the existing Project.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLProjectNotFoundError – Project with the given name does not exist.

  • Returns: Project

Examples:

import dbnl

dbnl.login()

proj_1 = dbnl.create_project(name="test_p1")
proj_2 = dbnl.get_project(name="test_p1")

# Calling get_project will yield same Project object
assert proj_1.id == proj_2.id

# DBNLProjectNotFoundError: A dbnl Project with name not_exist does not exist
proj_3 = dbnl.get_project(name="not_exist")

log

dbnl.log(*,
	project_id: str,
	data: DataFrame,
	data_start_time: datetime,
	data_end_time: datetime,
	wait_timeout: float | None = 600
) → None

Log data for a date range to a project.

  • Parameters:

    • **project_id** – The id of the Project to send the logs to.

    • data – Pandas DataFrame with the data.

    • data_start_time – Data start time.

    • data_end_time – Data end time.

    • wait_timeout – If set, the function will block for up to wait_timeout seconds until the data is done processing, defaults to 10 minutes.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLInputValidationError – Input does not conform to expected format

Examples:

from datetime import datetime, UTC

import dbnl
import pandas as pd

dbnl.login()

proj = dbnl.get_or_create_project(name="test")
test_data = pd.DataFrame(
    {
        "timestamp": [datetime(2025, 1, 1, 11, 39, 53, tzinfo=UTC)],
        "input": ["Hello"],
        "output": ["World"],
    }
)

dbnl.log(
    project_id=proj.id,
    data=test_data,
    data_start_time=datetime(2025, 1, 1, tzinfo=UTC),
    data_end_time=datetime(2025, 1, 2, tzinfo=UTC),
)

login

dbnl.login(*,
	api_token: str | None = None,
	api_url: str | None = None,
	app_url: str | None = None,
	verify: bool = True
) → None

Setup dbnl SDK to make authenticated requests. After login is run successfully, the dbnl client will be able to issue secure and authenticated requests against hosted endpoints of the dbnl service.

  • Parameters:

    • **api_token** – dbnl API token for authentication; the token can be found on the /tokens page of the dbnl app. If None is provided, the environment variable DBNL_API_TOKEN will be used by default.

    • namespace_id – The namespace ID to use for the session.

    • api_url – The base url of the Distributional API. By default, this is set to localhost:8080/api for sandbox users; other users should contact their sys admin. If None is provided, the environment variable DBNL_API_URL will be used by default.

    • app_url – An optional base url of the Distributional app. If this variable is not set, the app url is inferred from the DBNL_API_URL variable. For on-prem users, please contact your sys admin if you cannot reach the Distributional UI.
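
Example (a minimal sketch; the token and URL values are illustrative, and both can instead be supplied via the DBNL_API_TOKEN and DBNL_API_URL environment variables):

import dbnl

dbnl.login(
    api_token="dbnl_api_token_...",
    api_url="https://api.dbnl.example.com/api",
)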

update_llm_model

dbnl.update_llm_model(*,
	llm_model_id: str,
	name: str | None = None,
	description: str | None = None,
	model: str | None = None,
	params: dict[str, Any] | None = None
) → LLMModel

Update an LLM Model by id.

  • Parameters:

    • **llm_model_id** – Model id

    • name – Model name

    • description – Model description, defaults to None

    • model – Model (e.g. gpt-4, gpt-3.5-turbo, etc.)

    • params – Model provider parameters (e.g. api key), defaults to None

  • Returns: Updated LLM Model
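
Example (a minimal sketch; the id, model, and params values are illustrative):

import dbnl

dbnl.login()

llm_model = dbnl.update_llm_model(
    llm_model_id="llm_model_123",
    model="gpt-3.5-turbo",
    params={"api_key": "sk-..."},
)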
