SDK Functions

convert_otlp_traces_data

dbnl.convert_otlp_traces_data(
	data: pd.Series[Any],
	format: Literal['otlp_json', 'otlp_proto'] | None = None
) → pd.Series[Any]

Converts a Series of OTLP TracesData to a Series of DBNL spans matching the DBNL semantic convention.

The resulting Series can be used as is to fill the spans column of a DataFrame to be logged with the dbnl.log function.

For a complete specification of the TracesData format, see the OTLP specification.

  • Parameters:

    • data – Series of OTLP TracesData

    • format – OTLP TracesData format (otlp_json or otlp_proto) or None to infer from data

  • Returns: Series of spans data
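
A minimal sketch of typical usage; the OTLP payload below is only a placeholder, and in practice the Series would hold real TracesData exported from your application:

import pandas as pd
import dbnl

# Series of OTLP TracesData payloads (placeholder value; real payloads come from an OTLP exporter).
otlp_traces = pd.Series(['{"resourceSpans": []}'])

# Convert to DBNL spans; pass format="otlp_json" explicitly or leave it as None to let the SDK infer it.
spans = dbnl.convert_otlp_traces_data(otlp_traces, format="otlp_json")

# The result can back the spans column of a DataFrame passed to dbnl.log.
df = pd.DataFrame({"spans": spans})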

create_llm_model

dbnl.create_llm_model(
	*,
	name: str,
	description: str | None = None,
	type: Literal['completion', 'embedding'] | None = 'completion',
	provider: str,
	model: str,
	params: dict[str, Any] = {}
) → LLMModel

Create an LLM Model.

  • Parameters:

    • name – Model name

    • description – Model description, defaults to None

    • type – Model type (e.g. completion or embedding), defaults to “completion”

    • provider – Model provider (e.g. openai, bedrock, etc.)

    • model – Model (e.g. gpt-4, gpt-3.5-turbo, etc.)

    • params – Model provider parameters (e.g. api key), defaults to {}

  • Returns: LLM Model
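
A minimal sketch of creating a completion model; the name, model, and API key values are placeholders, and the keys expected in params depend on your provider:

import dbnl

llm_model = dbnl.create_llm_model(
    name="support-bot-judge",       # illustrative name
    type="completion",
    provider="openai",              # e.g. openai, bedrock, etc.
    model="gpt-4",
    params={"api_key": "sk-..."},   # provider parameters (placeholder)
)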

create_metric

Create a new Metric

  • Parameters:

    • project – The Project to create the Metric for

    • name – Name for the Metric

    • expression_template – Expression template string e.g. token_count({RUN}.question)

    • description – Optional description of what computation the metric is performing

    • greater_is_better – Flag indicating whether greater values are semantically ‘better’ than lesser values

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLInputValidationError – Input does not conform to expected format

  • Returns: Created Metric
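
A sketch of typical usage, assuming a Project obtained from get_or_create_project; the metric name and expression are illustrative:

import dbnl

project = dbnl.get_or_create_project(name="my-rag-app")

metric = dbnl.create_metric(
    project=project,
    name="question_token_count",
    expression_template="token_count({RUN}.question)",
    description="Number of tokens in the question column",
    greater_is_better=False,
)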

create_project

Create a new Project

  • Parameters:

    • name – Name for the Project

    • description – Description for the Project, defaults to None. Description is limited to 255 characters.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLAPIValidationError – dbnl API failed to validate the request

    • DBNLConflictingProjectError – Project with the same name already exists

  • Returns: Project

Examples:
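
For instance, a minimal call (the project name and description are placeholders):

import dbnl

dbnl.login()
project = dbnl.create_project(name="my-rag-app", description="Production RAG traffic")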

delete_llm_model

Delete an LLM Model by id.
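
For instance (a sketch; the llm_model_id argument name is assumed here to mirror get_llm_model and update_llm_model, and the id value is a placeholder):

import dbnl

# Assumed keyword llm_model_id; id value is illustrative.
dbnl.delete_llm_model(llm_model_id="llm_model_xyz")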

delete_metric

Delete a Metric by ID

  • Parameters:

    • metric_id – ID of the metric to delete

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLAPIValidationError – dbnl API failed to validate the request

  • Returns: None
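
For instance (the metric id value is a placeholder):

import dbnl

dbnl.delete_metric(metric_id="metric_abc123")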

flatten_otlp_traces_data

Flattens a Series of OTLP TracesData to a DataFrame matching the DBNL semantic convention.

The resulting DataFrame can be used as is to be logged with the dbnl.log function and will include all minimally required columns (timestamp, input, output) as well as the spans column for further flattening server-side.

For a complete specification of the TracesData format, see the OTLP specification.

  • Parameters:

    • data – Series of OTLP TracesData

    • format – OTLP TracesData format (otlp_json or otlp_proto) or None to infer from data

  • Returns: DataFrame with columns timestamp, input, output, spans
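
A sketch of producing a loggable DataFrame from OTLP payloads; the payload below is a placeholder, as above:

import pandas as pd
import dbnl

otlp_traces = pd.Series(['{"resourceSpans": []}'])  # placeholder TracesData payloads

# The returned DataFrame carries timestamp, input, output, and spans columns suitable for dbnl.log.
df = dbnl.flatten_otlp_traces_data(otlp_traces, format="otlp_json")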

get_llm_model

Get an LLM Model by id.

  • Parameters:

    • llm_model_id – Model id

  • Returns: LLM Model if found
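
For instance (the id value is a placeholder):

import dbnl

llm_model = dbnl.get_llm_model(llm_model_id="llm_model_xyz")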

get_llm_model_by_name

Get an LLM Model by name
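
For instance (a sketch; the name keyword is assumed to match create_llm_model, and the value is a placeholder):

import dbnl

llm_model = dbnl.get_llm_model_by_name(name="support-bot-judge")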

get_or_create_llm_model

Get an LLM Model by name, or create it if it does not exist.

  • Parameters:

    • name – Model name

    • description – Model description, defaults to None

    • type – Model type (e.g. completion or embedding), defaults to “completion”

    • provider – Model provider (e.g. openai, bedrock, etc.)

    • model – Model (e.g. gpt-4, gpt-3.5-turbo, etc.)

    • params – Model provider parameters (e.g. api key), defaults to {}

  • Returns: Model
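
A sketch of typical usage; values are placeholders, and the keys expected in params depend on your provider:

import dbnl

llm_model = dbnl.get_or_create_llm_model(
    name="support-bot-judge",
    provider="openai",
    model="gpt-4",
    params={"api_key": "sk-..."},  # provider parameters (placeholder)
)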

get_or_create_project

Get the Project with the specified name or create a new one if it does not exist

  • Parameters:

    • name – Name for the Project

    • description – Description for the Project, defaults to None

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLAPIValidationError – dbnl API failed to validate the request

  • Returns: Newly created or matching existing Project

Examples:
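
For instance (the project name is a placeholder):

import dbnl

dbnl.login()
project = dbnl.get_or_create_project(name="my-rag-app")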

get_project

Retrieve a Project by id.

  • Parameters:

    • id – The id for the existing Project.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLProjectNotFoundError – Project with the given id does not exist.

  • Returns: Project

Examples:
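
For instance (the id value is a placeholder):

import dbnl

project = dbnl.get_project(id="proj_abc123")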

get_project_by_name

Retrieve a Project by name.

  • Parameters:

    • name – The name for the existing Project.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLProjectNameNotFoundError – Project with the given name does not exist.

  • Returns: Project

Examples:
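
For instance (the project name is a placeholder):

import dbnl

project = dbnl.get_project_by_name(name="my-rag-app")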

log

Log data for a date range to a project.

  • Parameters:

    • project – The Project id to send the logs to.

    • data – Pandas DataFrame with the data. See the DBNL Semantic Convention.

    • data_start_time – Data start time.

    • data_end_time – Data end time.

    • wait_timeout – If set, the function blocks for up to wait_timeout seconds until the data has finished processing; defaults to 10 minutes.

  • Raises:

    • DBNLNotLoggedInError – dbnl SDK is not logged in. See login.

    • DBNLInputValidationError – Input does not conform to expected format

Examples:
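
A sketch of typical usage; it assumes datetime values are accepted for the time bounds and, per the parameter description above, that project takes the Project id. All values, including the id, are placeholders, and the DataFrame columns follow the DBNL Semantic Convention:

from datetime import datetime, timezone

import pandas as pd
import dbnl

dbnl.login()

data = pd.DataFrame({
    # Minimal columns from the DBNL Semantic Convention; values are placeholders.
    "timestamp": [datetime(2024, 1, 1, 12, tzinfo=timezone.utc)],
    "input": ["What is the refund policy?"],
    "output": ["Refunds are available within 30 days of purchase."],
})

dbnl.log(
    project="proj_abc123",  # Project id (placeholder)
    data=data,
    data_start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
    data_end_time=datetime(2024, 1, 2, tzinfo=timezone.utc),
)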

login

Set up the dbnl SDK to make authenticated requests. After login runs successfully, the dbnl client can issue secure, authenticated requests against hosted endpoints of the dbnl service.

  • Parameters:

    • api_token – dbnl API token for authentication; the token can be found on the /tokens page of the dbnl app. If None is provided, the environment variable DBNL_API_TOKEN is used by default.

    • namespace_id – The namespace ID to use for the session.

    • api_url – The base URL of the Distributional API. By default, this is set to localhost:8080/api for sandbox users; other users should contact their sys admin. If None is provided, the environment variable DBNL_API_URL is used by default.

    • app_url – An optional base URL of the Distributional app. If this is not set, the app URL is inferred from the DBNL_API_URL variable. On-prem users who cannot reach the Distributional UI should contact their sys admin.
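
A sketch of a typical login call; the token and URL below are placeholders, and both can instead be supplied through the DBNL_API_TOKEN and DBNL_API_URL environment variables:

import dbnl

dbnl.login(
    api_token="dbnl_...",                      # or set DBNL_API_TOKEN (placeholder token)
    api_url="https://dbnl.example.com/api",    # or set DBNL_API_URL (placeholder URL)
)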

update_llm_model

Update an LLM Model by id.

  • Parameters:

    • llm_model_id – Model id

    • name – Model name

    • description – Model description, defaults to None

    • model – Model (e.g. gpt-4, gpt-3.5-turbo, etc.)

    • params – Model provider parameters (e.g. api key), defaults to {}

  • Returns: Updated LLM Model
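
For instance, updating the underlying model and provider parameters (all values are placeholders):

import dbnl

updated = dbnl.update_llm_model(
    llm_model_id="llm_model_xyz",   # placeholder id
    model="gpt-4o",                 # swap the underlying model
    params={"api_key": "sk-..."},   # provider parameters (placeholder)
)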
