
Notifications


Overview

Notifications provide a way for users to be automatically notified about critical test events (e.g., failures or completions) via third-party tools like PagerDuty.

With Notifications you can:

  • Add critical test failure alerting to your organization’s on-call

  • Create custom notifications for specific feature tests

  • Stay informed when a new test session has started

Quick Links

  • Setting up a Notification Channel in your Namespace

  • Supported Third-Party Notification Channels

  • Setting up a Notification in your Project

  • Notification Criteria

Setting up a Notification Channel in your Namespace

A Notification Channel describes who will be notified and how.

Before setting up a Notification in your project, you must have a Notification Channel set up in your Namespace. Notification Channels on a Namespace can be used by Notifications in all Projects belonging to the Namespace.

When adding your Notification Channel, you will be able to select which third-party integration you'd like to be notified through.

  1. In your desired Namespace, choose Notification Channels in the menu sidebar.

    Note: you must be a Namespace admin in order to do this.

  2. Click the New Notification Channel button to navigate to the creation form.

  3. Complete the appropriate fields.

    Optional: If you’d like to test that your Notification Channel is set up correctly, click the Test button. If it is correctly set up, you should receive a notification through the integration you’ve selected.

  4. Click the Create Notification Channel button. Your channel will now be available when setting up your Notification.

Supported Third-Party Notification Channels

  • PagerDuty

  • Slack

Note: More channels are planned on the product roadmap.
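For a sense of what a channel delivery looks like, the sketch below posts a test-failure message to a Slack incoming webhook. This is purely illustrative: once a Notification Channel is configured, Distributional handles delivery for you, and the webhook URL and message text here are hypothetical.

```python
# Illustrative only: Distributional delivers channel messages once a
# Notification Channel is configured. This shows the general shape of a
# Slack incoming-webhook delivery; the URL and message are hypothetical.
import json
import urllib.request

payload = {"text": "Test Session failed in project my-project: 3 tests failed."}
req = urllib.request.Request(
    "https://hooks.slack.com/services/T000/B000/XXXX",  # hypothetical webhook URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)
```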

Setting up a Notification in your Project

  1. Navigate to the Project page of your desired project.

  2. Under Test Sessions, click the Configure Notifications button to navigate to the Project’s Notifications page.

  3. Click the New Notification button to navigate to the creation form. Set your Notification’s name, criteria, and Notification Channels (see Notification Criteria below).

  4. Click the Create Notification button. Your Notification will now notify you when your specified criteria are met.
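Notifications are evaluated against Test Session outcomes regardless of how the session was started, so a Test Session kicked off through the Python SDK will also exercise your Notification. The sketch below uses function names from this documentation's Python SDK reference, but the import name and parameters shown are assumptions; consult the SDK reference for exact signatures.

```python
# A minimal sketch, assuming the SDK functions listed in the Python SDK
# reference; the import name and parameters shown here are assumptions.
import dbnl

dbnl.login(api_token="<YOUR_API_TOKEN>")

# When this Test Session completes, the Project's Notifications are
# evaluated, and any whose criteria are met notify their channels.
session = dbnl.create_test_session(run_id="<RUN_ID>")
```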

Notification Criteria

Trigger Event

The trigger event describes when your Notification is initiated. Trigger events are based on Test Session outcomes (for example, a Test Session starting or failing).

Tags Filtering

Filtering by Tags lets you define which tests in the Test Session you want to be notified about.

There are three types of Tags filters you can provide:

  • Include: the test must have ANY of the selected Tags

  • Exclude: the test must not have ANY of the selected Tags

  • Require: the test must have ALL of the selected Tags

When multiple types are provided, all filters are combined using ‘AND’ logic, meaning all conditions must be met simultaneously.

Note: This field only pertains to the ‘Test Session Failed’ trigger event.
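To make the combination rules concrete, here is a minimal sketch of the filter semantics in Python. The function and argument names are hypothetical, not part of the product's API; they only illustrate how the three filter types combine with AND logic.

```python
# Hypothetical illustration of the Tags filter semantics; Distributional
# evaluates these server-side.

def matches_filters(test_tags, include=None, exclude=None, require=None):
    """Return True if a test's Tags satisfy all configured filters (AND logic)."""
    tags = set(test_tags)
    if include and not tags & set(include):   # Include: must have ANY of the selected
        return False
    if exclude and tags & set(exclude):       # Exclude: must not have ANY of the selected
        return False
    if require and not set(require) <= tags:  # Require: must have ALL of the selected
        return False
    return True

# Example: this test passes the filters, since it is a nightly regression
# test and is not tagged experimental.
matches_filters(
    ["regression", "nightly"],
    include=["regression", "smoke"],
    exclude=["experimental"],
    require=["nightly"],
)  # -> True
```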

Condition

The condition describes the threshold at which you want to be notified. If the condition is met, your Notification will be sent.

Note: This field only pertains to the ‘Test Session Failed’ trigger event.
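Putting the pieces together, a hypothetical end-to-end evaluation of a Notification's criteria might look like the following. The names and the numeric threshold are illustrative assumptions, not the product's internal logic.

```python
# Hypothetical sketch of how trigger event, Tags filtering, and condition
# combine; 'failed_matching' is the count of failed tests that pass the
# Tags filters above.

def should_notify(trigger, session_outcome, failed_matching, threshold=1):
    if session_outcome != trigger:
        return False  # Trigger event: the session outcome must match
    if trigger == "Test Session Failed":
        return failed_matching >= threshold  # Condition: threshold must be met
    return True

should_notify(
    trigger="Test Session Failed",
    session_outcome="Test Session Failed",
    failed_matching=3,
    threshold=2,
)  # -> True: the Notification is sent
```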

