
STD: Notification Indicators (NOTIF)

Version: v1.0.0 | Status: Draft | SRS Source: docusaurus/docs/srs/notif.md | Domain: NOTIF


Overview

This document specifies tests for the Notification Indicators domain, which covers runfile status notification display, interactive filtering, and comment counting configuration.

Domain Characteristics:

  • Primary function: UI display and interaction
  • Secondary function: Backend aggregation (counts, filtering)
  • Configuration dependency: Comment counting setting

Test Method Rationale: Per Test Plan §3.3, UI Features domains use TM-UI as primary method with TM-MAN fallback. However, NOTIF has significant backend logic (count aggregation, filtering), so TM-API is used for count verification while TM-UI handles interaction flows.
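The backend aggregation that TM-API verifies reduces to a per-category counting function. A minimal sketch, assuming illustrative field names and status values (not the real API schema):

```python
from collections import Counter

# Hypothetical well records; "status" values mirror the notification
# categories used throughout this STD.
WELLS = [
    {"id": "A1", "status": "unrecognised"},
    {"id": "A2", "status": "error"},
    {"id": "A3", "status": "exportable"},
    {"id": "A4", "status": "exportable"},
]

def notification_counts(wells):
    """Aggregate per-category counts as an API count endpoint would expose them."""
    return Counter(w["status"] for w in wells)

counts = notification_counts(WELLS)
assert counts["exportable"] == 2 and counts["error"] == 1
```

A TM-API test then asserts these aggregates directly against the backend response, leaving TM-UI to verify only that the same numbers render correctly.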

Test Case Convention: Steps describe logical actions, not UI mechanics. Use "Navigate to Run Files page" or "Select notification filter", not "Click the Runs link in the sidebar" or "Click button with id=filter-btn". This ensures test intent survives UI redesigns.
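In automation code, this convention typically takes the form of a page abstraction: test steps call logical actions, and only the abstraction changes when the UI is redesigned. A minimal sketch with a stubbed page object (`RunFilesPage` and its methods are hypothetical names, not the real test API):

```python
class RunFilesPage:
    """Stub page object; a real implementation would drive the browser."""

    def __init__(self, runfiles):
        self._runfiles = runfiles  # {runfile name: notification counts}

    def navigate(self):
        # Logical action: "Navigate to Run Files page".
        # Real code would open the Run Files URL here.
        return self

    def notification_counts(self, runfile):
        # Logical query: "Observe the Notifications column".
        return self._runfiles[runfile]

def test_notification_indicators_display():
    page = RunFilesPage({"RunA": {"errors": 2, "warnings": 1}}).navigate()
    counts = page.notification_counts("RunA")
    assert counts["errors"] == 2
```

The test body mentions no selectors or element IDs; those live inside the page object, which is the only layer touched by a UI redesign.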


Coverage Summary

| REQ ID | Title | ACs | Tests | AC Coverage | Method | Gaps |
|---|---|---|---|---|---|---|
| REQ-NOTIF-001 | Display Status Notifications | 5 | TC-NOTIF-001, TC-NOTIF-002 | 5/5 (100%) | TM-UI | None |
| REQ-NOTIF-002 | Filter by Notification Selection | 4 | TC-NOTIF-003, TC-NOTIF-004 | 4/4 (100%) | TM-UI | None |
| REQ-NOTIF-003 | Associate Control Error Separation | 4 | TC-NOTIF-005, TC-NOTIF-006, TC-NOTIF-007 | 4/4 (100%) | TM-UI | None |
| REQ-NOTIF-004 | Comment Counting Configuration | 6 | TC-NOTIF-008, TC-NOTIF-009, TC-NOTIF-010 | 6/6 (100%) | TM-HYB | None |
| REQ-NOTIF-005 | Mix-Specific Label Error Counts | 4 | TC-NOTIF-011 | 4/4 (100%) | TM-UI | None |

Totals: 5 REQs, 23 ACs, 11 Test Cases, 100% Coverage


Test Cases

TC-NOTIF-001: Notification indicators display in Runs Table

Verifies: REQ-NOTIF-001 (AC1, AC3)

Method: TM-UI

Priority: High

Preconditions:

  • User logged in with at least Junior User role
  • At least one runfile exists with wells in various status categories

Test Data:

  • Runfile with: 3 unrecognised wells, 2 error wells, 1 warning, 5 exportable, 2 commented

Steps:

  1. Navigate to Run Files page
  2. Observe the Notifications column for the test runfile

Expected Results:

  • AC1: Notification indicators visible in runfile list Notifications column
  • AC3: Each indicator displays count matching well status (3, 2, 1, 5, 2)

Automation Status: Automated

Jira: BT-2727


TC-NOTIF-002: Notification category identification via tooltip

Verifies: REQ-NOTIF-001 (AC2, AC4, AC5)

Method: TM-UI

Priority: Medium

Preconditions:

  • User on Run Files page or Run Page with notification indicators visible

Steps:

  1. Hover over each notification icon type
  2. Observe tooltip text

Expected Results:

  • AC4: Category identification provided for each notification icon
  • AC5: Supported categories include:
    • "Unrecognised wells"
    • "Error wells"
    • "Warnings"
    • "Exportable wells"
    • "Commented wells"
  • AC2: Notification counts displayed on individual runfile pages (navigate and verify)

Automation Status: Automated

Jira: BT-2727


TC-NOTIF-003: Selectable vs non-selectable notifications

Verifies: REQ-NOTIF-002 (AC1, AC2)

Method: TM-UI

Priority: High

Preconditions:

  • Run with mixed notification counts (some zero, some non-zero)

Test Data:

  • Run with: 5 unrecognised, 3 errors, 0 warnings, 2 exportable, 0 comments

Steps:

  1. Navigate to Run Page for test runfile
  2. Observe notification indicators
  3. Attempt to click each indicator

Expected Results:

  • AC1: Non-zero counts (unrecognised=5, errors=3, exportable=2) are selectable
  • AC2: Selectable notifications have visual indicator (e.g., underlined, pointer cursor)
  • Zero counts (warnings=0, comments=0) are NOT selectable

Automation Status: Automated

Jira: BT-2473


TC-NOTIF-004: Filter wells by notification selection

Verifies: REQ-NOTIF-002 (AC3, AC4)

Method: TM-UI

Priority: High

Preconditions:

  • Run with 96 total wells, 5 unrecognised wells

Steps:

  1. Navigate to Run Page
  2. Verify all 96 wells displayed initially
  3. Click unrecognised wells notification (count=5)
  4. Observe wells table

Expected Results:

  • AC3: Wells display filters to show only unrecognised wells
  • AC4: Exactly 5 wells displayed after filter applied

Automation Status: Automated

Jira: BT-2473
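The filtering behaviour exercised in TC-NOTIF-004 amounts to selecting the subset of wells in the clicked category. A sketch with assumed field names, mirroring the test data above (96 wells, 5 unrecognised):

```python
def filter_by_notification(wells, category):
    """Return only the wells belonging to the selected notification category."""
    return [w for w in wells if w["status"] == category]

# 96 wells, the first 5 of which are unrecognised.
wells = [{"id": i, "status": "unrecognised" if i < 5 else "ok"}
         for i in range(96)]

filtered = filter_by_notification(wells, "unrecognised")
assert len(filtered) == 5  # AC4: exactly 5 wells after the filter is applied
```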


TC-NOTIF-005: Associate Control Error in Runs Table

Verifies: REQ-NOTIF-003 (AC1)

Method: TM-UI

Priority: High

Preconditions:

  • Run with at least one well having Error Type: Associate Control Error

Test Data:

  • Run A with Well A1: Error Type = Associate Control Error

Steps:

  1. Navigate to Run Files page
  2. Locate Run A in the list
  3. Observe Notifications column

Expected Results:

  • AC1: Associate Control Error notification visible with count = 1

Automation Status: Automated

Jira: BT-3836


TC-NOTIF-006: Associate Control Error on Run Page

Verifies: REQ-NOTIF-003 (AC2, AC4)

Method: TM-UI

Priority: High

Preconditions:

  • Same as TC-NOTIF-005

Steps:

  1. Open Run A
  2. Observe notification indicators

Expected Results:

  • AC2: Associate Control Error notification shows count = 1
  • AC4: Notification visible to all user roles (test with Junior, Senior, Super Admin)

Automation Status: Automated

Jira: BT-3836


TC-NOTIF-007: Associate Control Error excluded from general Error count

Verifies: REQ-NOTIF-003 (AC3)

Method: TM-UI

Priority: Critical

Preconditions:

  • Run with:
    • Well A1: Error Type = Associate Control Error
    • Well A2: Error Type = Standard Error

Steps:

  1. View notifications for the run
  2. Observe both Associate Control Error and general Error counts

Expected Results:

  • AC3: Associate Control Error count = 1
  • AC3: General Error count = 1 (NOT 2)
  • Counts are independent and do not overlap

Automation Status: Automated

Jira: BT-3836
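The separation verified by TC-NOTIF-007 means the two counts partition the error wells: a well contributes to exactly one count. A sketch assuming hypothetical `error_type` values:

```python
def error_counts(wells):
    """Count Associate Control Errors separately from general errors (AC3).

    The two counts are disjoint: each error well contributes to exactly one.
    """
    assoc = sum(1 for w in wells if w["error_type"] == "associate_control")
    general = sum(1 for w in wells if w["error_type"] == "standard")
    return {"associate_control": assoc, "error": general}

wells = [
    {"id": "A1", "error_type": "associate_control"},
    {"id": "A2", "error_type": "standard"},
]
counts = error_counts(wells)
assert counts == {"associate_control": 1, "error": 1}  # 1 and 1, not 2
```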


TC-NOTIF-008: Comment count with system comments disabled

Verifies: REQ-NOTIF-004 (AC1, AC2, AC6)

Method: TM-HYB (API for config, UI for verification)

Priority: High

Preconditions:

  • Admin access to modify configuration

Test Data:

  • Run 1 with:
    • Well A1: user comment "sample comment"
    • Well A2: system comment "exported"

Steps:

  1. Set configuration "Count system generated comments" = false (API/Admin UI)
  2. Navigate to Run Files page
  3. Observe Run 1 comment notification count
  4. Open Run 1 and verify notification

Expected Results:

  • AC1: Configuration option controls system comment counting
  • AC2: With setting=false, only user comments counted
  • AC6: Comment count displays 1 (not 2) in both list and detail views

Automation Status: Automated (Browser)

Jira: BT-4328
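The configuration-dependent counting that TC-NOTIF-008 and TC-NOTIF-009 verify can be sketched as a single predicate over active comments (field names are illustrative assumptions, not the real data model):

```python
def comment_count(comments, count_system_comments):
    """Count active comments, optionally including system-generated ones."""
    return sum(
        1 for c in comments
        if not c["deleted"] and (count_system_comments or not c["system"])
    )

comments = [
    {"text": "sample comment", "system": False, "deleted": False},  # Well A1
    {"text": "exported", "system": True, "deleted": False},         # Well A2
]
assert comment_count(comments, count_system_comments=False) == 1  # TC-NOTIF-008
assert comment_count(comments, count_system_comments=True) == 2   # TC-NOTIF-009
```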


TC-NOTIF-009: Comment count with system comments enabled

Verifies: REQ-NOTIF-004 (AC3, AC6)

Method: TM-HYB

Priority: High

Preconditions:

  • Same test data as TC-NOTIF-008

Steps:

  1. Set configuration "Count system generated comments" = true
  2. Navigate to Run Files page
  3. Observe Run 1 comment notification count
  4. Open Run 1 and verify notification

Expected Results:

  • AC3: With setting=true, all active comments counted (including system)
  • AC6: Comment count displays 2 in both list and detail views

Automation Status: Automated (Browser)

Jira: BT-4328


TC-NOTIF-010: Deleted and edited comments handling

Verifies: REQ-NOTIF-004 (AC4, AC5)

Method: TM-MAN (deviation from planned TM-HYB; see Deviation note below)

Priority: Medium

Preconditions:

  • Run with Well A1 (no comments initially)

Steps:

  1. Add comment "test comment" on A1
  2. Verify notification count = 1
  3. Edit comment to "test comment edited"
  4. Verify notification count = 1 (not 2)
  5. Delete the comment
  6. Verify notification count = 0 (audit trail record persists but is not counted)

Expected Results:

  • AC4: Deleted comments not counted (but audit trail may persist)
  • AC5: Edited comments count as one (not multiple versions)

Automation Status: Manual (complex state transitions)

Jira: BT-4328

Deviation: TM-MAN used instead of TM-HYB due to complex state verification across edit/delete cycles. Remediation: Consider parameterized API test for comment state machine.
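The state transitions that make TC-NOTIF-010 hard to automate can be modelled compactly. A minimal sketch, assuming (per AC4/AC5) that an edit replaces content in place and a delete removes the comment from the count while its audit record persists; the class and method names are hypothetical:

```python
class CommentStore:
    """Minimal model of the comment lifecycle under test."""

    def __init__(self):
        self._comments = {}  # id -> current text
        self._audit = []     # append-only audit trail

    def add(self, cid, text):
        self._comments[cid] = text
        self._audit.append(("add", cid))

    def edit(self, cid, text):
        self._comments[cid] = text            # still a single comment (AC5)
        self._audit.append(("edit", cid))

    def delete(self, cid):
        del self._comments[cid]               # no longer counted (AC4)...
        self._audit.append(("delete", cid))   # ...but the audit record persists

    def count(self):
        return len(self._comments)

store = CommentStore()
store.add("c1", "test comment")
assert store.count() == 1
store.edit("c1", "test comment edited")
assert store.count() == 1        # edit does not create a second comment
store.delete("c1")
assert store.count() == 0        # deleted comment not counted
assert len(store._audit) == 3    # audit trail records every transition
```

A parameterized API test over this state machine, as suggested in the remediation, would enumerate add/edit/delete sequences and assert the count after each step.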


TC-NOTIF-011: Mix-level label error counts

Verifies: REQ-NOTIF-005 (AC1, AC2, AC3, AC4)

Method: TM-UI

Priority: Medium

Preconditions:

  • Run A with wells across multiple mixes

Test Data:

  • Well A1: Mix A, Error Type = Label Error
  • Well A2: Mix A, Error Type = Standard Error
  • Well A3: Mix B, Error Type = Label Error
  • Well A4: Mix B, Error Type = Label Error
  • Well A5: Mix B, Error Type = Standard Error

Steps:

  1. Navigate to Run Files page
  2. Expand Run A row to show mix details
  3. Observe per-mix notification counts

Expected Results:

  • AC1: Expanded view shows per-mix notification information
  • AC2: Mix A shows Label Errors: 1
  • AC2: Mix B shows Label Errors: 2
  • AC3: Label Error counts include only Label Error type
  • AC4: Standard Errors (A2, A5) NOT included in Label Error counts

Automation Status: Automated (Browser)

Jira: BT-4119
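The per-mix counting rule of TC-NOTIF-011 is a grouped count restricted to Label Errors. A sketch using the test data above (field names are assumptions):

```python
from collections import defaultdict

def label_errors_per_mix(wells):
    """Count only Label Error wells, grouped by mix (AC3/AC4)."""
    counts = defaultdict(int)
    for w in wells:
        if w["error_type"] == "label":
            counts[w["mix"]] += 1
    return dict(counts)

wells = [
    {"id": "A1", "mix": "A", "error_type": "label"},
    {"id": "A2", "mix": "A", "error_type": "standard"},
    {"id": "A3", "mix": "B", "error_type": "label"},
    {"id": "A4", "mix": "B", "error_type": "label"},
    {"id": "A5", "mix": "B", "error_type": "standard"},
]
# Standard Errors (A2, A5) do not contribute to Label Error counts.
assert label_errors_per_mix(wells) == {"A": 1, "B": 2}
```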


Gap Analysis

No gaps identified. All 23 acceptance criteria have test coverage.

Coverage by AC Type

| AC Category | Count | Covered | Notes |
|---|---|---|---|
| Display/Rendering | 8 | 8 | Verified via TM-UI |
| Interaction/Filtering | 6 | 6 | Verified via TM-UI |
| Configuration | 3 | 3 | Verified via TM-HYB |
| Counting Logic | 6 | 6 | Verified via TM-UI with specific test data |

Traceability to Existing Tests

| Test Case | Jira Test | Automation |
|---|---|---|
| TC-NOTIF-001, TC-NOTIF-002 | BT-2727 | Selenium |
| TC-NOTIF-003, TC-NOTIF-004 | BT-2473 | Selenium |
| TC-NOTIF-005, TC-NOTIF-006, TC-NOTIF-007 | BT-3836 | Selenium |
| TC-NOTIF-008, TC-NOTIF-009, TC-NOTIF-010 | BT-4328 | Behat/Manual |
| TC-NOTIF-011 | BT-4119 | Selenium |

Notes

  • NOTIF is a UI-heavy domain, but it includes backend count-aggregation logic
  • Existing Gherkin tests in the SRS cover the same scenarios; this STD formalizes the test method and coverage mapping
  • TC-NOTIF-010 is the only manual test due to complex state transitions