# Version Update Workflow
Master guide for updating all documentation when transitioning between product versions.
## Overview
This workflow ensures all documentation stays synchronized when creating a new version (e.g., v3.0 → v3.1). It covers SRS (requirements), SDS (design), STD (tests), and code-to-requirement mappings.
## Phase 0: Confluence Ingestion (Conditional)
Goal: Convert raw Confluence exports into structured input for the update pipeline.
When to run: Only if the new version includes requirements documented in Confluence that haven't been converted to SRS format yet. Skip this phase if all changes come from code diffs and existing structured docs.
Reference: See SRS Confluence Conversion Guide for the full 6-stage pipeline.
Pipeline:
Confluence Export (.docx) → Pandoc+Lua (.adoc) → Source Analysis → Classification → Transformation → Validation
Key Steps:
- Export page tree from Confluence as Word (.docx) — not PDF, not HTML
- Run `docusaurus/scripts/docx2adoc.sh input.docx output.adoc` to convert to AsciiDoc (preserves tables that Markdown breaks)
- Inventory and classify content using SRS Authoring Guide taxonomy
- Transform into canonical SRS format
- Validate with Confluence Validation Spec (44 automated checks)
- Integrate into `docusaurus/docs/srs/`
Why AsciiDoc? Confluence exports contain complex tables (merged cells, nested content). Pandoc's AsciiDoc writer preserves table structure faithfully. Pandoc's Markdown writer collapses it.
Output: Structured SRS files ready for the Impact Analysis phase.
## Phase 0.5: Code Upgrade
Goal: Get the actual new-version code running locally so all subsequent phases work against real behavior.
When to run: Always. You need v{NEW} code to analyze changes, run tests, and verify documentation accuracy.
### Pre-Flight Checklist
Before touching any code, complete these steps in order:
- **Commit all local changes** — Run `git status` and commit everything. During the v3.0.1 upgrade, uncommitted local changes made it difficult to distinguish agent contamination from intentional modifications.
- **Back up `.env`** — Copy `code/.env` to `code/.env.test-baseline` before any changes. This preserves Cognito vars, DB config, Pusher keys, and other environment-specific settings that are easy to lose during a merge.
- **Inventory local modifications** — List all files with local changes beyond upstream (see Known Local Changes below). These files require manual merges rather than simple overwrites.
- **Verify `gh` CLI access** — Confirm you can fetch upstream files:

  ```bash
  gh api 'repos/ORG/REPO/contents/README.md?ref=v{NEW}' -H 'Accept: application/vnd.github.raw' > /dev/null
  ```
### Known Local Changes to Merge
These files have local modifications that must be preserved during upgrade. Do NOT overwrite them blindly from upstream:
| File | Local Change | Merge Strategy |
|---|---|---|
| `code/features/bootstrap/FeatureContext.php` | Browser test step definitions (ours) + upstream new steps | Manual merge — keep our steps, add theirs |
| `code/features/bootstrap/BrowserContext.php` | Entirely ours — browser test infrastructure | Do not overwrite (not in upstream) |
| `code/composer.json` / `code/composer.lock` | Browser testing packages (dmore/chrome-mink-driver, etc.) | Merge dependency changes carefully |
| `code/database/seeders/UsersTableSeeder.php` | `logged_in_site_id` fix for strict `:string` return | Check if upstream fixed this; if not, preserve our fix |
| `code/.env` | Cognito env vars, Pusher keys, DB config for browser tests | Never overwrite — merge new vars into existing |
| `code/behat.yml` | Browser suite configuration | Merge if upstream modifies |
### Process

1. **Fetch files from upstream tag**

   ```bash
   # Fetch individual files (preserves directory structure)
   gh api 'repos/ORG/REPO/contents/PATH?ref=v{NEW}' \
     -H 'Accept: application/vnd.github.raw' > local/PATH

   # Or fetch commit list to understand what changed
   git log v{OLD}..v{NEW} --oneline
   ```
2. **Merge local changes** — If you have local modifications (e.g., `FeatureContext.php` with custom browser steps), merge carefully:
   - Identify files modified locally vs. upstream
   - For shared files (e.g., `FeatureContext.php`), merge both sets of changes
   - For upstream-only files, overwrite local copies
3. **Frontend rebuild** (if frontend files changed)

   ```bash
   cd code/resources/frontend && npm install && npm run build
   ```
4. **Run migrations across all pool DBs**

   ```bash
   for i in $(seq -w 1 10); do
     DB_HOST=127.0.0.1 DB_AUDIT_HOST=127.0.0.1 DB_DATABASE=pcrai_test_$i \
     DB_USERNAME=sail DB_PASSWORD=password \
     php artisan migrate:fresh \
       --path=database/migrations \
       --path=database/migrations/app \
       --path=database/migrations/audit --seed
   done
   ```
5. **Code contamination audit** — Verify all fetched files match upstream exactly. Watch for agent-added debug code, extra docblocks, or stale local modifications that weren't cleaned up.

6. **Create `.env.test-baseline`** — Back up the working test environment config before any further changes:

   ```bash
   cp code/.env code/.env.test-baseline
   ```
7. **Update test infrastructure** — After code is upgraded:
   - Update `KNOWN_VERSION_TAGS` and `LATEST_VERSION_TAG` in `tests/scripts/behat-optimizer.py` to include the new version tag (e.g., `V3_0_1`)
   - Run `php artisan migrate:fresh --path=database/migrations --path=database/migrations/app --path=database/migrations/audit --seed` on ALL pool DBs (01-10), not just one
   - Rebuild frontend if any JS/Vue files changed: `cd code/resources/frontend && npm install && npm run build`
   - Verify no code contamination: compare all fetched files against upstream tag content using `gh api` to spot agent-added debug code, extra docblocks, or stale local edits
Lesson learned (Session 35): The code upgrade is a prerequisite for all subsequent phases. Without v{NEW} code, change detection operates on stale diffs, test runs use old behavior, and documentation risks describing the wrong version.
Lesson learned (Session 36): Code contamination is a real risk. During the v3.0.1 upgrade, 3 PHP files contained agent-added debug `file_put_contents()` calls and 3 debug log files (22 MB) were created. Always run a contamination audit after agent-driven code changes.
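A contamination audit can start with a mechanical scan for the debug calls mentioned above. A minimal sketch in Python — the scan path and the single pattern are illustrative; a real audit should also diff every fetched file against the upstream tag:

```python
import re
from pathlib import Path

# Pattern for the agent-added debug calls seen in Session 36.
DEBUG_CALL = re.compile(r"file_put_contents\s*\(")

def scan_php(root: str) -> list[tuple[str, int]]:
    """Return (file, line_number) pairs for suspicious debug calls."""
    hits = []
    for path in Path(root).rglob("*.php"):
        for n, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if DEBUG_CALL.search(line):
                hits.append((str(path), n))
    return hits

# Inline demonstration on a synthetic snippet:
sample = 'file_put_contents("/tmp/debug.log", $payload);\n$x = 1;'
print([n for n, ln in enumerate(sample.splitlines(), 1) if DEBUG_CALL.search(ln)])  # [1]
```

A single regex will not catch every form of contamination (extra docblocks, stale edits), so treat this as a first pass before the upstream diff.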
## Phase 1: Change Detection
Goal: Identify all changes between versions and produce a Change Manifest.
Inputs:
- PRD or product specification updates
- Ideas/instructions documents
- Code diffs between version tags
- Test diffs (when STD is established)
Process:
1. **Review PRD/Ideas**
   - Check `input/` for new specification documents
   - Review any stakeholder change requests
   - Document new features, modified behaviors, deprecations
2. **Analyze Code Changes**

   ```bash
   # Get high-level diff stats
   git diff v{OLD}..v{NEW} --stat

   # Backend changes
   git diff v{OLD}..v{NEW} -- app/

   # Frontend changes
   git diff v{OLD}..v{NEW} -- resources/js/

   # Commit history
   git log v{OLD}..v{NEW} --oneline
   ```
3. **Review Test Changes**
   - Identify new, modified, or removed Behat feature files
   - Check for changed test fixtures (`tests/support_files/`)
   - Note any new `@TV-` or `@TC-` tags in feature files

   ```bash
   git diff v{OLD}..v{NEW} -- tests/
   git diff v{OLD}..v{NEW} -- code/features/
   ```
Output: Change Manifest (see Change Detection for template)
## Phase 2: Impact Analysis
Goal: Map each change to affected documentation.
Process:
1. For each item in the Change Manifest, determine:
   - Which SRS domain(s) are affected
   - Which SDS section(s) are affected
   - Which STD test plans are affected
   - Whether the change is cross-cutting (security, audit, config)
2. Create an Impact Matrix:
| Change ID | Type | Description | SRS Impact | SDS Impact | STD Impact |
|---|---|---|---|---|---|
| CHG-001 | NEW | New export format | fileimport.md | sds-domain-fileimport.md | std-fileimport.md |
| CHG-002 | MOD | Auth flow change | user-management.md | sds-05-security-architecture.md | std-user-management.md |
| CHG-003 | DEP | Legacy API removed | client-config.md | sds-ref-api.md | std-client-config.md |
3. Identify cross-cutting concerns:
   - Security changes → `sds-05-security-architecture.md` + affected SRS domains
   - Config changes → `sds-ref-config.md` + `kitcfg.md`/`client-config.md`
   - Audit changes → `sds-domain-audit.md` + `audit-log.md`
   - API changes → `sds-ref-api.md` + affected SRS domains
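The cross-cutting fan-out above can be expressed as a small lookup, so one change record expands to every affected SDS file. A sketch — the mapping mirrors the bullets above, while the function shape and example change record are assumptions for illustration:

```python
# Cross-cutting concern -> extra SDS files, mirroring the list above.
CROSS_CUTTING = {
    "security": ["sds-05-security-architecture.md"],
    "config": ["sds-ref-config.md"],
    "audit": ["sds-domain-audit.md"],
    "api": ["sds-ref-api.md"],
}

def sds_impact(primary_sds: str, concerns: list[str]) -> list[str]:
    """Primary SDS file plus any files pulled in by cross-cutting concerns."""
    extra = [f for c in concerns for f in CROSS_CUTTING.get(c, [])]
    return [primary_sds] + extra

# e.g. a MOD change touching auth also pulls in the security architecture doc:
print(sds_impact("sds-domain-fileimport.md", ["security"]))
# ['sds-domain-fileimport.md', 'sds-05-security-architecture.md']
```

Encoding the lookup once keeps the Impact Matrix rows consistent: nobody has to remember which reference docs a security or config change drags in.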
## Phase 3: SRS Updates
Goal: Update requirements documentation.
Reference: See SRS Update Guide for detailed instructions.
Summary of Actions:
| Change Type | Action |
|---|---|
| NEW feature | Add new REQ-DOMAIN-NNN using SRS authoring guide |
| MODIFIED behavior | Update existing requirement (preserve ID) |
| DEPRECATED feature | Move to archive section with deprecation note |
Key Rules:
- REQ IDs are immutable - never change an existing ID
- Document all changes in Reviewer Notes section
- Update Traceability Matrix for each change
- Add Implementation section if code exists
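Since REQ IDs are immutable, it is worth validating new IDs against the naming convention before they land. A minimal check, assuming the `REQ-DOMAIN-NNN` shape shown above (the grep in Phase 7 uses the same pattern):

```python
import re

# REQ-DOMAIN-NNN: uppercase domain, numeric suffix (pattern assumed from the
# examples in this guide and the Phase 7 validation grep).
REQ_ID = re.compile(r"^REQ-[A-Z]+-\d+$")

for candidate in ["REQ-FILEIMPORT-001", "req-fileimport-1", "REQ-CLIENTCONFIG-012"]:
    print(candidate, bool(REQ_ID.match(candidate)))
# REQ-FILEIMPORT-001 True
# req-fileimport-1 False
# REQ-CLIENTCONFIG-012 True
```

Running such a check in CI catches malformed IDs early, before they propagate into the traceability matrix.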
## Phase 4: SDS Updates
Goal: Update design documentation.
Reference: See SDS Update Guide for detailed instructions.
Mapping:
| SDS File | Update For |
|---|---|
| sds-03-architecture-overview.md | Component changes, system boundaries |
| sds-04-data-architecture.md | ERD, schema changes |
| sds-05-security-architecture.md | Auth, permissions, token handling |
| sds-06-cross-cutting.md | Logging, errors, infrastructure |
| sds-domain-*.md | Domain-specific behavior changes |
| sds-rules-*.md | Rule logic, precedence changes |
| sds-ref-api.md | API endpoint changes |
| sds-ref-database.md | Schema documentation |
| sds-ref-config.md | Configuration options |
| sds-ref-glossary.md | New terms |
Key Rules:
- Add `{#anchor-name}` for new sections
- Update cross-references between SDS files
- Maintain Mermaid diagrams when architecture changes
- Update Implementation Mapping in domain docs
## Phase 5: STD Updates
Goal: Update test documentation.
Reference: See STD Update Guide for detailed instructions.
Actions:
- Add test cases for new requirements
- Update test cases for modified requirements
- Archive tests for deprecated requirements
- Update traceability matrix
- Verify coverage targets (100% REQ, 80%+ AC)
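The coverage targets can be checked mechanically once requirements are mapped to test vectors. A sketch under an assumed schema (REQ ID to the list of TV IDs covering it; the real mapping lives in the unified traceability JSON regenerated in 5b):

```python
# Assumed shape: requirement ID -> test vectors covering it.
req_to_tvs = {
    "REQ-FILEIMPORT-001": ["TV-FI-001", "TV-FI-002"],
    "REQ-FILEIMPORT-002": ["TV-FI-003"],
    "REQ-FILEIMPORT-003": [],  # uncovered -> fails the 100% REQ target
}

covered = [r for r, tvs in req_to_tvs.items() if tvs]
pct = 100 * len(covered) / len(req_to_tvs)
print(f"REQ coverage: {len(covered)}/{len(req_to_tvs)} ({pct:.0f}%)")
```

The same pattern applies to the 80%+ acceptance-criteria target, with AC IDs in place of REQ IDs.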
### 5b. Regenerate Unified Traceability

After completing STD updates, regenerate the unified traceability artifacts:

```bash
python3 docusaurus/scripts/generate-unified-traceability.py --render-md
```
This refreshes the unified traceability JSON, coverage report, traceability matrix, SDS traceability view, and release checklist.
### 5c. Implement Behat Tests
After STD specifications are updated, implement the test vectors as executable Behat scenarios.
Reference: See Behat Test Creation Guide for the full creation workflow, gotchas, and subagent patterns.
Process:
- Reconcile coverage — Map new/changed TVs to existing Behat tests using STD Reconciliation Guide
- Create API tests — For TM-API test vectors, create Behat feature files with fixtures (see Behat Creation Guide)
- Create browser tests — For TM-UI test vectors, create Behat/Mink scenarios (see Browser Test Guide)
- QR pass — Run quality review on all new scenarios (see Agent Orchestration Guide)
- Update dashboard — Update V3 Testing Dashboard with new test counts
Orchestration: For large batches (>10 new test vectors), use wave-based parallel execution. See Agent Orchestration Guide for wave planning, DB pool management, and QR protocol.
### 5d. Version Tagging for Tests
When behavior changes between versions, tests must be tagged so the correct set runs per deployment.
Process:
1. **Identify behavior changes** — For each changed requirement, determine whether the old and new behaviors are both valid (different clients may run different versions)
2. **Create version-split scenarios:**
   - Copy the original scenario with a `@V{OLD}` tag (e.g., `@V3_0_0`) and old assertions
   - Update the original with a `@V{NEW}` tag (e.g., `@V3_0_1`) and new assertions
   - Both versions share the same BT key, TV tags, and fixture files — only assertions differ
3. **Universal tests** — Tests that pass on ALL versions get NO version tag
4. **Tag format:** `@V3_0_0`, `@V3_0_1` (underscores, not dots)
5. **Mirror changes** to the `new_tests/` directory (git subtree tracking)
Running version-specific tests:

```bash
# Run only tests valid for a specific version
python3 behat-optimizer.py run --target-version 3.0.1 --suite all
```
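The dotted-to-underscored convention is easy to get wrong by hand, so it helps to make the mapping explicit. A hypothetical helper (not part of behat-optimizer.py) that captures the rule:

```python
def version_to_tag(version: str) -> str:
    """Map a dotted release version to its Behat tag: dots become underscores."""
    return "@V" + version.replace(".", "_")

print(version_to_tag("3.0.1"))  # @V3_0_1
print(version_to_tag("3.0.0"))  # @V3_0_0
```

Anything that generates or greps for version tags should go through one such function rather than ad-hoc string edits.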
### 5e. Regression Testing
After implementing new/changed Behat tests, run a full regression to catch breakage.
Process:
1. **Full suite with version filter:**

   ```bash
   python3 behat-optimizer.py run --target-version X.Y.Z --suite all --rerun-failures
   ```

2. **Classify failures:**
   - Pre-existing: Already tagged `@KNOWN_CODE_ISSUE` or `@KNOWN_LIMITATION` — verify still valid
   - New regressions: Caused by v{NEW} code changes — investigate and fix
   - Flaky: Pass on rerun — typically Chrome/CDP transient issues
   - Environment: DB pool, artisan serve, missing migrations — fix and rerun
3. **KCI/KL audit:** Check whether existing `@KNOWN_CODE_ISSUE` or `@KNOWN_LIMITATION` tags are now stale (bugs fixed in v{NEW})
4. **Browser test regression:**
   - Ensure `.env` has correct Cognito vars, Pusher keys, and `PHP_CLI_SERVER_WORKERS=8`
   - The optimizer's `rerun_browser_failures` handles transient Chrome/CDP flakes
5. **Exit criteria:** All new failures are either fixed, tagged KCI/KL with justification, or classified as pre-existing
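The classification above reduces to a simple decision order: known tags first, then environment problems, then rerun behavior. A sketch — the tag names come from this guide, while the failure-record fields are illustrative assumptions:

```python
KNOWN_TAGS = {"@KNOWN_CODE_ISSUE", "@KNOWN_LIMITATION"}

def classify(tags: set[str], passed_on_rerun: bool, env_issue: bool = False) -> str:
    """Triage one failed scenario into the categories listed above."""
    if tags & KNOWN_TAGS:
        return "pre-existing"
    if env_issue:
        return "environment"
    if passed_on_rerun:
        return "flaky"
    return "new-regression"

print(classify({"@KNOWN_CODE_ISSUE"}, passed_on_rerun=False))  # pre-existing
print(classify(set(), passed_on_rerun=True))                   # flaky
print(classify(set(), passed_on_rerun=False))                  # new-regression
```

Applying the order consistently keeps triage counts comparable between sessions (as in the Session 35 breakdown below).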
Lesson learned (Session 35): The v3.0.0→v3.0.1 upgrade revealed 30 new failures across 2,496 scenarios. Classifying them into categories (19 WREP edited-wells, 4 flaky INHN/INHP, 5 pre-existing resolution guard, etc.) was essential for triage.
## Phase 6: Code Mapping Sync
Goal: Maintain bidirectional code-to-docs traceability.
Reference: See Code Mapping Update for detailed instructions.
Registry: `tests/catalogue/code_tags.json` — the single source of truth for code-to-requirement mappings.
Actions:
1. **For new code implementing new requirements:**
   - Add an entry in `code_tags.json` (all three views: `by_file`, `by_srs`, `by_sdd`)
   - Add an Implementation section to the SRS file
2. **For modified code:**
   - Update existing `code_tags.json` entries if REQ scope changed
   - Update Implementation sections in SRS
3. **For removed code:**
   - Remove orphaned entries from `code_tags.json` (all three views)
   - Update Implementation sections (mark as deprecated)
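Because every mapping must appear in all three views, drift between views is the main failure mode. A sketch of the consistency rule — the JSON shape here is inferred from the Phase 7 jq checks and may differ in detail from the real registry:

```python
# Assumed shape of code_tags.json, inferred from the Phase 7 validation queries.
code_tags = {
    "by_file": {"app/Services/Export.php": {"srs": ["REQ-FILEIMPORT-001"], "sdd": []}},
    "by_srs": {"REQ-FILEIMPORT-001": ["app/Services/Export.php"]},
    "by_sdd": {},
}

def files_missing_from_by_file(tags: dict) -> set[str]:
    """Files listed under by_srs but absent from by_file (should be empty)."""
    referenced = {f for files in tags["by_srs"].values() for f in files}
    return referenced - set(tags["by_file"])

print(files_missing_from_by_file(code_tags))  # set()
```

The Phase 7 jq one-liners perform the same check from the shell; doing it in one place after every edit avoids the orphan cleanup at release time.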
## Phase 6b: Regenerate Unified Traceability

After all doc updates are complete, regenerate the unified traceability JSON and views:

```bash
python3 docusaurus/scripts/generate-unified-traceability.py --render-md
```

This updates:
- `tests/catalogue/unified-traceability.json` — source of truth
- 4 generated MD views in `docs/std/` and `docs/sds/traceability/`
## Phase 7: Validation & Finalization
Goal: Verify completeness and consistency.
Reference: See Validation Checklist for full checklist.
Validation Scripts:

```bash
# 1. Check for orphan SRS refs (mapped in code_tags.json but not defined in SRS)
jq -r '.by_srs | keys[]' tests/catalogue/code_tags.json | sort -u > /tmp/code-refs.txt
grep -roh "REQ-[A-Z]*-[0-9]*" docusaurus/docs/srs/*.md docusaurus/docs/srs/rules/ | sort -u > /tmp/srs-ids.txt
comm -23 /tmp/code-refs.txt /tmp/srs-ids.txt

# 2. Check for consistency between code_tags.json views
jq -r '.by_srs | to_entries[] | .value[]' tests/catalogue/code_tags.json | sort -u > /tmp/srs-files.txt
jq -r '.by_file | keys[]' tests/catalogue/code_tags.json | sort -u > /tmp/byfile-files.txt
comm -23 /tmp/srs-files.txt /tmp/byfile-files.txt  # Should be empty

# 3. Check for empty entries in code_tags.json
jq '.by_file | to_entries[] | select(.value.srs == [] and .value.sdd == []) | .key' tests/catalogue/code_tags.json
```
Finalization Steps:
- Update version headers in all modified docs
- Update `CLAUDE.md` with new version status
- Create transition notes documenting any incomplete work
- Consider creating a version tag: `git tag v{NEW}-docs-complete`
### 7b. Release Artifacts
After validation passes, produce release-specific artifacts:
- Update the VERSION constant in `docusaurus/scripts/generate-unified-traceability.py` (line 64) to match v{NEW}
- Create release notes: `docusaurus/docs/release-notes-v{NEW}.md` — a user-facing changelog summarizing new features, behavior changes, and known issues
- Generate the STR (Software Test Report):

  ```bash
  python3 behat-optimizer.py report \
    --traceability /path/to/unified-traceability.json \
    --render-md /path/to/str-release-test-report.md
  ```

- Freeze docs as a versioned snapshot:

  ```bash
  cd docusaurus && npm run docusaurus docs:version {NEW}
  ```

  This creates `versioned_docs/version-{NEW}/` and updates `versions.json`.
- Verify `versions.json` lists the new version correctly
## Sprint Structure (Recommended)
| Sprint | Focus | Parallelizable | Agent Pattern |
|---|---|---|---|
| 0 | Phase 0: Confluence Ingestion (if needed) | Partially — by domain section | See orchestration guide |
| 0.5 | Phase 0.5: Code Upgrade | No | Single orchestrator (merge + migrate) |
| 1 | Phase 1-2: Change Detection + Impact Analysis | No | Single orchestrator |
| 2 | Phase 3: SRS Updates (domains 1-10) | Yes — by domain | One agent per domain |
| 3 | Phase 3: SRS Updates (domains 11-20+) | Yes — by domain | One agent per domain |
| 4 | Phase 4: SDS Updates | Partially — independent sections | One agent per section |
| 5 | Phase 5: STD Updates | Yes — by test plan | One agent per STD file |
| 5c | Phase 5c: Behat Test Implementation | Yes — wave-based | 8-10 per wave + QR pass |
| 5d | Phase 5d: Version Tagging | Yes — by feature file | One agent per split |
| 5e | Phase 5e: Regression Testing | Yes — by suite | 8-10 parallel (DB pool) |
| 6 | Phase 6: Code Mapping Sync | Yes — by code module | One agent per module |
| 7 | Phase 7: Validation + Finalization + Release | No | Sequential validation |
## Exit Criteria
Version update is complete when:
- Code upgraded to v{NEW} and migrations applied (Phase 0.5)
- `.env.test-baseline` created
- All SRS files updated for changes
- All SDS files updated for changes
- All STD files updated for changes
- Behat tests created/updated for new/changed test vectors
- Version-split scenarios created where behavior differs between versions
- QR pass completed — all WRONG items fixed
- Full regression run with `--target-version` — all failures classified
- Stale KCI/KL tags audited and removed where bugs are fixed in v{NEW}
- Testing dashboard updated with new test counts
- Code mapping validation passes (no orphans in `code_tags.json`, views consistent)
- Unified traceability regenerated and verified
- `generate-unified-traceability.py` VERSION constant updated
- Release notes created
- Docusaurus version freeze completed (`docs:version`)
- `CLAUDE.md` status updated
- Transition notes created for any incomplete work
- All validation scripts pass