Scanner Runtime Readiness Checklist

Last updated: 2025-10-19

This runbook confirms that Scanner.WebService now surfaces the metadata Runtime Guild consumers requested: quieted finding counts in the signed report events and progress hints on the scan event stream. Follow the checklist before relying on these fields in production automation.


1. Prerequisites

  • Scanner.WebService release includes SCANNER-POLICY-09-107 (adds quieted provenance and score inputs to /reports).
  • Docs repository checked out at a commit whose docs/events/scanner.report.ready@1.json declares quietedFindingCount (a quick check follows this list).
  • Access to a Scanner environment (staging or sandbox) with an image capable of producing policy verdicts.
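
A quick way to confirm the docs-repository prerequisite (a minimal sketch, run from the repository root; it only greps for the field name the schema is expected to declare):

    grep -n "quietedFindingCount" docs/events/scanner.report.ready@1.json \
      && echo "schema declares quietedFindingCount" \
      || echo "schema missing quietedFindingCount - update your docs checkout"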

2. Verify quieted finding hints

  1. Trigger a report – run a scan that produces at least one quieted finding (policy with quiet: true). After the scan completes, call:
    POST /api/v1/reports
    Authorization: Bearer <token>
    Content-Type: application/json
    
    Ensure the JSON response contains report.summary.quieted and that the DSSE payload mirrors the same count (a shell sketch covering this step and the next follows the list).
  2. Check emitted event – pull the latest scanner.report.ready event (from the queue or sample capture). Confirm the payload includes:
    • quietedFindingCount equal to the summary.quieted value.
    • Updated summary block with the quieted counter.
  3. Schema validation – optionally validate the payload against docs/events/scanner.report.ready@1.json to guarantee downstream compatibility:
    npx ajv validate -c ajv-formats \
      -s docs/events/scanner.report.ready@1.json \
      -d <payload.json>
    
    (Use npm install --no-save ajv ajv-cli ajv-formats once per clone.)
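
The following shell sketch ties steps 1 and 2 together. It is illustrative only: SCANNER_URL, TOKEN, request.json, event.json, and the dsse.payload field name are assumptions — substitute the values and response shape your deployment actually returns.

    # Step 1: trigger the report (request.json is whatever body your deployment expects).
    SCANNER_URL="https://scanner.example.internal"   # placeholder host
    TOKEN="$(cat token.txt)"                         # bearer token from your auth flow

    response="$(curl -sS -X POST "$SCANNER_URL/api/v1/reports" \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      --data @request.json)"

    # Quieted count as surfaced in the JSON response.
    summary_quieted="$(jq -r '.report.summary.quieted' <<<"$response")"
    echo "summary.quieted = $summary_quieted"

    # The DSSE envelope carries a base64-encoded payload; decode it and read the
    # mirrored counter. The '.dsse.payload' and decoded paths are assumptions —
    # check the envelope your response actually returns.
    jq -r '.dsse.payload' <<<"$response" | base64 -d | jq '.report.summary.quieted'

    # Step 2: compare the captured scanner.report.ready event (event.json) against
    # the summary — the two counters should be identical.
    jq '{quietedFindingCount, summaryQuieted: .summary.quieted}' event.json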

Snapshot fixtures: see docs/events/samples/scanner.event.report.ready@1.sample.json for a canonical orchestrator event that already carries quietedFindingCount.
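
To see the expected shape without a live environment, you can inspect the sample fixture directly (the jq path assumes the counter sits at the payload root; adjust it if your capture wraps the payload in an envelope):

    jq '{quietedFindingCount, summary}' \
      docs/events/samples/scanner.event.report.ready@1.sample.json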


3. Verify progress hints (SSE / JSONL)

Scanner streams structured progress messages for each scan. The data map inside every frame carries the hints Runtime systems consume (force flag, client metadata, additional stage-specific attributes).

  1. Submit a scan with custom metadata (for example pipeline=github, build=1234).
  2. Stream events:
    GET /api/v1/scans/{scanId}/events?format=jsonl
    Authorization: Bearer <token>
    Accept: application/x-ndjson
    
  3. Confirm the payload – each frame should resemble:
    {
      "scanId": "2f6c17f9b3f548e2a28b9c412f4d63f8",
      "sequence": 1,
      "state": "Pending",
      "message": "queued",
      "timestamp": "2025-10-19T03:12:45.118Z",
      "correlationId": "2f6c17f9b3f548e2a28b9c412f4d63f8:0001",
      "data": {
        "force": false,
        "meta.pipeline": "github"
      }
    }
    
    Subsequent frames add hints as analyzers progress (for example stage, meta.*, or analyzer-provided keys). Ensure newline-delimited JSON consumers preserve the data dictionary when forwarding frames to runtime dashboards; a streaming sketch follows this list.
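
A minimal streaming sketch under the same assumptions as above (SCANNER_URL, TOKEN, and SCAN_ID are placeholders); it asserts that every frame keeps the fields shown in the example and prints the data hints:

    SCANNER_URL="https://scanner.example.internal"   # placeholder host
    TOKEN="$(cat token.txt)"
    SCAN_ID="2f6c17f9b3f548e2a28b9c412f4d63f8"        # scan submitted in step 1

    curl -sS -N "$SCANNER_URL/api/v1/scans/$SCAN_ID/events?format=jsonl" \
      -H "Authorization: Bearer $TOKEN" \
      -H "Accept: application/x-ndjson" |
    while IFS= read -r frame; do
      # Fail loudly if a frame drops the required fields.
      echo "$frame" | jq -e 'has("scanId") and has("sequence") and has("state") and has("data")' >/dev/null \
        || { echo "frame missing required fields: $frame" >&2; continue; }
      # Surface the stage-specific hints Runtime dashboards rely on.
      echo "$frame" | jq -c '{sequence, state, data}'
    done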

The same frame structure is documented in docs/09_API_CLI_REFERENCE.md §2.6. Copy that snippet into integration tests to guard compatibility with downstream consumers.


4. Sign-off matrix

| Stakeholder   | Checklist                                      | Status       | Notes                                                  |
| ------------- | ---------------------------------------------- | ------------ | ------------------------------------------------------ |
| Runtime Guild | Sections 2 & 3 completed                       | ☐            | Capture sample payloads for webhook regression tests.  |
| Notify Guild  | quietedFindingCount consumed in notifications  | ☐            | Update templates after Runtime sign-off.               |
| Docs Guild    | Checklist published & linked from updates      | ☑ 2025-10-19 |                                                        |

Tick each stakeholder's Status box as that team completes its validation. Once all checks are green, update docs/TASKS.md to reflect task completion.