Session 2.4 – High-Level Design Verification
Module 2: Verification & Validation | Duration: 1 hour
Learning Objectives
- Verify that architecture views (context, container, component) are consistent and traceable to requirements.
- Review interface contracts and data flows for completeness and error handling.
- Validate performance and capacity assumptions before implementation.
Architecture Consistency
A high-level design typically includes multiple views. During verification we confirm that each view is aligned and that cross-cutting concerns (security, observability, deployment) are addressed.
Context view
External actors, upstream/downstream dependencies, trust boundaries.
Container view
Services, databases, queues, and their protocols.
Component view
Internal module responsibilities, data stores vs caches, domain decomposition.
Cross-cutting policies
Authentication/authorization, logging, deployment topology.
Consistency checks
- Each requirement from Session 2.3 maps to at least one container or component.
- No orphan components; every module has an owner and deployment target.
- Non-functional constraints (latency, availability) appear in the relevant view.
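The consistency checks above can be automated with a small script. A minimal sketch, assuming an illustrative requirement-to-component mapping (the IDs and component names below are hypothetical, not from any real design):

```python
# Traceability check: every requirement must map to at least one component,
# and every component must trace back to at least one requirement.
requirements = {"REQ-1", "REQ-2", "REQ-3"}
trace = {
    "checkout-service": {"REQ-1", "REQ-2"},
    "payment-gateway": {"REQ-3"},
    "audit-logger": set(),  # orphan: no requirement maps here
}

covered = set().union(*trace.values())
unmapped_reqs = requirements - covered
orphan_components = [c for c, reqs in trace.items() if not reqs]

print("Unmapped requirements:", sorted(unmapped_reqs))   # []
print("Orphan components:", orphan_components)           # ['audit-logger']
```

In a real review the `trace` mapping would be exported from the traceability matrix rather than hand-written.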
Interfaces & Data
Interfaces are verified through contracts and data models.
Contracts
OpenAPI/GraphQL schemas, gRPC protos, message formats. Verify versioning strategy and backward compatibility.
Data lineage
Trace fields from source to downstream consumption. Note transformations, encryption, retention.
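A lineage record per field makes this traceable and checkable. A minimal sketch with hypothetical field names and services:

```python
# Field-level lineage: source, transformations, downstream consumers,
# and handling metadata (encryption, retention). Names are illustrative.
lineage = {
    "customer_email": {
        "source": "signup-service",
        "transforms": ["lowercase", "hash(SHA-256)"],
        "consumers": ["marketing-export", "audit-log"],
        "encrypted_at_rest": True,
        "retention_days": 365,
    },
    "page_view_ts": {
        "source": "web-tracker",
        "transforms": [],
        "consumers": ["analytics"],
        "encrypted_at_rest": False,
        "retention_days": 90,
    },
}

# Example verification rule: flag fields consumed downstream
# without encryption at rest.
findings = [
    f"{field} reaches {rec['consumers']} unencrypted"
    for field, rec in lineage.items()
    if rec["consumers"] and not rec["encrypted_at_rest"]
]
print(findings)
```

The rule shown is one example; a real review would apply the organization's data-classification policy to each field.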
Error handling
Enumerate failure codes, retry policy, idempotency guarantees.
Integration checklist
Timeout agreements, circuit breakers, observability hooks.
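The error-handling and integration policies above (timeouts, bounded retries, idempotency) can be sketched as a single client wrapper. This is a hedged illustration, not a production pattern; `send_request` is a placeholder for the real client call:

```python
import time
import uuid

def call_with_retry(send_request, payload, max_attempts=3, timeout_s=2.0):
    """Call send_request with a per-call timeout, bounded retries with
    exponential backoff, and a single idempotency key reused across
    retries so replayed requests are safe."""
    idempotency_key = str(uuid.uuid4())  # same key for every attempt
    for attempt in range(max_attempts):
        try:
            return send_request(payload, key=idempotency_key, timeout=timeout_s)
        except TimeoutError:
            if attempt == max_attempts - 1:
                raise  # retries exhausted; surface the failure
            time.sleep(0.1 * 2 ** attempt)  # backoff: 0.1 s, 0.2 s, ...
```

During a design review, verify that both sides agree on these numbers: the caller's total retry window must fit inside the upstream SLA, and the provider must actually honor the idempotency key.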
Performance & Capacity Assumptions
Before coding starts, we validate the math behind our scaling assumptions.
- Throughput model (requests/sec, users per region).
- Data growth (storage per day, retention policy, archiving plan).
- Latency budget per hop; include external dependencies.
- Failure scenarios (zone outage, dependency slowdown) and fallbacks.
Capacity review example: “The checkout service expects 3k RPS during a sale; each instance handles 500 RPS at 30 ms per request → 6 instances plus a 2-instance buffer (8 total). Document how the autoscaling policy enforces this.”
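The arithmetic in the example is worth writing down explicitly so reviewers can challenge each input. A minimal sketch reproducing the numbers above:

```python
import math

# Inputs from the capacity review example.
peak_rps = 3000          # expected peak during the sale
rps_per_instance = 500   # measured per-instance throughput
service_time_s = 0.030   # 30 ms processing per request
buffer_instances = 2     # fixed safety buffer

# Little's law sanity check: concurrent requests per instance.
concurrency = rps_per_instance * service_time_s  # 15 in-flight requests

base = math.ceil(peak_rps / rps_per_instance)    # 6 instances for load
target = base + buffer_instances                 # 8 instances total

print(f"~{concurrency:.0f} concurrent requests/instance; "
      f"provision {target} instances ({base} for load + {buffer_instances} buffer)")
```

Capturing the model as code (or a spreadsheet, as listed under artifacts below) makes it easy to re-run when any assumption changes.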
Scenario: Architecture Review for Event Streaming Platform
Design summary: an ingest service writes to Kafka, a stream processor enriches the events, and results are pushed to a partner-facing REST API.
Verification checklist outcome
- Context diagram missing regulatory data consumer → added compliance reporting queue.
- Interface review flagged inconsistent schema versioning; introduced contract test suite.
- Performance assumption (5 MB/s) was an underestimate; load-test models were revised to 12 MB/s with autoscaling on CPU and consumer-lag metrics.
Checklists & Artifacts
Architecture review checklist
- All external dependencies listed with SLA/SLO.
- Security zones labeled; data classification aligned.
- Deployment topology captures redundancy and failover.
Interface review artifacts
- OpenAPI spec + example payloads.
- Consumer/provider contract tests in CI.
- Error catalog and monitoring dashboards.
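A contract test can be as simple as validating example payloads against the agreed schema. A stdlib-only sketch, with hypothetical field names standing in for the real OpenAPI schema fragment:

```python
# Minimal payload check against a schema fragment. In practice this would
# validate against the OpenAPI spec itself (e.g. with a JSON Schema library);
# the fields here are illustrative.
REQUIRED = {"order_id": str, "amount": (int, float)}

def check_payload(payload: dict) -> list:
    """Return a list of findings; an empty list means the payload conforms."""
    findings = []
    for field, typ in REQUIRED.items():
        if field not in payload:
            findings.append(f"missing field: {field}")
        elif not isinstance(payload[field], typ):
            findings.append(f"wrong type for {field}")
    return findings

print(check_payload({"order_id": "A-1", "amount": 19.9}))  # []
print(check_payload({"order_id": "A-2"}))                  # ['missing field: amount']
```

Running checks like this in CI against both provider responses and consumer fixtures is what turns the spec and example payloads into living artifacts.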
Performance review evidence
- Capacity spreadsheet with assumptions.
- Architecture decision record (ADR) for scaling strategy.
- Risk log with mitigation plan.
Summary & Assignment
High-level design verification ensures the architecture can deliver on requirements before code is written. By checking consistency, interfaces, and performance assumptions now, we de-risk downstream validation.
Assignment: Take the architecture from your project and run a mini design review. Produce a checklist, log at least three findings (consistency, interface, performance), and update the traceability matrix linking requirements to design components.