After 13 years in quality engineering, I have seen QA teams at every stage of maturity — from single manual testers doing regression on spreadsheets to fully automated, AI-assisted quality pipelines that deploy 20 times a day with zero human sign-off. The gap between these teams is not budget or headcount. It is strategy, culture, and a clear understanding of where to invest next.

This is the maturity model I use to assess QA teams and plan their evolution.

Level 1 — Reactive Quality (Ad-hoc)

What it looks like: Testing happens after development, often under time pressure. There is no defined test process. Tests are manual, undocumented, and inconsistent. Bugs are found — and often re-found — because there is no regression safety net. Deployments are scary.

Key indicators: No test documentation, all testing manual, same bugs recurring, no metrics, testing is a bottleneck before every release.

Priority investment: Document critical test cases. Start tracking bug counts and sources. Introduce a basic regression checklist.
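Even at this stage, "tracking bug counts and sources" can be a twenty-line script rather than a tool purchase. Here is a minimal sketch of the idea: tally bugs by component and root cause, and surface the ones that keep recurring. The `BugReport` shape and the field names are invented for illustration, not any tracker's schema.

```typescript
// Minimal sketch of bug tracking by source: count defects per component
// and flag the bug classes that keep recurring. The BugReport shape and
// field names here are illustrative, not a standard.

interface BugReport {
  id: string;
  component: string;   // where the bug was found
  rootCause: string;   // e.g. "missing-validation", "race-condition"
}

function recurringCauses(bugs: BugReport[], threshold = 2): Map<string, number> {
  const counts = new Map<string, number>();
  for (const bug of bugs) {
    const key = `${bug.component}:${bug.rootCause}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  // Keep only the causes that recur: prime candidates for the regression checklist.
  return new Map([...counts].filter(([, n]) => n >= threshold));
}
```

The recurring entries this produces are exactly the items that belong on the basic regression checklist.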

Level 2 — Defined Quality (Structured)

What it looks like: Testing is planned and documented. Test cases exist in a test management tool. The team runs structured regression cycles. Some basic automation exists, but it is fragile and not integrated into the pipeline. Quality is still mostly a post-development activity.

Key indicators: Test management tool in use, manual regression documented, some automation exists, basic metrics tracked.

Priority investment: Establish a stable automation framework (e.g. Playwright + TypeScript). Integrate automation into CI. Define coverage targets.
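What "a stable automation framework" means in practice is mostly structure, and the page-object pattern is the piece most Playwright + TypeScript teams converge on. Below is a hedged sketch of that pattern; the `LoginPage` class, its selectors, and the route are invented for illustration, and the `Page` interface is a minimal stand-in for Playwright's so the sketch stays self-contained.

```typescript
// Page-object pattern sketch. The Page interface is a minimal stand-in
// for Playwright's page object; the LoginPage class, its selectors, and
// the /login route are hypothetical.

interface Page {
  goto(url: string): Promise<void>;
  fill(selector: string, value: string): Promise<void>;
  click(selector: string): Promise<void>;
}

class LoginPage {
  // Selectors live in one place, so a UI change means a one-line fix
  // instead of a hunt through every test that touches the login form.
  private readonly email = '[data-testid="email"]';
  private readonly password = '[data-testid="password"]';
  private readonly submit = '[data-testid="submit"]';

  constructor(private page: Page) {}

  async login(email: string, password: string): Promise<void> {
    await this.page.goto("/login");
    await this.page.fill(this.email, email);
    await this.page.fill(this.password, password);
    await this.page.click(this.submit);
  }
}
```

This is the property that separates "some automation exists" from a stable framework: when selectors and flows are centralised like this, the suite survives UI churn instead of fragmenting under it.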

Level 3 — Managed Quality (Automated)

What it looks like: A solid automation framework exists and is integrated into CI/CD. Unit, integration, and E2E tests run automatically on every commit. Coverage is tracked and maintained. Quality gates exist in the pipeline. Deployments are gated on test results. The team spends more time on strategy than execution.

Key indicators: Automation in CI, quality gates enforced, coverage metrics tracked, deployment frequency increasing, defect escape rate declining.
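To make "quality gates enforced" concrete, here is a hedged sketch of the simplest gate: a check the pipeline runs after tests, failing the build when any tracked coverage metric drops below its target. The metric names and thresholds are placeholders; a real pipeline would read the measured values from the coverage tool's summary output.

```typescript
// Quality-gate sketch: compare measured coverage against per-metric targets
// and report which ones fail. Metric names and thresholds are illustrative.

type Coverage = Record<string, number>; // metric -> percent covered

function coverageGate(measured: Coverage, targets: Coverage): string[] {
  const failures: string[] = [];
  for (const [metric, target] of Object.entries(targets)) {
    const actual = measured[metric] ?? 0; // missing metric counts as 0%
    if (actual < target) {
      failures.push(`${metric}: ${actual}% is below target ${target}%`);
    }
  }
  return failures; // empty array means the gate passes
}
```

In CI, a small wrapper would call this and exit non-zero when the failures list is non-empty, which is what actually gates the deployment on test results.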

Priority investment: Shift left — introduce Three Amigos, testability standards, and requirement quality reviews.

Level 4 — Proactive Quality (Engineered)

What it looks like: Quality is embedded in every phase of the lifecycle. QA engineers participate in product inception, design reviews, and architecture discussions. Testability is a design requirement. The pipeline runs in under 15 minutes with full coverage. Performance and security testing are automated. Production monitoring provides continuous quality feedback.

Key indicators: QA at inception, testability in design standards, pipeline under 15 minutes, production monitoring active, quality metrics visible to leadership.

Priority investment: AI tooling adoption, shift-right monitoring, and knowledge sharing across the engineering organisation.

Level 5 — Optimised Quality (Autonomous)

What it looks like: Quality is a cultural value, not a process. AI tools generate and maintain tests. Self-healing selectors reduce maintenance overhead. Production monitoring feeds automatically into test coverage. Deployment frequency is high, defect escape rate is near zero, and the team's quality investment continuously compounds. QA engineers focus almost entirely on strategy, tooling evolution, and quality architecture.

Key indicators: AI-assisted testing in use, self-healing automation, production quality feedback loops active, team focuses on strategy not execution.
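"Self-healing selectors" sounds magical, but the core mechanism is simple enough to sketch: try the primary selector, fall back through known alternates, and record which one actually matched so the suite can be updated. This is a hedged illustration of the idea, not any specific vendor's implementation; the `query` function is a stand-in for a real DOM or driver lookup.

```typescript
// Self-healing locator sketch: resolve an element by trying the primary
// selector first, then alternates, recording the selector that worked.
// `query` stands in for a real DOM/driver lookup; in practice the healing
// result would feed back into test maintenance.

type Query = (selector: string) => unknown;

interface HealingResult {
  element: unknown;
  usedSelector: string;
  healed: boolean; // true when a fallback, not the primary, matched
}

function resolveSelector(
  query: Query,
  primary: string,
  alternates: string[],
): HealingResult | null {
  for (const selector of [primary, ...alternates]) {
    const element = query(selector);
    if (element != null) {
      return { element, usedSelector: selector, healed: selector !== primary };
    }
  }
  return null; // nothing matched: a genuine failure, not a healable one
}
```

The maintenance saving comes from the `healed` flag: instead of a red build every time an attribute changes, the suite keeps running and emits a precise "update this selector" signal.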

Honest assessment: Most engineering teams I encounter are at Level 2 or Level 3. Reaching Level 4 is a meaningful achievement that requires sustained investment over 12–18 months. Level 5 is aspirational for most — but knowing it exists helps you plan the journey.

How to Use This Model

Use this model as a conversation starter with your engineering leadership, not as a performance scorecard. The goal is not to judge where your team is — it is to create a shared understanding of where you are, where you want to go, and what investment that journey requires.

Run a simple assessment with your team: for each level, list three specific evidence points that would confirm you are operating at that level. Then honestly evaluate which level matches your current reality. The gap between your current level and your next level is your quality roadmap.
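The assessment above fits in a few lines of code: list evidence points per level, mark which ones you can actually demonstrate, and take the highest level for which every point holds. A sketch, with invented evidence strings you would replace with your own:

```typescript
// Maturity assessment sketch: a team operates at the highest contiguous
// level for which all evidence points are confirmed. Evidence text is
// illustrative; substitute your team's own three points per level.

interface LevelEvidence {
  level: number;
  evidence: { point: string; confirmed: boolean }[];
}

function assessMaturity(levels: LevelEvidence[]): number {
  let current = 0;
  for (const l of [...levels].sort((a, b) => a.level - b.level)) {
    if (l.evidence.every((e) => e.confirmed)) {
      current = l.level;
    } else {
      break; // the first level with gaps is where your roadmap starts
    }
  }
  return current;
}
```

The unconfirmed points at the first failing level are, almost by definition, the first items on that quality roadmap.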

Final thought: The teams that reach Level 4 and 5 are not the ones with the biggest QA budgets. They are the ones where engineering leadership genuinely believes quality is a strategic advantage — and invests accordingly.
