agile / agile-maturity · v1.0

Agile Maturity Model

Team self-assessment frameworks from ad-hoc to high-performing — how to measure where you are, and what to work on next.

5 maturity levels · 8 assessment dimensions · 1 team per session · no improvement ceiling

What Maturity Models Do

An agile maturity model is a tool for structured self-reflection — not a grading system. It helps teams understand where they are in their agile journey, identify the most impactful improvements, and create a shared language for discussing team health.

The model covers process, engineering practices, culture, and delivery quality. A team that scores well on all dimensions has usually earned it through genuine improvement — not through coaching for the assessment.

Maturity models are for the team, not for management. The moment results are reported upward as KPIs, teams will optimise for the score, not for genuine improvement. Keep results internal.

The 5 Maturity Levels

Level 1
Ad-hoc / Chaotic
No consistent process; heroics dominate
Sprints exist in name only. No Definition of Done. Retrospectives skipped. Releases are high-risk, manual events. The team firefights constantly. Estimates are meaningless. Leadership pressure drives all decisions.
Level 2
Repeatable
Basic ceremonies in place; results variable
Sprint cadence established. PO, SM, and team roles filled. Sprint Planning and Retrospectives happen consistently. Definition of Done exists but has gaps. Velocity is tracked but not used wisely. Releases still stressful but scheduled. Dependencies managed ad hoc.
Level 3
Defined
Consistent practices; improving predictability
Strong DoD enforced. Backlog refinement happens weekly. Sprint Goals are meaningful and met 70%+ of the time. CI/CD pipeline exists; deploys are relatively safe. Code reviews are standard. Team retrospectives produce committed improvements. Cycle time tracked. Escaped defect rate declining.
Level 4
Managed
Data-driven decisions; high predictability
DORA metrics measured and improving. Lead time <1 week. Change failure rate <10%. Flow metrics (cycle time, throughput) used for forecasting. Team self-organises on all technical decisions. PO and team deeply aligned. Stakeholder trust is high. Continuous deployment or very frequent releases. Blameless post-mortems standard.
Level 5
Optimising
Continuous innovation; industry-leading delivery
Elite DORA performance. Multiple production deploys per day. Team drives its own improvement agenda. Engineers contribute to organisational-level improvements. Psychological safety measured and managed. Tech debt proactively managed. Team contributes to hiring and culture. Innovative practices regularly piloted and shared externally.
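The Level 4 and 5 descriptions above include concrete DORA thresholds (lead time under one week, change failure rate under 10%, multiple deploys per day). A minimal sketch of checking them mechanically — the dataclass, field names, and the "2 deploys/day" cutoff for Level 5 are illustrative assumptions, not part of any official DORA tooling:

```python
from dataclasses import dataclass

@dataclass
class DoraMetrics:
    lead_time_days: float        # commit-to-production lead time
    deploys_per_day: float       # deployment frequency
    change_failure_rate: float   # fraction of deploys causing incidents
    mttr_hours: float            # mean time to restore service

def meets_level_4(m: DoraMetrics) -> bool:
    # Thresholds from the Level 4 description:
    # lead time < 1 week, change failure rate < 10%.
    return m.lead_time_days < 7 and m.change_failure_rate < 0.10

def meets_level_5(m: DoraMetrics) -> bool:
    # Level 5 adds "multiple production deploys per day";
    # 2/day is an assumed cutoff for illustration.
    return meets_level_4(m) and m.deploys_per_day >= 2

team = DoraMetrics(lead_time_days=3, deploys_per_day=0.5,
                   change_failure_rate=0.08, mttr_hours=4)
print(meets_level_4(team))  # True
print(meets_level_5(team))  # False
```

A check like this is only useful once the underlying metrics are collected honestly; the point is that Level 4+ claims are falsifiable against data, not self-reported.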

8 Assessment Dimensions

Dimension | What it measures | Key indicators
Process & Ceremonies | Are Agile events run effectively? | Sprint Goals met, retros producing actions, DoD enforced
Product & Backlog | Is the backlog healthy and prioritised? | Refined PBIs, clear ACs, PO accessible and empowered
Delivery & Predictability | Does the team deliver reliably? | Sprint goal % met, velocity stability, release frequency
Engineering Practices | Is the codebase maintainable and safe? | Test coverage, CI/CD, code review quality, tech debt level
DevOps & Deployment | How fast and safe are deployments? | DORA metrics, deployment frequency, MTTR
Collaboration & Team Health | Does the team work well together? | Psychological safety, knowledge sharing, cross-skilling
Stakeholder Alignment | Is there trust between team and business? | Sprint Review feedback quality, stakeholder NPS, PO relationship
Continuous Improvement | Does the team get better every sprint? | Retro action completion rate, learning culture, experimentation

Running a Self-Assessment

Preparation (before the session)
→ Share the dimensions and levels with the team 1 week ahead
→ Ask everyone to individually score each dimension 1–5
→ 45–90 min session; whole team including PO and SM

Session format
0–10 min   → context: "This is for us, not for management"
10–30 min  → individual scoring (silent; sticky notes or digital tool)
30–60 min  → reveal scores per dimension; discuss gaps >1 point
60–80 min  → agree on top 3 dimensions to improve
80–90 min  → assign owners and timebox next actions
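The "discuss gaps >1 point" step in the session format above can be done on paper, but for a distributed team it is easy to automate. A small sketch with illustrative scores (dimension names from the table above; the per-member votes are made up):

```python
from statistics import median

# Individual 1-5 scores per dimension, one entry per team member.
scores = {
    "Process & Ceremonies":   [3, 3, 4, 3, 3],
    "Engineering Practices":  [2, 4, 3, 2, 4],
    "Continuous Improvement": [3, 3, 3, 3, 2],
}

for dimension, votes in scores.items():
    gap = max(votes) - min(votes)
    status = "discuss" if gap > 1 else "aligned"
    print(f"{dimension}: median {median(votes)}, gap {gap} -> {status}")
```

Here "Engineering Practices" (gap of 2) would be flagged for discussion — a wide spread usually means team members are seeing different realities, which is exactly the conversation the session is for.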

Scoring per dimension
1 = Level 1 behaviours dominate
2 = Mostly Level 2; some Level 3 behaviours starting
3 = Solidly Level 3; occasional Level 4 moments
4 = Consistently Level 4
5 = Level 5; high-performing across this dimension
Run the assessment every quarter. Compare scores over time — trend matters more than absolute score. A team improving from 2→3 across all dimensions is outperforming a static team stuck at 4.
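Since the trend matters more than the absolute score, it helps to keep the quarterly medians in one place and classify each dimension's direction. A minimal sketch, assuming a simple dict of median scores per quarter (the data is illustrative):

```python
# Median team score per dimension, by quarter (illustrative data).
history = {
    "Q1": {"Process & Ceremonies": 2, "Engineering Practices": 2},
    "Q2": {"Process & Ceremonies": 3, "Engineering Practices": 2},
    "Q3": {"Process & Ceremonies": 3, "Engineering Practices": 3},
}

def trend(history: dict, dim: str) -> str:
    """Compare the first and latest quarterly score for one dimension."""
    quarters = list(history)
    first = history[quarters[0]][dim]
    last = history[quarters[-1]][dim]
    if last > first:
        return "improving"
    return "flat" if last == first else "declining"

for dim in history["Q1"]:
    print(f"{dim}: {trend(history, dim)}")
```

A "flat" result across two or three quarters is the signal to revisit the improvement actions, not to re-run the assessment more often.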

Improvement Roadmap

From Level 1 → 2 (first 30 days)
→ Establish consistent sprint cadence
→ Fill all three Scrum roles
→ Create a basic Definition of Done (5–6 items)
→ Hold Sprint Planning and Retrospectives every sprint

From Level 2 → 3 (days 30–90)
→ Make Sprint Goals specific and measurable
→ Enforce the DoD — return undone work to backlog
→ Start weekly backlog refinement sessions
→ Set up a CI/CD pipeline and automate a basic test suite
→ Track and action every Retrospective item

From Level 3 → 4 (months 3–9)
→ Implement DORA metric tracking
→ Move to continuous deployment or high-frequency releases
→ Use flow metrics (cycle time, throughput) for forecasting
→ Introduce blameless post-mortems for all P1/P2 incidents
→ Allocate 15–20% capacity for tech debt

From Level 4 → 5 (months 9–18+)
→ Multiple production deploys per day
→ Team drives its own improvement agenda; external coaching optional
→ Measure and actively manage psychological safety
→ Contribute innovations to the broader organisation

Anti-Patterns

Anti-Pattern | Problem | Fix
Maturity as management KPI | Teams inflate scores; assessment loses value | Results stay with the team; management sees only improvement actions
Assessing quarterly without acting | Same scores every quarter; no improvement | Assessment must produce 3 actions with owners and timebox
Targeting level 5 immediately | Team skips foundational practices; collapses | Progress level by level; don't skip foundations
External assessor only | Team doesn't own the results; feels evaluated | Team self-assesses; external coach facilitates if needed
Process maturity without outcomes | Level 3 process, Level 1 delivery results | Outcomes (DORA, customer satisfaction) must improve with maturity

Maturity Cheat Sheet

5 Levels
1 — Ad-hoc: no consistent process; heroics
2 — Repeatable: ceremonies in place; variable results
3 — Defined: consistent practices; improving predictability
4 — Managed: data-driven; high predictability; strong DORA
5 — Optimising: elite delivery; team drives own improvement

8 Dimensions
Process & Ceremonies · Product & Backlog · Delivery & Predictability
Engineering Practices · DevOps & Deployment · Collaboration & Team Health
Stakeholder Alignment · Continuous Improvement

Assessment rules
→ Self-assessment by the team; not a management report
→ Every dimension scored 1–5 individually, then discussed
→ Top 3 gaps → improvement actions with owners
→ Run quarterly; track trends, not just scores

Quick level checks
Level 2+: Are ceremonies happening consistently?
Level 3+: Is the DoD enforced? Are Sprint Goals met 70%+?
Level 4+: Are DORA metrics measured and improving?
Level 5+: Is the team driving innovation beyond its own backlog?
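The quick level checks above form a ladder: each level presumes the ones below it. A small sketch of that logic — the function name and check keys are made up for illustration:

```python
def estimated_level(checks: dict) -> int:
    """Return the highest level whose check, and every check below it, passes.

    Keys mirror the quick level checks above; every team is at
    least Level 1 by default.
    """
    ladder = [
        "ceremonies_consistent",   # Level 2+: ceremonies happening consistently?
        "dod_enforced_goals_met",  # Level 3+: DoD enforced, Sprint Goals met 70%+?
        "dora_improving",          # Level 4+: DORA metrics measured and improving?
        "drives_innovation",       # Level 5+: innovation beyond the team's own backlog?
    ]
    level = 1
    for check in ladder:
        if not checks.get(check):
            break  # a failed rung caps the level, regardless of rungs above
        level += 1
    return level

print(estimated_level({"ceremonies_consistent": True,
                       "dod_enforced_goals_met": True,
                       "dora_improving": False}))  # 3
```

The `break` encodes the "don't skip foundations" anti-pattern fix: strong DORA numbers cannot lift a team past a missing Definition of Done.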