Foundation
What the DoD Does
The Definition of Done is a formal, shared, non-negotiable list of quality criteria that every Product Backlog Item must meet before it is considered complete. It is the Scrum Team's commitment to quality — applied to every single item, every single sprint.
The DoD creates transparency: it makes the concept of "Done" unambiguous. Without it, different people have different standards — and teams accumulate hidden undone work that eventually surfaces as bugs, incidents, or missed features.
The DoD is non-negotiable. An item that doesn't meet the DoD cannot be presented at the Sprint Review and must return to the Product Backlog. No exceptions, no "good enough for now."
Foundation
DoD vs Acceptance Criteria
| Dimension | Definition of Done | Acceptance Criteria |
|---|---|---|
| Scope | Applies to ALL items in every sprint | Specific to one story |
| Who sets it | The Scrum Team (created once; evolves) | Product Owner + team (per story) |
| Content | Engineering quality standards | Business / functional requirements |
| Example | "Unit tests written and passing" | "User sees error if email is invalid" |
| Changes? | Evolves slowly (DoD gets stronger over time) | Changes per story |
An item must meet both the DoD AND its own Acceptance Criteria to be declared Done.
Examples
Product Team DoD (web application)
✓ Code reviewed by at least one other developer
✓ Unit tests written and passing (coverage ≥ 80% on new code)
✓ Integration tests written and passing
✓ No new linting / static analysis violations
✓ Accessibility: WCAG 2.1 AA for any UI changes
✓ No known critical or high-severity bugs introduced
✓ Feature flag configured if applicable
✓ Deployed to staging environment
✓ Working software demonstrated to PO; AC verified
✓ Monitoring / alerting updated if new endpoints added
✓ Documentation updated (API docs, README, runbook as applicable)
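Some checklist items above lend themselves to automation in CI. As a minimal sketch, the coverage gate could look like the following; the report shape (`{filename: (covered_lines, total_lines)}` for files touched in the change) is an invented example, not any real tool's output format.

```python
# Hypothetical CI gate for the "coverage >= 80% on new code" DoD item.
# The report format here is illustrative: {filename: (covered_lines, total_lines)}.

def new_code_coverage_ok(report: dict[str, tuple[int, int]],
                         threshold: float = 80.0) -> bool:
    """Return True when aggregate coverage of changed files meets the threshold."""
    covered = sum(c for c, _ in report.values())
    total = sum(t for _, t in report.values())
    if total == 0:  # no new executable lines -> nothing to gate
        return True
    return 100.0 * covered / total >= threshold

# Example: 45 of 50 new lines covered -> 90%, gate passes.
report = {"app/views.py": (30, 32), "app/forms.py": (15, 18)}
print(new_code_coverage_ok(report))  # True
```

In a real pipeline this check would fail the build, making the DoD item impossible to skip "just this once".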
Examples
Mobile Team DoD (iOS / Android)
✓ Code reviewed and approved by at least one engineer
✓ Unit tests written and passing
✓ UI snapshot tests updated if layout changed
✓ Tested on minimum supported OS version (iOS 16+ / Android 10+)
✓ Tested on at least 2 physical device form factors
✓ No new memory leaks (profiler run)
✓ Accessibility: VoiceOver / TalkBack compatible for new UI
✓ Strings externalised (no hard-coded copy in code)
✓ Crash-free rate maintained (no regression in staging)
✓ Submitted to TestFlight / Internal Track; PO verified
Examples
Data / ML Team DoD
✓ Data pipeline tested with real + synthetic edge-case data
✓ Data quality checks implemented (null rates, schema validation)
✓ Model accuracy meets agreed threshold on holdout set
✓ Bias / fairness evaluation completed and documented
✓ Feature logic reviewed by another engineer + domain expert
✓ Data lineage documented
✓ Monitoring / data quality alerts configured
✓ Model card or feature documentation updated
✓ Rollback plan defined for model updates
✓ Output demonstrated to PO; meets acceptance criteria
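The "null rates, schema validation" item can be made concrete with a couple of small checks. This is a sketch only; the schema, field names, and rows below are invented examples.

```python
# Illustrative data-quality checks for the DoD item above.
# Rows are plain dicts; EXPECTED_SCHEMA is a made-up example schema.

EXPECTED_SCHEMA = {"user_id": int, "email": str, "age": int}

def schema_violations(rows, schema=EXPECTED_SCHEMA):
    """Rows with missing/extra fields, or non-null values of the wrong type."""
    return [r for r in rows
            if set(r) != set(schema)
            or any(r[k] is not None and not isinstance(r[k], t)
                   for k, t in schema.items())]

def null_rate(rows, column):
    """Fraction of rows where `column` is None (or absent)."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

rows = [
    {"user_id": 1, "email": "a@x.io", "age": 31},
    {"user_id": 2, "email": "b@x.io", "age": None},   # null value, schema-valid
    {"user_id": 3, "email": "c@x.io"},                # missing field -> violation
]
print(len(schema_violations(rows)))  # 1
```

A pipeline would run checks like these on every load and raise an alert when a threshold (e.g. a maximum null rate) is breached.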
Examples
Platform / Infrastructure Team DoD
✓ Infrastructure changes in code (Terraform / Pulumi / Helm)
✓ Changes tested in non-production environment first
✓ Security review completed for any new IAM or network changes
✓ Runbook created or updated
✓ Monitoring dashboards and alerts updated
✓ Cost impact estimated and within budget guardrails
✓ Change communicated to affected teams
✓ Rollback procedure tested or documented
✓ On-call documentation updated
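The budget-guardrail item is easy to automate once cost deltas are estimated per resource. A minimal sketch, with an assumed team guardrail and invented cost figures:

```python
# Hypothetical check for the "cost impact within budget guardrails" DoD item.
# The guardrail and per-resource cost deltas are illustrative, not real pricing data.

MONTHLY_BUDGET_GUARDRAIL = 500.0   # assumed team limit, USD/month

def cost_within_guardrail(resource_deltas: dict[str, float],
                          guardrail: float = MONTHLY_BUDGET_GUARDRAIL) -> bool:
    """True when the summed estimated monthly cost change stays under the guardrail."""
    return sum(resource_deltas.values()) <= guardrail

change = {"new RDS replica": 310.0, "extra NAT gateway": 95.0, "removed idle VM": -60.0}
print(cost_within_guardrail(change))  # 345.0 <= 500.0 -> True
```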
Structure
Three Levels of Done
In scaled environments, Done has multiple layers. Each layer builds on the one below.
| Level | Scope | What it means |
|---|---|---|
| Story Done | One PBI | Story meets DoD + its own ACs. Reviewed and accepted by PO. |
| Sprint Done | Sprint Increment | All stories meet DoD. Increment is integrated, tested, and potentially shippable. |
| Release Done | Product release | All release criteria met: load testing, security scan, release notes, stakeholder sign-off. |
In SAFe / LeSS: add a Program Increment Done level — all ARTs/teams integrated, System Demo completed, release train objectives achieved.
Structure
Building Your DoD
Step 1: Start with a team workshop (1–2 hours)
→ Ask: "What does it mean for work to truly be complete?"
→ Each person writes items on sticky notes
→ Group and deduplicate
→ Vote on must-haves vs nice-to-haves

Step 2: Write it up and post it visibly
→ Physical board: laminated card next to the Kanban board
→ Digital: pinned in the team's project tool and Slack channel
→ Every developer must know it without looking it up

Step 3: Apply it immediately
→ Next sprint: every item is measured against it
→ First week will be hard — that's the point

Step 4: Strengthen it at each retro
→ Question: "Did the DoD prevent any issues this sprint?"
→ Question: "What quality problems emerged that the DoD would have caught?"
→ Add items when gaps are found; never remove without team agreement
Structure
DoD at Scale
Multiple teams on one product (LeSS / SAFe):
→ One shared DoD for the whole product (minimum standard)
→ Teams can ADD to the shared DoD; cannot remove items
→ "Sprint Done" must include cross-team integration
→ System Demo (SAFe) / Sprint Review Bazaar (LeSS) validates combined increment

Component teams vs feature teams:
→ Component teams need additional DoD items for interfaces and contracts
→ Feature teams use the standard product DoD

DoD maturity progression:
Month 1 → Basic: code review + unit tests + PO accepted
Month 3 → Add: integration tests + staging deploy + no critical bugs
Month 6 → Add: performance baseline + security scan + monitoring
Year 1+ → Add: chaos engineering, load testing, accessibility audits
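The add-only rule above can be sketched with set operations: a team's effective DoD is the shared baseline plus its own additions, and a candidate DoD is valid only if it still contains every shared item. Item names here are invented examples.

```python
# Sketch of "teams can add to the shared DoD, never remove from it",
# modelled with sets; the DoD item names are illustrative.

SHARED_DOD = frozenset({
    "code reviewed",
    "unit tests passing",
    "deployed to staging",
    "PO accepted",
})

def team_dod(additions: set[str]) -> frozenset[str]:
    """A team's effective DoD is the shared baseline plus its own additions."""
    return SHARED_DOD | additions

def is_valid_team_dod(candidate: frozenset[str]) -> bool:
    """Valid only if every shared item is still present (nothing removed)."""
    return SHARED_DOD <= candidate

mobile = team_dod({"tested on 2 device form factors"})
print(is_valid_team_dod(mobile))                        # True
print(is_valid_team_dod(mobile - {"code reviewed"}))    # False: removal not allowed
```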
Reference
Anti-Patterns
| Anti-Pattern | Problem | Fix |
|---|---|---|
| DoD bypassed under pressure | "Just this once" destroys the standard; becomes the norm | DoD is non-negotiable. Reduce scope instead. |
| DoD not visible | Team forgets items; inconsistent application | Post physically + digitally. Reference in every sprint review. |
| DoD as manager's checklist | Developers feel policed, not empowered | Team owns the DoD. It is their standard, not management's. |
| Acceptance criteria confused with DoD | Story-specific ACs treated as universal quality standard | DoD = universal; ACs = story-specific. Both required. |
| DoD never strengthened | Team stops improving; debt accumulates in gaps | Review and strengthen the DoD at every retrospective. |
| DoD too aspirational at start | Team fails every item; morale drops; DoD gets ignored | Start with 4–5 achievable items; add progressively. |
| No shared DoD at scale | Teams have different standards; integration reveals gaps | One shared DoD for the product; teams can only add, not remove. |
Reference
DoD Cheat Sheet
What the DoD is
→ Formal, shared quality criteria applied to EVERY item, EVERY sprint
→ Non-negotiable — no exceptions, no "good enough"
→ Owned by the team; strengthened at retrospectives

DoD vs Acceptance Criteria
DoD → applies to all items; engineering quality standard
ACs → story-specific; functional requirements

Minimum viable DoD (start here)
✓ Code reviewed by at least one other developer
✓ Unit tests written and passing
✓ Deployed to staging / test environment
✓ PO verified against acceptance criteria
✓ No known critical bugs introduced

Levels of Done
Story Done → DoD + ACs met; PO accepted
Sprint Done → All stories done; Increment integrated + shippable
Release Done → Load test, security, release notes, stakeholder sign-off

Golden rule
When in doubt: reduce scope, not quality. Never ship undone work. Never skip the DoD.