agile / agile-dod · v1.0

Definition of Done

Building a strong DoD — what it must include, how it differs by team type, common anti-patterns, and ready-to-use examples by context.

3 DoD levels · 4 team type examples · 0 exceptions (ever) · 7 anti-patterns

What the DoD Does

The Definition of Done is a formal, shared, non-negotiable list of quality criteria that every Product Backlog Item must meet before it is considered complete. It is the Scrum Team's commitment to quality — applied to every single item, every single sprint.

The DoD creates transparency: it makes the concept of "Done" unambiguous. Without it, different people have different standards — and teams accumulate hidden undone work that eventually surfaces as bugs, incidents, or missed features.

The DoD is non-negotiable. An item that doesn't meet the DoD cannot be presented at the Sprint Review and must return to the Product Backlog. No exceptions, no "good enough for now."

DoD vs Acceptance Criteria

| Dimension | Definition of Done | Acceptance Criteria |
|---|---|---|
| Scope | Applies to ALL items in every sprint | Specific to one story |
| Who sets it | The Scrum Team (created once; evolves) | Product Owner + team (per story) |
| Content | Engineering quality standards | Business / functional requirements |
| Example | "Unit tests written and passing" | "User sees error if email is invalid" |
| Changes? | Evolves slowly (DoD gets stronger over time) | Changes per story |
An item must meet both the DoD AND its own Acceptance Criteria to be declared Done.

Product Team DoD (web application)

✓ Code reviewed by at least one other developer
✓ Unit tests written and passing (coverage ≥ 80% on new code)
✓ Integration tests written and passing
✓ No new linting / static analysis violations
✓ Accessibility: WCAG 2.1 AA for any UI changes
✓ No known critical or high-severity bugs introduced
✓ Feature flag configured if applicable
✓ Deployed to staging environment
✓ Working software demonstrated to the PO; ACs verified
✓ Monitoring / alerting updated if new endpoints added
✓ Documentation updated (API docs, README, runbook as applicable)
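Several of these items can be enforced mechanically rather than by memory. A minimal sketch of one approach, assuming the team mirrors its DoD as a Markdown task list in a pull-request template (the item names and template format below are illustrative, not tied to any specific tool):

```python
import re

# Hypothetical DoD items mirrored in the team's PR template (assumed names).
DOD_ITEMS = [
    "Code reviewed",
    "Unit tests passing (coverage >= 80% on new code)",
    "Integration tests passing",
    "Deployed to staging",
]

def unmet_dod_items(pr_body: str) -> list[str]:
    """Return DoD items whose checkbox is not ticked in the PR description."""
    # Matches Markdown task-list lines like "- [x] Code reviewed".
    checked = {
        m.group(1).strip()
        for m in re.finditer(r"-\s*\[[xX]\]\s*(.+)", pr_body)
    }
    return [item for item in DOD_ITEMS if item not in checked]

pr_body = """
- [x] Code reviewed
- [ ] Unit tests passing (coverage >= 80% on new code)
- [x] Integration tests passing
- [ ] Deployed to staging
"""
print(unmet_dod_items(pr_body))
# → ['Unit tests passing (coverage >= 80% on new code)', 'Deployed to staging']
```

Wired into CI as a required check, a gate like this makes "bypassing the DoD just this once" a visible, deliberate act rather than a silent default.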

Mobile Team DoD (iOS / Android)

✓ Code reviewed and approved by at least one engineer
✓ Unit tests written and passing
✓ UI snapshot tests updated if layout changed
✓ Tested on minimum supported OS version (iOS 16+ / Android 10+)
✓ Tested on at least 2 physical device form factors
✓ No new memory leaks (profiler run)
✓ Accessibility: VoiceOver / TalkBack compatible for new UI
✓ Strings externalised (no hard-coded copy in code)
✓ Crash-free rate maintained (no regression in staging)
✓ Submitted to TestFlight / Internal Track; PO verified

Data / ML Team DoD

✓ Data pipeline tested with real + synthetic edge-case data
✓ Data quality checks implemented (null rates, schema validation)
✓ Model accuracy meets agreed threshold on holdout set
✓ Bias / fairness evaluation completed and documented
✓ Feature logic reviewed by another engineer + domain expert
✓ Data lineage documented
✓ Monitoring / data quality alerts configured
✓ Model card or feature documentation updated
✓ Rollback plan defined for model updates
✓ Output demonstrated to the PO and verified against acceptance criteria
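The "data quality checks" item above is concrete enough to sketch in code. A minimal illustration using rows represented as plain dicts (the schema, column names, and the 50% null tolerance are assumptions for the example, not a fixed standard):

```python
def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows)

def schema_violations(rows: list[dict], schema: dict[str, type]) -> list[str]:
    """Return human-readable violations against a {column: type} schema."""
    problems = []
    for i, row in enumerate(rows):
        for col, expected in schema.items():
            value = row.get(col)
            if value is not None and not isinstance(value, expected):
                problems.append(
                    f"row {i}: {col} is {type(value).__name__}, "
                    f"expected {expected.__name__}"
                )
    return problems

rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": 2, "email": None},
    {"user_id": "3", "email": "c@example.com"},  # wrong type on purpose
]
schema = {"user_id": int, "email": str}

assert null_rate(rows, "email") <= 0.5  # assumed tolerance for this column
assert schema_violations(rows, schema) == ["row 2: user_id is str, expected int"]
```

In practice teams typically reach for a dedicated validation library, but even checks this small, run in the pipeline on every load, turn "data quality" from an aspiration into a pass/fail DoD item.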

Platform / Infrastructure Team DoD

✓ Infrastructure changes in code (Terraform / Pulumi / Helm)
✓ Changes tested in non-production environment first
✓ Security review completed for any new IAM or network changes
✓ Runbook created or updated
✓ Monitoring dashboards and alerts updated
✓ Cost impact estimated and within budget guardrails
✓ Change communicated to affected teams
✓ Rollback procedure tested or documented
✓ On-call documentation updated

Three Levels of Done

In scaled environments, Done has multiple layers. Each layer builds on the one below.

| Level | Scope | What it means |
|---|---|---|
| Story Done | One PBI | Story meets DoD + its own ACs. Reviewed and accepted by PO. |
| Sprint Done | Sprint Increment | All stories meet DoD. Increment is integrated, tested, and potentially shippable. |
| Release Done | Product release | All release criteria met: load testing, security scan, release notes, stakeholder sign-off. |
In SAFe / LeSS: add a Program Increment Done level — all ARTs/teams integrated, System Demo completed, release train objectives achieved.

Building Your DoD

Step 1: Start with a team workshop (1–2 hours)
→ Ask: "What does it mean for work to truly be complete?"
→ Each person writes items on sticky notes
→ Group and deduplicate
→ Vote on must-haves vs nice-to-haves

Step 2: Write it up and post it visibly
→ Physical board: laminated card next to the Kanban board
→ Digital: pinned in the team's project tool and Slack channel
→ Every developer must know it without looking it up

Step 3: Apply it immediately
→ Next sprint: every item is measured against it
→ First week will be hard — that's the point

Step 4: Strengthen it each retro
→ Question: "Did the DoD prevent any issues this sprint?"
→ Question: "What quality problems slipped through that a stronger DoD would have caught?"
→ Add items when gaps are found; never remove without team agreement

DoD at Scale

Multiple teams on one product (LeSS / SAFe):
→ One shared DoD for the whole product (minimum standard)
→ Teams can ADD to the shared DoD; cannot remove items
→ "Sprint Done" must include cross-team integration
→ System Demo (SAFe) / Sprint Review Bazaar (LeSS) validates combined increment

Component teams vs feature teams:
→ Component teams need additional DoD items for interfaces and contracts
→ Feature teams use the standard product DoD

DoD maturity progression:
Month 1  → Basic: code review + unit tests + PO accepted
Month 3  → Add: integration tests + staging deploy + no critical bugs
Month 6  → Add: performance baseline + security scan + monitoring
Year 1+  → Add: chaos engineering, load testing, accessibility audits

Anti-Patterns

| Anti-Pattern | Problem | Fix |
|---|---|---|
| DoD bypassed under pressure | "Just this once" destroys the standard and becomes the norm | DoD is non-negotiable. Reduce scope instead. |
| DoD not visible | Team forgets items; inconsistent application | Post physically + digitally. Reference in every Sprint Review. |
| DoD as manager's checklist | Developers feel policed, not empowered | Team owns the DoD. It is their standard, not management's. |
| Acceptance criteria confused with DoD | Story-specific ACs treated as universal quality standard | DoD = universal; ACs = story-specific. Both required. |
| DoD never strengthened | Team stops improving; debt accumulates in the gaps | Review and strengthen the DoD at every retrospective. |
| DoD too aspirational at start | Team fails every item; morale drops; the DoD is ignored | Start with 4–5 achievable items; add progressively. |
| No shared DoD at scale | Teams have different standards; integration reveals gaps | One shared DoD for the product; teams can only add, never remove. |

DoD Cheat Sheet

What the DoD is
→ Formal, shared quality criteria applied to EVERY item, EVERY sprint
→ Non-negotiable — no exceptions, no "good enough"
→ Owned by the team; strengthened at retrospectives

DoD vs Acceptance Criteria
DoD  → applies to all items; engineering quality standard
ACs  → story-specific; functional requirements

Minimum viable DoD (start here)
✓ Code reviewed by at least one other developer
✓ Unit tests written and passing
✓ Deployed to staging / test environment
✓ PO verified against acceptance criteria
✓ No known critical bugs introduced

Levels of Done
Story Done    → DoD + ACs met; PO accepted
Sprint Done   → All stories done; Increment integrated + shippable
Release Done  → Load test, security, release notes, stakeholder sign-off

Golden rule
When in doubt: reduce scope, not quality.
Never ship undone work. Never skip the DoD.