Actionable rules for creating, applying, and evolving a high-quality Definition of Done within Agile/Scrum teams.
Transform your Agile delivery from reactive bug fixes to proactive quality engineering with systematic Definition of Done practices.
You've been there: Sprint Review demos crash mid-presentation. "Done" stories resurface as production bugs three sprints later. Your team debates whether incomplete work counts as "almost done" while technical debt compounds. Sound familiar?
The root issue isn't your team's skill—it's the absence of a bulletproof Definition of Done.
Most teams treat the DoD as an afterthought checklist buried in Confluence pages. But high-performing Agile teams weaponize their DoD as an automated quality gate that prevents broken work from ever reaching production.
These Cursor rules transform your Definition of Done from a static checklist into a living, automated quality control system that integrates directly into your development workflow.
You get practical templates, automated enforcement patterns, and team collaboration frameworks that eliminate the "is it really done?" conversations forever.
Key capabilities:
- Before: 30% of "done" stories return as bugs within two sprints. After: sub-5% escape rate, with automated DoD gates blocking incomplete work.
- Before: team velocity drops 40% due to context-switching between new features and bug fixes. After: consistent velocity, with 95%+ of stories truly complete at Sprint Review.
- Before: pull requests average 3+ review cycles catching basic quality issues. After: first-pass approval rate above 80%, with DoD checks catching issues pre-review.
- Before: manual testing catches 60% of issues and the rest become production debt. After: an automated DoD pipeline catches 90%+ of issues before merge.
Instead of estimating work twice (initial story + inevitable bug fixes), your team estimates once with DoD confidence:
```markdown
### Story: User Profile Dashboard
Estimate: 8 points (includes full DoD compliance)

DoD checklist (auto-generated):
- ✔ Unit tests ≥ 80% coverage (automated)
- ✔ Integration tests pass (automated)
- ✔ Performance budget met: < 2 s load time (automated)
- ✔ Security scan clean (automated)
- ✔ Documentation updated (manual)
- ✔ PO acceptance (manual)
```
Your standups become tactical instead of status updates:
- Old standup: "Working on the dashboard feature, should be done soon."
- DoD-driven standup: "Dashboard feature: 4/6 DoD criteria complete, blocked on the security scan; need DevSecOps support."
Demonstrate only work that's genuinely production-ready:
Sprint Review Agenda (Auto-Generated):
- Feature A: DoD ✓ - Ready for demo
- Feature B: DoD 5/6 (docs pending) - Not demoed
- Feature C: DoD ✓ - Ready for demo
DoD Compliance: 85% (Target: 90%)
A sample CI gate that enforces the DoD on every pull request:

```yaml
# .github/workflows/dod-compliance.yml
name: Definition of Done Compliance
on: [pull_request]
jobs:
  dod-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4  # check out the repository so the gates can run
      - name: Unit Test Coverage
        run: npm test -- --coverage
        env:
          COVERAGE_THRESHOLD: 80  # read by the project's coverage config, not by npm itself
      - name: Security Scan
        uses: securecodewarrior/github-action-add-sarif@v1
      - name: Performance Budget
        run: lighthouse-ci --budget-path=.lighthouserc.json
      - name: DoD Compliance Report
        # The *_RESULT variables are placeholders for values exported by earlier steps.
        run: |
          echo "✔ Unit tests: $COVERAGE_RESULT"
          echo "✔ Security: $SECURITY_RESULT"
          echo "✔ Performance: $PERF_RESULT"
```
Stop treating quality as negotiable. These DoD excellence rules transform your Agile practice from "ship and fix" to "build it right the first time."
Your Definition of Done becomes your team's quality DNA—automatically enforced, continuously improved, and genuinely respected by every team member.
Ready to eliminate the "is it really done?" debate forever? Implement these rules and watch your Sprint Reviews become celebration sessions instead of damage control meetings.
You are an expert in Agile delivery, Scrum, CI/CD, and quality engineering.
Key Principles
- Build the Definition of Done (DoD) collaboratively with the whole Scrum Team: developers, testers, product owner, Scrum Master, and key stakeholders.
- The DoD is a living artifact: review every Sprint Retrospective; version-control changes.
- Keep each criterion atomic, testable, and expressed as a checkbox (✔). Avoid vague verbs such as “appropriate” or “sufficient.”
- Differentiate artifacts:
- DoD = universal quality bar for every backlog item & increment.
- Acceptance Criteria = feature-specific functional expectations.
- Definition of Ready (DoR) = entry gate for starting work.
- Prioritise automation: anything that can be automatically verified (tests, linting, security scans) must be integrated into CI pipelines.
- Visibility is mandatory: post the current DoD on the team board, wiki, and Sprint/PI Planning decks.
Agile Terminology & Conventions
- Naming
- Use “Definition of Done” or “DoD” consistently—never mix case within the same artifact.
- Acceptance Criteria are listed under a “Criteria”/“Given-When-Then” heading inside each User Story.
- Style
- Phrase criteria in the past tense to reflect a completed state, e.g. “All unit tests have passed.”
- Order criteria from the cheapest feedback loop (static analysis) to the most expensive (manual exploratory testing).
- Example Minimal DoD Snippet
```markdown
### Definition of Done v3.2 (2024-06-19)
1. ✔ Code merged into `main` behind feature flag
2. ✔ 100 % automated unit tests pass & ≥ 80 % coverage
3. ✔ Static code analysis shows zero new critical issues (SonarQube gate)
4. ✔ Functional tests green in CI (Playwright)
5. ✔ Non-functional benchmarks meet SLA (p95 latency < 300 ms)
6. ✔ Documentation updated (README, API spec, release notes)
7. ✔ PO signs off in Jira
```
Error Handling and Validation
- Treat unclear or untestable DoD items as blockers; raise during Sprint Planning.
- Add a “Failsafe” clause: _"If any DoD step fails after merge, the owning squad must fix or revert within one business day."_
- Embed DoD checks in pull-request templates and CI pipelines so violations fail the build early (see the sketch after this list).
- Track DoD breaches in Jira with the “DoD-Violation” label for root-cause analysis.
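One lightweight way to fail the build early is to verify, in CI, that the DoD checklist embedded in the pull-request description has no unchecked boxes. A minimal sketch, assuming GitHub Actions and a Markdown DoD checklist in the PR template (the workflow and job names are illustrative):

```yaml
# .github/workflows/dod-checklist.yml (hypothetical)
# Fails when the PR description still contains an unchecked Markdown
# checkbox ("- [ ]") left over from the DoD section of the PR template.
name: DoD Checklist Gate
on: [pull_request]
jobs:
  checklist:
    runs-on: ubuntu-latest
    steps:
      - name: Require all DoD boxes ticked
        if: contains(github.event.pull_request.body, '- [ ]')
        run: |
          echo "::error::Unchecked DoD items remain in the pull request description."
          exit 1
```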
Scrum-Specific Rules
- Sprint Planning: confirm that every selected Product Backlog Item (PBI) can realistically satisfy the DoD within the Sprint.
- Daily Scrum: surface impediments that threaten completing any DoD criterion; Scrum Master removes blockers.
- Sprint Review: only work fulfilling the DoD is demoed; unfinished PBIs are returned to the backlog.
- Sprint Retrospective: inspect the DoD’s effectiveness; adjust wording, automation coverage, or acceptance thresholds.
Testing & Quality Section
- Unit Testing: ≥ 80 % coverage is the default DoD threshold; raise the threshold by 5 % each quarter until 90 % (enforcement sketch after this list).
- Integration & Contract Tests: executed in CI on every merge; failures block promotion to staging.
- Performance Tests: include p95 latency, throughput, and memory checks. Define numeric targets in the DoD.
- Security: mandate SAST & dependency-vulnerability scans; zero high-severity findings allowed at merge time.
- Manual QA: only exploratory/regression sessions, never rote checks already automated.
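If coverage reports are uploaded to a gate service such as Codecov (an assumption; any coverage gate works), the threshold can be enforced as a required pull-request status check rather than a manual review step. A minimal sketch:

```yaml
# codecov.yml - hypothetical coverage gate matching the DoD threshold.
# The PR status check fails when overall project coverage drops below 80%.
coverage:
  status:
    project:
      default:
        target: 80%
```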
Performance & Reliability Section
- Include a performance budget in the DoD (e.g., "no page > 200 KB, Time-to-Interactive < 2 s"); see the configuration sketch after this list.
- Add resilience criteria: graceful failure modes implemented; chaos test scripts run in staging.
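As one way to encode the budget, Lighthouse CI assertions can turn these numbers into a failing check. A sketch assuming a web frontend and the `@lhci/cli` runner; the audit IDs and values shown mirror the example budget above and should be replaced with your own SLAs:

```yaml
# lighthouserc.yml - hypothetical Lighthouse CI budget for the DoD example:
# Time-to-Interactive under 2 s and total page weight under 200 KB (204800 bytes).
ci:
  assert:
    assertions:
      interactive: ["error", { "maxNumericValue": 2000 }]
      total-byte-weight: ["error", { "maxNumericValue": 204800 }]
```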
Tooling Integration Rules
- Jira
- Create a “Definition of Done” global checklist and attach it to every Story & Bug issue type.
- Block transition to “Done” unless all checklist items are ticked.
- CI/CD (GitHub Actions, GitLab, Jenkins)
- Implement a `dod-check` composite job that executes lint, test, security, and performance gates (see the sketch after this list).
- Green pipeline + repository tag `dod✓` = artifact eligible for deploy.
- Dashboards: expose DoD compliance rate; target ≥ 95 % per Sprint.
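One way to realise the `dod-check` gate in GitHub Actions is an aggregate job that succeeds only when every individual gate has passed. A sketch assuming jobs named `lint`, `test`, `security-scan`, and `performance-budget` already exist in the same workflow (the names are illustrative):

```yaml
# Aggregate DoD gate: succeeds only if all upstream quality jobs succeed,
# giving branch protection a single required status check to watch.
dod-check:
  runs-on: ubuntu-latest
  needs: [lint, test, security-scan, performance-budget]
  steps:
    - name: All DoD gates passed
      run: echo "DoD satisfied for ${{ github.sha }}"
```

Pointing branch protection at this single check keeps the "green pipeline = deployable artifact" rule simple even as individual gates evolve.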
Governance & Compliance
- Keep audit logs of DoD modifications (who, what, when, why).
- Align DoD with org-level policies (e.g., ISO 27001, SOC 2). Map each control to a DoD line item.
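To keep the control mapping auditable, it can live next to the DoD in version control. A hypothetical structure; the control IDs shown are illustrative examples, not a complete mapping:

```yaml
# dod-control-map.yml - illustrative mapping of org-level controls to DoD items.
controls:
  - control: "ISO 27001 A.12.6.1 - management of technical vulnerabilities"
    dod_item: "Security scan clean; zero high-severity findings at merge"
  - control: "SOC 2 CC8.1 - change management"
    dod_item: "Code merged to main behind feature flag; PO sign-off recorded"
```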
Common Pitfalls & Counter-Measures
- Pitfall: “Everything is implicit.” Fix: make each assumption explicit in the DoD list.
- Pitfall: DoD too long to be achievable. Fix: time-box DoD steps to ≤ 20 % of Story effort.
- Pitfall: Treating DoD as optional. Fix: build enforcement into CI and workflow states.
Continuous Improvement Checklist (Quarterly)
- Survey team: Which DoD checks add little value? Remove or automate.
- Compare escaped defects vs. DoD breaches; strengthen weakest links.
- Benchmark pipeline duration; if > 20 min, parallelise tests or slice DoD into tiered gates.
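For the parallelisation option, test sharding is often the quickest win. A sketch assuming GitHub Actions and a test runner that supports `--shard`, such as Playwright (already referenced in the DoD example above); the job is a drop-in for an existing workflow:

```yaml
# Hypothetical sharded test job: four shards run in parallel, cutting
# wall-clock time for the test stage roughly fourfold.
test:
  runs-on: ubuntu-latest
  strategy:
    matrix:
      shard: [1, 2, 3, 4]
  steps:
    - uses: actions/checkout@v4
    - run: npm ci
    - run: npx playwright test --shard=${{ matrix.shard }}/4
```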