Acceptance Test–Driven Development (ATDD)
Career Paths
How to interpret this table?
You may choose this advanced topic if you like doing the things listed under “they usually do”, and you are fine with not doing the things listed under “they usually do not do”.
Alternatively, you may choose it if you are interested in applying for the listed job roles and want to practice work that is close to those roles.
| Job title | They usually do | They usually do NOT do | Real-life examples |
|---|---|---|---|
| Software Engineer (ATDD-focused) | Define acceptance criteria before implementation, express them as executable tests, and implement code to satisfy those tests | Write acceptance tests after the fact, treat them as documentation only, or bypass failing acceptance tests | Feature-driven development with executable acceptance tests |
| Product / Quality Facilitator | Translate feature intent into clear acceptance criteria, keep acceptance tests readable and authoritative | Allow ambiguous requirements or hidden acceptance logic | Teams using BDD/ATDD-style workflows |
Affected SDLC Phases
If a team chooses this advanced topic, the planning, implementation, and testing phases are tightly coupled. Acceptance criteria are defined upfront and drive development. Acceptance tests act as the contract for feature completeness and correctness.
Affected Tasks
Features are defined
Minimum Viable Product (MVP)
By the end of this task, your team has defined features together with explicit acceptance criteria that can be turned into acceptance tests.
Technical Details
For each feature, the team must define acceptance criteria before implementation.
Acceptance criteria must be:
- User- or behavior-focused
- Unambiguous
- Testable
The team must state which acceptance testing approach is used (e.g., BDD-style scenarios, API-level acceptance tests, or UI-level acceptance tests).
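As an illustration of a behavior-focused, unambiguous, testable criterion, here is a minimal sketch in Python. The feature, its names, and the 10% discount rule are hypothetical; plain `assert`-based test functions stand in for whatever test runner the team actually chooses.

```python
# Hypothetical feature: orders of 100 or more receive a 10% discount.
# Acceptance criterion (BDD style):
#   Given a cart totaling at least 100,
#   When the customer checks out,
#   Then a 10% discount is applied.

def checkout_total(cart_total: float) -> float:
    """Minimal implementation written to satisfy the acceptance criterion."""
    if cart_total >= 100:
        return round(cart_total * 0.9, 2)
    return cart_total

def test_discount_applied_at_threshold():
    # Given a cart totaling at least 100
    cart_total = 100.0
    # When the customer checks out
    total = checkout_total(cart_total)
    # Then a 10% discount is applied
    assert total == 90.0

def test_no_discount_below_threshold():
    # The boundary is part of the criterion: below 100, no discount.
    assert checkout_total(50.0) == 50.0
```

Note how the Given/When/Then comments mirror the written criterion, so the test stays readable as the authoritative statement of the feature.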
Quality
High-quality feature definitions make acceptance criteria precise enough to drive implementation without reinterpretation.
Features are sorted by priority
Minimum Viable Product (MVP)
By the end of this task, your team has prioritized features based on their value and their acceptance risk.
Technical Details
Features with unclear or risky acceptance criteria must be prioritized earlier to reduce uncertainty.
Quality
High-quality prioritization reduces late acceptance surprises.
Features' cost and price are estimated
Minimum Viable Product (MVP)
Your team estimates cost based on acceptance complexity.
Technical Details
Estimates must consider:
- Number and complexity of acceptance tests
- Test environment setup effort
- Maintenance of acceptance tests
Quality
High-quality estimates realistically account for acceptance-level testing effort.
System is working
Minimum Viable Product (MVP)
By the end of this task, your team demonstrates a working system by executing acceptance tests successfully.
Technical Details
The demo must show:
- Acceptance tests defined before implementation
- Acceptance tests failing before implementation or change
- Acceptance tests passing after implementation
Acceptance tests may be automated or semi-automated, but they must be executable and repeatable.
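The red-then-green sequence the demo must show can be sketched as follows. All names here are hypothetical, and a plain function stands in for a real test runner; the point is only that the same acceptance test is executable and repeatable before and after implementation.

```python
# One acceptance test, run twice: before the feature exists (red)
# and after the code is written to satisfy it (green).

def accept_greeting(greet) -> str:
    """Acceptance test: the greeting must address the user by name."""
    try:
        assert greet("Ada") == "Hello, Ada!"
        return "pass"
    except (AssertionError, NotImplementedError):
        return "fail"

def greet_stub(name: str) -> str:
    # Before implementation: the feature is not built yet.
    raise NotImplementedError

def greet(name: str) -> str:
    # After implementation: code written to make the acceptance test pass.
    return f"Hello, {name}!"

print(accept_greeting(greet_stub))  # fail (red, before implementation)
print(accept_greeting(greet))       # pass (green, after implementation)
```

Capturing both runs (failing first, then passing) is what demonstrates that the test drove the implementation rather than merely confirming it afterwards.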
Quality
High-quality demos clearly show acceptance tests as the driver of development, not as an afterthought.
Bug fixing
Minimum Viable Product (MVP)
During development, your team reports and fixes one defect by first expressing it as a failing acceptance test.
Technical Details
The bug must be captured as:
- A new or updated acceptance test
- A failing test demonstrating the problem
The fix is complete only when the acceptance test passes.
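A minimal sketch of this workflow, with an invented defect (totalling an empty cart raised an error) and hypothetical names:

```python
# Hypothetical defect: totalling an empty cart raised an IndexError.

def total_buggy(prices):
    subtotal = prices[0]          # crashes on an empty cart
    for p in prices[1:]:
        subtotal += p
    return subtotal

def total_fixed(prices):
    return sum(prices)            # an empty cart totals 0

def empty_cart_totals_zero(total) -> bool:
    """The bug report expressed as an acceptance test.

    Returns False (failing) against the buggy code, True (passing)
    once the fix is in place.
    """
    try:
        return total([]) == 0
    except IndexError:
        return False

assert empty_cart_totals_zero(total_buggy) is False  # failing test demonstrates the problem
assert empty_cart_totals_zero(total_fixed) is True   # fix is complete when the test passes
```

Because the reproducing test stays in the suite after the fix, the same defect cannot silently return later.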
Quality
High-quality bug fixing strengthens the acceptance test suite and prevents regression.
User documentation
Minimum Viable Product (MVP)
By the end of this task, your team's user documentation reflects accepted behavior.
Technical Details
Documentation must align with what is covered by acceptance tests.
Quality
High-quality documentation avoids describing untested or unaccepted behavior.
Developer documentation
Minimum Viable Product (MVP)
By the end of this task, your team's developer documentation explains the ATDD approach used.
Technical Details
Documentation must describe:
- How acceptance criteria are defined
- How acceptance tests are written and run
- How tests drive implementation
Quality
High-quality documentation makes ATDD practice explicit and repeatable.