Goal: Repeatable quality through automated tests!
But we need people to develop them. Traditional organizations call these people testers.
Tester -> Automated Tests!
In traditional organizations, testers rarely do this because of the traditions of Waterfall.
Tester -> Test Plan -> Automated Tests!
And this is where the trouble starts. Due to the divide-and-conquer, handoff approach of Waterfall, it made sense to split the role of software development into programmer and tester (I'll not debate the pros and cons of doing this; the Agile comic SCRUM NOIR—A Silo to Hell! does a nice job). But in the Agile context, these testers struggle to integrate their work with a Scrum team: they can't write automated tests until late in the Sprint. This is a big problem when moving to a practice such as Acceptance Test Driven Development, where automated test development starts on the first day of the Sprint.
Iterative Testing Problems:
- Testers don't plan together with the developers during Sprint planning.
- Testers never check in an automated test on the first day of the Sprint.
- Test Planning isn't done continuously (a little bit here, a little bit there, and then executed, then repeat) but instead is an upfront event that takes up 50% or more of the Sprint.
- Testers complain they don't have enough information to get started.
These habits are antithetical to the Scrum process: a Scrum team delivers product incrementally and doesn't know until Sprint Planning what will be done during the next Sprint. If Testers preserve their old habits, this forces them to produce Test Plans under immense stress.
This results in:
1. Testers hating Agile.
2. The Test part of the organization hating Agile and working against change initiatives.
3. Testers complaining that they have to drive everything, because work is now triggered by the tester's need rather than by a scheduled milestone.
4. Many preconditions to creating a Test Plan: Testers demand a lot of input documentation from other roles (design specs, arch specs, use cases, business case studies, hardware diagrams, call-flow, process flow, ...) before they feel they can complete their Test Plan.
5. Equating incremental testing with incomplete testing: Testers consider their Test Plans incomplete because they are only supposed to plan tests for the functionality delivered that Sprint rather than for the entire release.
If you change your process without changing how you do things, then nothing's going to change (except for the labels). Points 3 and 5 are natural to "storming" (Tuckman model) and must be allowed to happen. Once we get to the later stages, "norming" and "performing," say within 3 Sprints, the dangers of change rejection (points 1 and 2) will go away. Point 4 is a deep culture change which takes time, until people understand the Agile Manifesto Values and Principles, namely that Working Software is the primary measure of progress (automated tests are working software) rather than process checkpoints, milestones, and documentation completion.
No matter what, an organization in transition will have storming and that is healthy if conflicts are allowed to be exposed AND resolved (healthy stress/conflict drives change). To help "storm well," here are some activities.
How to Agile your Test Plan
The Scrum team containing programmers and testers must re-invent how they work together, be open to new ideas, and ditch some bad ones (the heavyweight, comprehensive test plan). Each story needs its own Test Plan. Look at solving this problem at two levels: Test Plan format and timing.
Making the format you already have lightweight (format)
Rather than introduce any additional process/methodologies, make what you already do lightweight. If your usual Test Plans are 10-20 pages for features completed during a multi-month release, try to do the same plan but on one sticky for one User Story.
How can one sticky get results as good as multipage test plans? We're leveraging the fact (and would be foolish not to) that within a few days we'll be building small features, so we don't need to document so heavily.
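For instance (an invented example, using the playlist story that appears later in this article), the entire Test Plan on the sticky might read:
Story: remove invalid playlist entries
- play a playlist with 0, 1, many bad entries
- bad entries removed, good ones kept in order
- empty playlist doesn't crash the player
Automate all of the above.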
Agile Context (and why Agile development really is different than Waterfall):
- a lot of information is successfully retained in our tacit knowledge since we are working on a small team that interacts daily,
- we are dedicated to one sprint backlog and become experts in its execution,
- we'll start acting on our plan within days,
- we're working on sub-features and only need to test the sub-features we are doing this Sprint,
- each User Story is independent of the others so each must be tested independently, and
- the majority (if not all) of the tests will be automated and checked in, and will be our best documentation and reporting system.
Limber the mind
To change the results of your work you need to change yourself. A lot of what stops us are habits learned in the Waterfall context, which need to be shed so we can develop sensible ones for the Agile context. This problem affects anyone changing work contexts: I'm a science fiction author who spends hours getting my words right, and I sometimes spend hours getting an email right. That is complete waste for a one-off email. I had to learn to reel myself back when writing technical planning documents or emails, since polished prose isn't necessary there. It took conscious effort to shift gears, and doing pair work with another person helped.
To do so I had to:
- decide it was important to change,
- be open to new ideas, in fact, be open to trying something so crazy it couldn't possibly work and then go for it.
Acceptance Criteria (format)
Acceptance Criteria are the most lightweight and commonly used of all Test Plan formats: a simple bullet list of what the application should do before the PO will accept the User Story.
Acceptance Criteria can also take the form of wireframes, call flows, non-functional requirements, and so on.
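For example, a hypothetical Acceptance Criteria list for the playlist story used in the BDD section below:
- playing a playlist that contains bad entries does not crash the player
- invalid entries are removed from the playlist
- valid entries keep their original order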
4D Analysis (format)
4D Analysis was introduced to one of my teams by friend and fellow coach XuYi. 4D adds three dimensions of additional analysis to simply using Acceptance Criteria. Adding more analysis isn't necessarily a good thing, because that's what got us the 20 page Test Plan. However, 4D is still one page, and maybe your team will find working on the first 3 Ds helpful before getting to the Acceptance Criteria. The 4D analysis is attached behind each user story.
The Process dimension is further broken down into 3 types:
- User workflow (how it works from a User's perspective)
- Business process (how it works from a Business Analysis perspective)
- Technical flow (system engineer or architecture view)
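As an invented illustration for the playlist story, the three views might read:
- User workflow: the user hits Play; playback skips and removes dead entries without interrupting the music
- Business process: the catalog stays clean, so support doesn't field "my playlist is broken" complaints
- Technical flow: the player validates each entry against the library service and prunes the failures before streaming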
Strangely enough, the feedback I've received from teams doing 4D Analysis is that despite the single sheet of paper, the amount of analysis is greater than they were used to from their traditional multipage test plan. Customize the 4D analysis to fit your team's needs.
BDD (process, format, and technology)
Get your PO adding Behavior Driven Development scenarios to your User Stories, or at a minimum work with your PO and get them written down. Good BDD tests make wonderful test plans AND automated tests AND traceability documents!
Given there are bad entries in <playlist>
When I play this playlist
Then the invalid playlist entries are removed
Examples:
|playlist|
|Never Played|
|My Top Rated|
|Whole Library|
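Here is a minimal sketch of the glue code that could turn the scenario above into an automated test, assuming Python with the behave BDD framework; the Player class is a stand-in stub invented for illustration, to be replaced by your real system under test:
from behave import given, when, then

class Player:
    """Stand-in for the real player; replace with your system under test."""
    def __init__(self):
        self.playlist = []
    def load(self, playlist):
        # Pretend-load: one valid track plus one dead entry.
        self.playlist = [("song.mp3", True), ("missing.mp3", False)]
    def play(self):
        # The behavior under test: prune invalid entries before playing.
        self.playlist = [entry for entry in self.playlist if entry[1]]
    def entries(self):
        return self.playlist

@given('there are bad entries in {playlist}')
def given_bad_entries(context, playlist):
    context.player = Player()
    context.player.load(playlist)

@when('I play this playlist')
def when_play(context):
    context.player.play()

@then('the invalid playlist entries are removed')
def then_removed(context):
    assert all(valid for _, valid in context.player.entries())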
More BDD articles: Well written BEHAVIOR Driven Development Scenarios, BDD Practices that Maximize Team Collaboration and Reduce Risk.
Timing
When should Test Plans be created? The two strategies are:
- Create Test Plans during the Sprint
- Create Test Plans before Sprint Planning.
Lightweight test plans can be created before the Sprint by leveraging Backlog Grooming. Here's an agenda that breaks backlog grooming into three parts: a 30 minute kickoff, offline (outside of meeting) collaborative work, and a 1 hour grooming meeting.
Here's an example calendar for a three week Sprint:
MTWTF MTWTF MTWTF
^- Kickoff
   ... Preparation for Grooming (offline) ...
            ^- Groom results for 1 hour
Constraints:
- a team spends no more than 10% of a Sprint preparing for the next Sprint (for a three week Sprint of 15 working days, that's about 1.5 days)
- other than the Sprint Demo and Retro, avoid meetings on the last 2 days of the Sprint (crunch time)
- have Grooming far enough in advance of Sprint Planning to fix problems uncovered in Grooming
- During the kickoff meeting:
  - PO brings proposed Sprint Backlog
  - Team members select stories to prep for grooming and are encouraged to collaborate with other team members
  - Decide which stories need SME deliverables/support and make that status visible
- Offline, to be finished by Grooming Meeting start (Monday 2PM):
  - Fill out the prep document:
    - User Scenarios {Functional flow, User Experience behavior} (done by: programmer or tester)
    - Y/N needs SME deliverables (architecture view, etc.) (roles: tester, SME)
    - Refine User Story Acceptance Criteria (roles: everyone)
    - Update Assumptions (roles: everyone)
  - PO and SME will visit each team member (individuals and interactions per the Agile Manifesto) before the Grooming Meeting and see how they can help.
  - ScrumMaster will make sure the process is successful and that everyone comes to the Backlog Grooming meeting prepared.
A team reviewing User Stories and the grooming prep doc. They are standing up during the action moments for maximum collaboration. Stand up meetings are 30% faster than their "sit down" counterparts.
During Sprint Planning, the team will adjust further, and they are expected to keep refining even during the Sprint.
Summary
Goal: Automated Tests! But we need people to develop them.
Ideal:
Tester -> Automated Tests!
Usually, teams do at least a lightweight Test Plan:
Tester -> Acceptance Criteria -> Automated Tests
Tester -> BDD -> Automated Tests
Tester -> 4D Analysis -> Acceptance Criteria -> Automated Tests
Some teams need more analysis. Find a way to do it using only 10% of your current Sprint to plan for your next Sprint:
Tester -> Test Plan -> {Acceptance Criteria, BDD, 4D Analysis} -> Automated Tests
Remember, you're doing Test Plans in order to have automated test cases that defend your product from regression. Your Test Plans should be designed to serve this purpose; having large Test Plan documents was never the goal. If you can produce 2-6 automated test cases from a one page Test Plan, and spend no more than 10% of your Sprint doing Test Plans, then you're on the right track.
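As a closing illustration (hypothetical code, assuming Python with pytest and a compact stand-in Player stub like the one in the behave sketch above), the handful of automated regression tests distilled from a one page Test Plan might look like:
class Player:
    """Compact stand-in; swap in your real player."""
    def __init__(self):
        self.playlist = [("song.mp3", True), ("missing.mp3", False)]
    def play(self):
        # Behavior under test: prune invalid entries before playing.
        self.playlist = [entry for entry in self.playlist if entry[1]]
    def entries(self):
        return self.playlist

def test_invalid_entries_removed_on_play():
    player = Player()
    player.play()
    assert all(valid for _, valid in player.entries())

def test_valid_entries_kept_in_order():
    player = Player()
    kept = [name for name, valid in player.entries() if valid]
    player.play()
    assert [name for name, _ in player.entries()] == kept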