Typically people use a Gherkin-syntax tool like Cucumber for BDD, but here we will explore scenarios and set up test cases with an approval testing approach. We look at the key aspects of BDD (Discovery, Formulation and Automation) and how each is affected by choosing an approval testing approach.
We will go through a worked example in small groups and practice doing BDD with an approval testing approach. Each group can agree which programming language to use for the automation part, with a choice of Java, C#, Python, C++ and Go.
(short break)
What do good test cases look like? Can approval tests be good test cases?
In pairs, write down a list of criteria you could use to assess a unit test you were shown. What are some important design considerations?
I have gathered together some test cases in the Lift-Kata-Sample-Tests repo. Have the pairs walk around the room looking at the code snippets you’ve pinned up on the walls. Have them read the code, try to work out what each test is doing, assess it against the criteria they just came up with, and add new criteria as they occur to them.
Encourage people to circulate and not spend too long on any one code snippet. Tell them it’s OK not to understand everything about the code; that in itself could be a sign the test has room for improvement.
The idea of the code snippets is that in your judgement, some are better than others. Some are short, some are long, some contain loops, some have too much detail, some have multiple asserts, some use unfamiliar test frameworks, etc.
When everyone has been looking at the code for about 15-20 minutes, bring it back to a whole-group discussion. At a flipchart or whiteboard, ask people to volunteer what they’ve learnt. You want to know the criteria they came up with, and which test snippets on the wall are the best and worst. You might find disagreement in the group, which is interesting to note but probably not worth a long whole-group discussion. Try to focus on what people agree on.
Show the plain vanilla TDD implementation of Leap Years first. Don’t forget the first step, which is writing up the four test cases on the whiteboard. Implementing the whole Kata only takes a few minutes, and they will probably have seen it before in a previous session. Show it to them again so they remember better.
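For reference, a minimal sketch of where the demo typically ends up, written here in Python with pytest (the function and test names are illustrative; any of the workshop languages works just as well):

    # Plain vanilla TDD result for the Leap Years kata (run with pytest).
    def is_leap_year(year):
        if year % 400 == 0:
            return True
        if year % 100 == 0:
            return False
        return year % 4 == 0

    # The four test cases from the whiteboard.
    def test_year_divisible_by_400_is_a_leap_year():
        assert is_leap_year(2000)

    def test_year_divisible_by_100_but_not_400_is_not_a_leap_year():
        assert not is_leap_year(1900)

    def test_year_divisible_by_4_but_not_100_is_a_leap_year():
        assert is_leap_year(1996)

    def test_year_not_divisible_by_4_is_not_a_leap_year():
        assert not is_leap_year(1997)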
Key aspects of BDD: double loop TDD. Do some ‘Discovery’ and ‘Formulation’.
In pairs, tell the other person what you already know about BDD. Have you done it? Sort people into small groups so that hopefully someone in each group knows something about BDD.
This activity is described here.
Explain the inner loop is the ordinary Red-Green-Refactor loop that we have in TDD. The outer loop is Failing scenario - Passing scenario - Refactor. The rhythm of the outer loop is slower. A scenario exercises a thicker chunk of code than a unit test. A scenario is understandable by everyone in the team, including business representatives.
Include demos of Cucumber BDD and Approvals BDD. Concept of centered community.
Write sticky notes - perhaps on your BDD diagram.
Discuss what user stories you can think of based on your own experience.
Explain the DSL we’ll be using for this exercise.
Explain the Lift Kata and show the list of requirements. You can act as the business representative if they have questions about the requirements.
In small groups or pairs, get people to come up with examples and formulate them using the DSL.
Get small groups to present their examples and formulations to one another.
Ask people to discuss in pairs. Do the tests that come from the new approach have all the properties they came up with during the earlier part of the session?
Can we make the tests we formulated earlier into automated tests?
There is some starting code for doing this Kata with an Approvals approach on my GitHub. Have people read the README file and particularly the explanation of the printer. Make sure everyone has the starting code set up on their machines and the tests pass.
Look at the formulated test cases we made earlier. Choose one to work on automating.
Have people automate a formulated test case in this codebase. They should work in pairs.
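If a group chose Python, the automated version tends to take roughly this shape with the approvaltests library. This is only a sketch: the SimpleLift class and print_lift function below are stand-ins invented for the illustration, not the lift or the printer from the actual starting code.

    from approvaltests.approvals import verify

    class SimpleLift:
        """Stand-in for the kata's lift; just enough state for the sketch."""
        def __init__(self, floor=0, doors_open=False):
            self.floor = floor
            self.doors_open = doors_open
            self.requests = []

        def call(self, floor):
            self.requests.append(floor)

        def tick(self):
            # Extremely simplified: jump straight to the first pending request.
            if self.requests:
                self.floor = self.requests.pop(0)
                self.doors_open = True

    def print_lift(lift):
        """Stand-in printer: render the lift state as readable text."""
        doors = "open" if lift.doors_open else "closed"
        return f"floor: {lift.floor}, doors: {doors}, pending: {lift.requests}"

    def test_lift_answers_a_call_to_floor_three():
        # Scenario: a waiting passenger on floor 3 calls the lift.
        lift = SimpleLift(floor=0)
        lift.call(3)
        story = [print_lift(lift)]
        lift.tick()
        story.append(print_lift(lift))
        # The whole printed story is verified against an approved file.
        verify("\n".join(story))

On the first run the received output is inspected and approved; after that the test fails whenever the printed story changes, which is the approval-testing equivalent of the asserts in a conventional test.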
Pick a couple of pairs and ask them to put their test code up on the screen. Compare the test code to the scenario formulation they started with. Do they look the same?
Domain Specific Languages
Ask them to search the internet for examples of Domain Specific Languages. They should find things like SQL, regular expressions, CSS, HTML, and Gherkin itself.
Give scenarios to groups and ask them to sketch one or two test cases for each scenario.
Get people to look through everything we’ve done today and discuss with someone what they’ve learnt. Write their answers on the mind-map as sub-nodes of the mind-map statements.
What are you taking away with you today? TODO: better retrospective format