Wednesday, 3 April 2019

30 Days of agile testing — Day 12: What Test Documentation does your team have and what can you do to improve it?

Many places I have worked at use a wide array of test documentation, from gigantic, lumbering packs of thousands of mostly out-dated manual tests to teams with no test documentation at all. Every team has its own approach to test documentation, but one thing every organisation seemed to share was that nobody wanted to do it. Why is that?


  • Too little time
  • Too little value
  • Out-dated before it has been written
  • Never read
  • Boring


So how do we write tests that stay up to date, provide value and are… interesting to write? Yes, I said it: test documentation that is interesting, heck, FUN to write?!

Well, most of you will have already guessed, but the answer is test automation.

Our test automation is split between unit tests, integration tests, mocked UI tests and end-to-end Espresso tests, with the goal of allowing the reader to gain a contextually relevant understanding of an aspect of the system.

Unit tests provide documentation of how the functions and classes of the application are used and implemented.
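As a minimal sketch of that idea, the test below doubles as usage documentation for a class. `PriceFormatter` is an invented name standing in for any production class, and the test is written in plain Java with assertions rather than a real test framework, purely to keep the example self-contained:

```java
// Hypothetical production class: formats a price in pence as a display string.
class PriceFormatter {
    static String format(long pence) {
        // e.g. 1999 -> "£19.99"; pads the fractional part to two digits.
        return String.format("£%d.%02d", pence / 100, pence % 100);
    }
}

public class PriceFormatterTest {
    public static void main(String[] args) {
        // Each check documents one expected behaviour of the class:
        if (!PriceFormatter.format(1999).equals("£19.99")) throw new AssertionError();
        if (!PriceFormatter.format(5).equals("£0.05")) throw new AssertionError();
        if (!PriceFormatter.format(0).equals("£0.00")) throw new AssertionError();
        System.out.println("all unit checks passed");
    }
}
```

Anyone reading the test can see, at a glance, what the class is for, how it is called and which edge cases it handles.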

The integration tests document how the application communicates with external services.
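One lightweight way an integration test can document a contract with an external service is to express it as an interface, with an in-memory fake used in tests. `ProfileApi` and `FakeProfileApi` below are invented names used purely for illustration; a real version would back the interface with an HTTP client:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical contract with an external service (e.g. GET /profiles/{id}).
interface ProfileApi {
    String fetchDisplayName(String userId);
}

// In-memory fake standing in for the real HTTP client during tests.
class FakeProfileApi implements ProfileApi {
    private final Map<String, String> profiles = new HashMap<>();

    void stub(String userId, String name) { profiles.put(userId, name); }

    public String fetchDisplayName(String userId) {
        String name = profiles.get(userId);
        // Documents how the app is expected to experience a missing profile.
        if (name == null) throw new IllegalStateException("404: " + userId);
        return name;
    }
}

public class ProfileIntegrationSketch {
    public static void main(String[] args) {
        FakeProfileApi api = new FakeProfileApi();
        api.stub("u1", "Ada");
        // The test documents the happy path...
        if (!api.fetchDisplayName("u1").equals("Ada")) throw new AssertionError();
        // ...and the failure mode the application must handle.
        try {
            api.fetchDisplayName("missing");
            throw new AssertionError("expected a 404-style failure");
        } catch (IllegalStateException expected) { }
        System.out.println("integration contract documented and checked");
    }
}
```

The test reads as a description of the service's behaviour, including the error case, without needing the real service to be available.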

The mocked UI tests provide documentation on how to generate a specific screen, which screens it can talk to and which external components it requires, e.g. API calls.

Finally, the end-to-end tests provide a breakdown of how a full user journey is expected to work, without duplicating what has already been covered by the lower-level tests. The goal of the end-to-end tests is to document and check that the combined components of the application and its configuration come together, as we expect, to meet the users' requirements.
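The value of a journey test as documentation is that it reads top-to-bottom like the user story it checks. The sketch below illustrates the shape only: `RegistrationFlow` is an invented in-memory stand-in for the app, whereas a real version would drive the actual UI with Espresso:

```java
// Hypothetical in-memory stand-in for the app; a real end-to-end test would
// perform these steps through the UI (e.g. with Espresso) instead.
class RegistrationFlow {
    private String email;
    private boolean confirmed;

    void enterEmail(String email) { this.email = email; }
    void submit() { confirmed = email != null && email.contains("@"); }
    boolean onConfirmationScreen() { return confirmed; }
}

public class SignUpJourneySketch {
    public static void main(String[] args) {
        RegistrationFlow flow = new RegistrationFlow();
        // The journey reads like the user story it documents:
        flow.enterEmail("test@example.com");
        flow.submit();
        if (!flow.onConfirmationScreen()) throw new AssertionError();
        System.out.println("journey completed as documented");
    }
}
```

Note the test only asserts the journey's outcome; validation rules and screen internals are left to the lower-level tests, so nothing is duplicated.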

But how do we know we are writing automated tests for the correct functionality and at the correct level? As mentioned in a previous blog post, we try to ensure all stories we play are one point. This enables us to easily quantify and understand the acceptance criteria. With concise acceptance criteria comes the ability to easily understand the bulk of what needs to be tested. During refinement, we then take the refined acceptance criteria and review the correct level at which the automated checks should be targeted. This means that when a feature has been completed, it has already been smoke tested and automated. So our testing setup is now fully operational… right?


Not quite; we still need to actually test the software, as so far everything we have written down, with the exception of the refinement meeting, is related to checking (http://www.developsense.com/blog/2009/08/testing-vs-checking/). We still need to dedicate some time to exploratory testing, to challenge ourselves to find issues that may have fallen through the gaps. So how do we document that? One method is mind-maps.

The mind-maps are used to explore the feature that has been developed; they expose where we are spending most of our testing time and encourage us to ask, "well, why are we only testing this part of the application?". In our team, we use a mind-map template with some basic sample nodes:


  • Questions
  • Observations (bugs)
  • Acceptance Criteria
  • A node and sub-nodes for the most common devices our users use


Using the template, the engineer assigned to testing completes the mind-map by adding ideas and checks based on the context of what they are testing. If they are checking an image uploader, for example, they may create a node with a set of sample image parameters.

Once complete, the mind-maps are saved somewhere locally, but most importantly, any missed automated checks are identified. These are then fed back into the next sprint planning meeting, so that when similar functionality comes up, the test is identified up-front this time.

This method of continuously updating our regression pack during story refinement allows us to improve our up-front estimates when planning the work and testing related to specific tickets, and removes the manual check from the next exploratory testing session.

Working with the developers and product owner to ensure we are always thinking about how we will test something has allowed us to build testability into all aspects of the application.

Many organisations write test plans with the goal of making testing activities more visible; however, many test plans end up out-dated and hardly read because they cannot keep pace with the project. But if not in a test plan, then where can we raise the visibility of testing?

Bringing the topic of testing into story refinement exposes the risks and challenges associated with testing early, in a visible, public manner. When the topic of testing is raised, questions can be asked and any assumptions can be clarified. If the testing considerations are then fed back into the story, they can also feed into the automation pack that makes up the living documentation for your project.

As a final piece of extra reading, the two articles below outline an excellent way to produce lightweight test plans that help you focus on value first when writing a test plan.
