Test automation in Agile

In Agile, and Scrum in particular, the whole team is responsible for delivering a high quality product at the end of a sprint! At the end of a sprint, defects should not only have been found, but also fixed appropriately. It’s important to work efficiently, since creating software, testing it, fixing it and retesting it can be quite time consuming. Recently Martijn de Vrieze published two posts on test automation that provide relevant information on how to deal with test automation within Agile/Scrum (All my automated tests are passing, what’s wrong? and Throw-away test automation).

Martijn describes a couple of types of automation:

  • Test driven tests (supposed to fail until that part is developed)
  • Automated regression tests (supposed to pass)
  • Automation to support testing

Test driven tests

The test driven tests are an important part of development. For this post I would like to stretch this definition a bit to “automated tests needed within the sprint”: unit tests, system tests and functional tests. When the team uses test driven development, developers start by writing unit tests before creating the functionality. Test driven development has been described extensively, so I won’t go into depth on it here. System tests can be automated in the same style as the unit tests, only dealing with more than single units. It’s important to realise that automating these tests needs a risk based approach: automating everything creates overhead, since not all of these tests need to be repeated often. For functional tests, it is advisable to automate just below the user interface, since the user interface is the part most likely to change.
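
To make this concrete, here is a minimal sketch of a test driven unit test in pytest. The shop.pricing module, the calculate_discount function and the discount rules are hypothetical examples, not taken from the posts above; the point is simply that the tests are written first and fail until the functionality is built within the sprint.

    # TDD-style sketch (pytest). Everything named here is a hypothetical example.
    import pytest

    # This import fails until the function is written during the sprint;
    # the failing test is what drives its creation.
    from shop.pricing import calculate_discount


    def test_regular_customer_gets_no_discount():
        assert calculate_discount(order_total=100, customer_type="regular") == 0


    def test_loyal_customer_gets_ten_percent():
        assert calculate_discount(order_total=100, customer_type="loyal") == 10


    def test_negative_order_total_is_rejected():
        with pytest.raises(ValueError):
            calculate_discount(order_total=-1, customer_type="regular")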

Every test written within the sprint should be created to help achieve the Definition of Done. Therefore, at the end of the sprint, all these tests should pass! During the sprint, tests from previous sprints will start failing, since new functionality is added and the expected results change. If you have already created new tests for these situations, the older ones can be removed; make sure that everything the old test checked is still covered in the new context. In other cases, update the old test for the new context.

Automated regression tests

Regression tests are needed to check that the new software doesn’t break existing functionality. Since these tests are run at least towards the end of every sprint (and preferably more often), they are a good candidate for automation. Of course the focus of the regression tests is different from that of the test driven tests: it is more on connectivity, interaction with other systems and even quality in an end-to-end context (a small sketch of such a test follows the list below). When a regression test fails, there are a couple of options:

  • The context has changed and the test is no longer valid: update or delete the test.
  • The newly created software has a defect which needs to be fixed this sprint.
  • The existing software shows a previously undiscovered defect (which needs to be fixed).
    • The product owner needs to decide who fixes it, when to fix it and whether the feature that exposes the defect should go live.
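
As an illustration, below is a hedged sketch of what such an automated regression test might look like, written with pytest and the requests library. The base URL, the endpoints and the response fields are all hypothetical assumptions for the example; the focus is on connectivity and end-to-end behaviour rather than unit-level logic.

    # Regression-test sketch (pytest + requests). URLs, endpoints and
    # response fields are hypothetical examples, not a real API.
    import requests

    BASE_URL = "https://test.example.com"  # assumed test-environment URL


    def test_order_service_reaches_payment_service():
        # Regression focus: the integration still works after this sprint's changes.
        response = requests.get(f"{BASE_URL}/orders/health", timeout=5)
        assert response.status_code == 200
        assert response.json().get("payment_service") == "up"


    def test_existing_order_flow_still_works_end_to_end():
        # Place a minimal order and check it propagates through the chain.
        response = requests.post(
            f"{BASE_URL}/orders",
            json={"product_id": "demo-1", "quantity": 1},
            timeout=5,
        )
        assert response.status_code == 201
        order_id = response.json()["order_id"]

        status = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=5)
        assert status.status_code == 200
        assert status.json()["state"] in {"accepted", "processing"}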

Automation to support testing

To enable faster and more effective testing, testers may require automation. This type of automation does not need to be very robust. Think of a simple recorded script that gets you through the necessary steps of logging in to an application. Such a script only works with the credentials you used while recording, but that might be just what you need: it can save you a lot of time and lets you start testing the functionality that becomes available after logging in. Another helpful use of automation is extracting system information that is needed for defect logging. Delivering quick and dirty automation solutions to help testers is good enough, since the time invested in this automation needs to be earned back.
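
As an illustration, here is a deliberately quick-and-dirty login helper in that spirit, using Selenium WebDriver in Python. The URL, element ids and credentials are hypothetical and hard-coded, just as a recorded script would hard-code them; it only needs to work well enough to save the tester time.

    # Quick-and-dirty helper: hard-coded, no error handling, hypothetical
    # URL/ids/credentials. A recorded script from a capture tool would look similar.
    from selenium import webdriver
    from selenium.webdriver.common.by import By


    def login_to_test_app():
        driver = webdriver.Chrome()
        driver.get("https://test.example.com/login")                    # assumed test URL
        driver.find_element(By.ID, "username").send_keys("tester1")     # assumed field id and account
        driver.find_element(By.ID, "password").send_keys("Welcome123")
        driver.find_element(By.ID, "login-button").click()
        return driver  # the tester continues manually from here


    def dump_system_info(driver):
        # Grab a few details that are handy when logging a defect.
        print("URL:     ", driver.current_url)
        print("Title:   ", driver.title)
        print("Browser: ", driver.capabilities.get("browserName"),
              driver.capabilities.get("browserVersion"))


    if __name__ == "__main__":
        session = login_to_test_app()
        dump_system_info(session)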
