Test automation in Agile

In Agile, and Scrum in particular, the whole team is responsible for delivering a high-quality product at the end of a sprint! At the end of a sprint, defects should not only have been found, but also fixed appropriately. It’s important to work efficiently, since creating software, testing it, fixing it and retesting it can be quite time-consuming. Recently Martijn de Vrieze published two posts on test automation that provide relevant information on how to deal with test automation within Agile/Scrum (All my automated tests are passing, what’s wrong? and Throw-away test automation).

Martijn describes a couple of types of automation:

  • Test driven tests (supposed to fail until that part is developed)
  • Automated regression tests (supposed to pass)
  • Automation to support testing

Test driven tests

The test driven tests are an important part of the development. For this post I would like to stretch this definition a bit to “automated tests needed within the sprint”. To be clearer: unit tests, system tests and functional tests. When the team uses test driven development, developers are required to start by creating unit tests before creating functionality. Test driven development has been described a lot, so I won’t go in depth on this. Automating system tests can be realised in the same style as the unit tests, dealing with “more than units”. It’s important to realise that automating these tests needs to be done with a risk based approach. Automating everything will create overhead, since not all of these tests need to be repeated often. Be aware that for automating functional tests, it is advisable to automate just below the user interface, since the user interface is the part most likely to change.
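To illustrate the test-first style, here is a minimal sketch: the tests are written first and fail until the function exists. The function `apply_discount` and its behaviour are invented for illustration, not taken from any real project.

```python
# Test-first sketch: the tests below are written before the code they
# exercise, so they fail until apply_discount is implemented.
# The function and its rules are hypothetical examples.

def test_discount_is_applied():
    assert abs(apply_discount(100.0, 10) - 90.0) < 1e-9

def test_discount_cannot_push_price_below_zero():
    assert apply_discount(50.0, 200) == 0.0

# Minimal implementation, written afterwards to make the tests pass.
def apply_discount(price, percent):
    discounted = price * (1 - percent / 100)
    return max(discounted, 0.0)

if __name__ == "__main__":
    test_discount_is_applied()
    test_discount_cannot_push_price_below_zero()
    print("all tests pass")
```

In a real sprint these would live in a test framework such as pytest or JUnit, but the rhythm is the same: red, green, refactor.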

Every test written within the sprint should be created to help achieve the Definition of Done. Therefore, at the end of the sprint, all these tests should pass! During the sprint, tests from previous sprints will start failing, since new functionality is added and the expected result changes. If you have already created new tests for these situations, the old ones can be removed; make sure that everything the old test checked is still covered in the new context. Otherwise, update the old test for the new context.

Automated regression tests

Regression tests are needed to check that the new software doesn’t break any existing software. Since these tests are run at least towards the end of every sprint (but preferably more often), they are a good candidate for automation. Of course the focus of the regression tests is different from the test driven tests: it is more on connectivity and interaction with other systems, and even on quality in an end-to-end context. When a regression test fails, there are a couple of options:

  • The context has changed and the test is no longer valid, then update or delete the test.
  • The newly created software has a defect which needs to be fixed this sprint.
  • The existing software shows a previously undiscovered defect (which needs to be fixed).
    • The product owner needs to decide who fixes it, when to fix it, and whether the feature that shows the defect should go live.

Automation to support testing

To enable faster and more effective testing, testers may require automation. This type of automation does not need to be very robust. Think of a simple recorded script to get you through the necessary steps of logging in to an application. This script only works with the credentials you used while recording, but that might be just what you need. A simple automation like this can save you a lot of time, making sure that you can start testing the functionality that becomes available after logging in. Another helpful kind of automation is extracting system information that is needed for defect logging. Delivering quick and dirty automation solutions to help testers is good enough, since the time invested in this automation needs to be earned back.
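A quick-and-dirty login helper of this kind might look like the sketch below: hard-coded credentials, no error handling, no robustness. The URL, form field names and credentials are all invented for illustration.

```python
# Quick-and-dirty helper to get a tester past a login screen, in the
# spirit of a recorded script. The URL, form fields and credentials
# are hypothetical; adapt them to your own application.
import urllib.parse
import urllib.request

def build_login_request(base_url="http://localhost:8000"):
    # Hard-coded test credentials, exactly like a recorded script would have.
    form = urllib.parse.urlencode({"user": "tester", "password": "secret"}).encode()
    return urllib.request.Request(f"{base_url}/login", data=form, method="POST")

def login(base_url="http://localhost:8000"):
    # Fire the request and return the response body; good enough for testing.
    with urllib.request.urlopen(build_login_request(base_url)) as response:
        return response.read()
```

It breaks the moment the login form changes, but that is fine: the point is to save the tester a few minutes on every session, not to build durable automation.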


Agile will fail

Quite a few Agile consultants tend to implement Agile/Scrum by the book, without looking at the actual context. Most of these consultants define an approach that needs all layers of an organisation to be involved. At each of these layers, a change process is needed to make a good transition to Agile. Yet guidance for the large changes that are needed is usually only supplied at the team level. This leaves middle and upper management, after a couple of awareness sessions, unable to deal efficiently with Agile/Scrum. After the Agile consultants have left (most of the time they are too expensive to stay for a long period), the company has to carry on working with the mess that is left behind: a process with little Agility due to the remains of the “old process”.

Teams know what they should do, but cannot act, since backlog items are not provided in the correct manner or upper management still keeps a very strict release calendar. There is no freedom of choice, and team members are forced back into functions instead of roles. Management still thinks in these functions too, and will judge people on their functions instead of on their performance and the value they add as a team. People who are not able to work in an Agile setting are kept on board for their “critical knowledge”, but by not sharing that knowledge they hinder the Agility of the process and the team.

Another mistake often made by companies is removing management (project management, test management, …). Especially for larger projects, management and a senior architect are still needed. Successful projects need people who keep a “helicopter view” to make sure that all the parts will work together. You need to know all the connections with other systems (and parts of systems) to account for the risks involved. Large parts of risk mitigation need to happen at this level and will fail without proper management.

Yet another strange phenomenon is not adapting the organisation to the Agile manifesto. The four values are so clear, yet in the transition to Agile the documentation required by various parts of the organisation is usually not re-evaluated. Most maintenance departments still demand exhaustive documentation. Yes, of course there needs to be some documentation, but in most cases much less documentation would still deliver the essential knowledge.

For Agile to succeed properly, test automation should be an essential part of every sprint. An often-heard claim about test automation is: “We don’t do it, because it is too difficult in our setting.” This usually proves to be false… Teams do not realise that the changed context has an impact on previous decisions about test automation. Often people simply do not want to invest in test automation, because the benefits are not made clear enough; in the short term the business only sees costs. If you want test automation to be supported, define a business case! Describe what the initial costs will be and when test automation will start to pay off. Above all, make a plan for introducing test automation. It is not just using a tool, but a selection process: what to automate, and which tool is most suitable in your context.
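The pay-off part of such a business case can be a very simple calculation. The sketch below is a toy break-even estimate; all numbers are invented and should be replaced with your own estimates of build cost and time per run.

```python
# Toy break-even calculation for a test-automation business case.
# All numbers are invented for illustration; plug in your own estimates.
import math

def break_even_runs(build_hours, manual_hours_per_run, automated_hours_per_run):
    """Test runs needed before the automation has earned back its build cost."""
    saving_per_run = manual_hours_per_run - automated_hours_per_run
    if saving_per_run <= 0:
        raise ValueError("automation must save time per run to ever pay off")
    return math.ceil(build_hours / saving_per_run)

# Example: 40 hours to build the suite, a manual run costs 8 hours,
# an automated run costs 1 hour -> the automation pays off after 6 runs.
print(break_even_runs(40, 8, 1))
```

If the suite runs at least once per sprint, this makes it easy to say in which sprint the investment starts to pay off.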

Of course I don’t think that Agile will fail; this is merely a reflection of what I see happening in many companies: a rigid, by-the-book implementation with no room for context or Agility. Luckily I have also seen quite a few good implementations of Agile/Scrum.

Do you recognise these problems, or do you have problems to add?

Monitor or test?

The webinar that Ruud Teunissen presented was based on the architecture of Cloutest. It was great to hear Ruud explain the approach! I feel very proud to have contributed to this approach and to be one of the authors of the book. The cloud is hot, as proven by the large number of questions that were asked at the end of the webinar. I want to compliment Ruud on how he handled them! As a result of this webinar, I hope people realise that testing really needs to adapt to the way software is used nowadays. Cloud computing introduces a lot of uncertainty, but by selecting the right service and supplier you can already mitigate many risks. Don’t forget to keep testing in production, to check whether “everything still works”. One of the questions suggested that this is monitoring, not testing. I think we need to broaden our definition of testing and make monitoring part of it. Realise that the maintenance department might not monitor the service, since it is maintained by the supplier… I think what needs to be done in production is testing! A continuous end-to-end test to check whether the processes still work: is that monitoring or testing? Doesn’t the word “test” in E2E test imply a testing activity?

Edit: This topic is also active at the Cloutest site.

Attended the Testnet testing dojo

So, on the 4th of April I attended my first testing dojo. Fun to see that for most people it was their first time too. A large group of people showed up, but we had a relatively small room to work in.

After a very short introduction by Huib Schoots and Peter van Tulder, we started testing in groups. The test object was a (not too structured) website. Testing without any specification was harder than I expected. Where to start? What to test? What was the intention behind releasing this website? Without asking any of these questions upfront, we started… We had some structure for going through the website, but it was a really messy site. After half an hour of testing, all teams came together for a debriefing. People had found bugs, but were they bugs?

Before testing another website, we got some leads on how to focus on what to do. One suggestion was to use SFDEPOT to address different aspects of the software. Luckily the second test object had a clearer scope 🙂 We only got 15 minutes to test it, so we focussed on part of the functionality. Though we had more structure this time, I still had trouble creating structure in what to do. Luckily the partnering provided guidance, and in the end I felt we did the right things.

This dojo has opened my eyes on how much I rely on scripted testing and I will need to practice more on my exploratory skills. I had a great evening and hope Testnet will do this more often (hopefully in smaller groups)!