Growing need for technical testers

This is a translation of my article for the “Testkrant Jaargang 2012 Editie 1”. The article is based on my experiences at Polteq.

The world is changing, and this sets new requirements for testers, especially on the technical side of testing. Cloud computing enables us to use functionality from the cloud. For cloud computing we need to test whether the functionality of the cloud service correctly supports the business processes, but we also need technical tests to check whether the service integrates correctly with the existing systems. We also need to monitor the service in production using test automation, which requires technical skills.

The other big “hype” that requires technical testers is Agile. By using multidisciplinary teams, the tester gets closer to the developer. Direct communication between these two roles will increase, since less documentation is used. To enable good communication, testers will need to know something about development and vice versa. The iterative and incremental character of Agile requires regression tests to be executed more often, which demands more test automation. This article describes some situations for cloud, Agile and test automation that clarify the need for technical testers.

Cloud

A growing number of companies use services from the cloud (e.g. e-mail, CRM or environments). All these services are provided via the internet and need to be integrated into the current business processes. Testing the communication with such a service is a technical task. The focus is on the messages used to communicate with the service, which requires looking below the GUI.
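
A minimal sketch of what such a message-level check could look like, here with JUnit and plain HttpURLConnection; the endpoint, the JSON field and the expected response are made up for the example.

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;

    public class CrmServiceMessageTest {

        // Hypothetical endpoint of the cloud CRM service; not a real URL.
        private static final String CUSTOMER_URL =
                "https://crm.example.com/api/customers/42";

        @Test
        public void customerLookupReturnsOkAndJson() throws Exception {
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(CUSTOMER_URL).openConnection();
            connection.setRequestMethod("GET");
            connection.setRequestProperty("Accept", "application/json");

            // We check the message exchange itself, not the GUI on top of it.
            assertEquals(200, connection.getResponseCode());
            assertTrue(connection.getContentType().startsWith("application/json"));

            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(connection.getInputStream()))) {
                // A very coarse check on the payload, just for the example.
                assertTrue(reader.readLine().contains("\"customerId\""));
            }
        }
    }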

Using cloud services has an impact on internal development projects. Whenever we need to test chains of events after internal changes, a connection to the (live) service is often unwanted. So we need a stub or mock that simulates the cloud service, allowing us to test our processes without touching the production version of the cloud service.
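
A minimal sketch of such a stub, assuming the internal code talks to the cloud service through an interface of its own (all names are made up for the example).

    // Interface the internal code uses to talk to the (cloud) service.
    public interface ExchangeRateService {
        double rateFor(String fromCurrency, String toCurrency);
    }

    // The stub (in its own file) returns fixed answers, so chain tests never
    // touch the production version of the cloud service.
    public class ExchangeRateServiceStub implements ExchangeRateService {

        @Override
        public double rateFor(String fromCurrency, String toCurrency) {
            // Fixed test data instead of a live call over the internet.
            if ("EUR".equals(fromCurrency) && "USD".equals(toCurrency)) {
                return 1.25;
            }
            return 1.0;
        }
    }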

A service comes with agreements and a contract with the supplier about performance and other aspects of the service in production. We need to check in production whether the supplier lives up to these agreements. This calls for frequent (or even continuous) tests that monitor the service. This implies test automation, which will be covered further on in this article.
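
As a sketch of what such a monitoring test could look like, the check below verifies availability and an (assumed) agreed maximum response time of two seconds; the status URL is invented for the example.

    import java.net.HttpURLConnection;
    import java.net.URL;

    // A minimal production check: is the (made-up) status endpoint reachable
    // and does it answer within the agreed response time of two seconds?
    public class ServiceLevelCheck {

        public static void main(String[] args) throws Exception {
            URL statusUrl = new URL("https://crm.example.com/api/status");

            long start = System.nanoTime();
            HttpURLConnection connection =
                    (HttpURLConnection) statusUrl.openConnection();
            int responseCode = connection.getResponseCode();
            long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

            boolean withinAgreement = responseCode == 200 && elapsedMillis <= 2000;
            System.out.println("Status " + responseCode + " in " + elapsedMillis
                    + " ms, within agreement: " + withinAgreement);

            // A scheduler (cron, a CI server, a monitoring tool) could run this
            // check every few minutes and raise an alarm when it exits non-zero.
            if (!withinAgreement) {
                System.exit(1);
            }
        }
    }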

Agile

Agile is based on four simple principles:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

The importance of interaction can be seen within the multidisciplinary teams. Less documentation can only work if the team members communicate well about what they are doing. To understand what the other team members are doing, you need to know about the other disciplines. We expect our testers to have a basic knowledge of programming and some basic programming skills. This helps when discussing the software architecture and possible solutions, which are defined by the team.

The short development cycles force us to test incomplete products. The tester will need to apply drivers, stubs and mocks to be able to test at all. The need for these supporting facilities should be signalled (and preferably they should also be created) by the testers.

The working software that is delivered every iteration needs to work together with the previously delivered items. Each iteration the functionality grows, and so does the regression test set. The set keeps growing, but the time to execute it remains limited. The frequent execution of the regression test set, combined with this time pressure, makes a good business case for test automation. You should not think too lightly of test automation within Agile.

Another requirement resulting from the frequent deliveries is good version and configuration management. The testers will need to help make this a success. When and how do we integrate? Which demands do we have for the different environments? Technical knowledge helps to get a better and clearer view of what should happen.

Test automation

There is no need to explain that test automation has technical aspects. Since it is an important part of testing with cloud computing and within Agile, test automation deserves some special attention.

Test automation can occur at different levels. The foundation for test automation is laid with the unit tests. These unit tests are generally created by the developers, but that does not mean testers are not involved. We can help the developers apply test design techniques, since we have more knowledge of testing. Pair programming of a tester with a developer can be of value here: the tester can assist the developer in thinking about which test cases need to be created. In addition to assisting, we can also review the unit tests, provided we have enough knowledge of and skills in the programming language.
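
As an example of bringing a test design technique into unit tests, the sketch below applies boundary value analysis around a made-up 100 euro discount limit; the DiscountCalculator class is invented for the example.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class DiscountCalculatorTest {

        private final DiscountCalculator calculator = new DiscountCalculator();

        @Test
        public void noDiscountJustBelowTheBoundary() {
            assertEquals(0.0, calculator.discountFor(99.99), 0.001);
        }

        @Test
        public void noDiscountOnTheBoundaryItself() {
            assertEquals(0.0, calculator.discountFor(100.00), 0.001);
        }

        @Test
        public void discountJustAboveTheBoundary() {
            assertEquals(5.0, calculator.discountFor(100.01), 0.001);
        }
    }

    // Hypothetical production code under test (normally in its own file).
    class DiscountCalculator {
        double discountFor(double orderAmount) {
            return orderAmount > 100.00 ? 5.0 : 0.0;
        }
    }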

Functional test automation is closer to the testers. With Agile development the graphical user interface keeps changing to meet the (changing) requirements, so successful automation needs to hook in at a layer below the GUI. Once again we need to know how to work with the code to be able to test the functionality. When automation does use the graphical user interface, that GUI must be driven in some manner. When we use FitNesse or Cucumber to achieve this, it looks like little or no technical work, but we do need some programming: both FitNesse and Cucumber need an interpretation layer (glue code) to talk to the test object. Although both come with a lot of standard functions and options, they almost never fit your specific situation completely. Therefore we need to add some functionality, which has to be programmed. More and more, testers are expected to be able to do this.
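
To give an idea of the kind of glue code involved, the sketch below is a FitNesse (Slim) decision table fixture for a made-up currency conversion table with input columns “from”, “to” and “amount” and an output column “converted amount?”; the CurrencyConverter class stands in for the code that calls the test object.

    // Fixture behind a FitNesse decision table; the setters receive the input
    // columns, the method below answers the "converted amount?" output column.
    public class ConvertCurrency {

        private String from;
        private String to;
        private double amount;

        public void setFrom(String from) { this.from = from; }
        public void setTo(String to) { this.to = to; }
        public void setAmount(double amount) { this.amount = amount; }

        public double convertedAmount() {
            return new CurrencyConverter().convert(from, to, amount);
        }
    }

    // Minimal stand-in for the code that talks to the test object below the GUI.
    class CurrencyConverter {
        double convert(String from, String to, double amount) {
            return "EUR".equals(from) && "USD".equals(to) ? amount * 1.25 : amount;
        }
    }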

Multiple contexts require continuous testing: within Agile for integration and regression, with cloud for monitoring the supplier, and so on. The only way to achieve continuous (or at least very frequent) testing is through automation. Thinking about how to realise this and actually implementing it requires knowledge of architecture, programming and testing.

Conclusion

The software we create is getting more and more complex. This requires more technical knowledge not only from the developers, but also from the testers. In test vacancies we see requests for programming skills (preferably in multiple programming languages), and people expect testers to be able to discuss architectural problems found during development. The integration of different systems, as seen with cloud computing, requires testers to be able to read technical logging, use drivers, stubs and mocks, and so on. The closeness that we find in Agile teams implies more technical discussions and communication.

The tester needs to keep up in a quickly changing world, where Agile and cloud are becoming the standard. The only way to keep up with the speed of development is to use test automation. The tools needed for this should not only be used, but also understood and implemented. So yes, the need for technical testers is growing.

Keep learning and broaden your knowledge, so that we – as a test community – can fill this need!

Testing gets disqualified

Last week I read an article (in Dutch) which presented some negative views on testing and on IT people in general. “Just like other professionals they don’t like to take the blame.” Technology is unreliable and the -superior- human always has to fix it. The writer states that the root cause of 80 percent of the incidents is human failure. This seems strange; since the technology is developed and used by humans, I’d say this should be 100 percent… But the writer’s reasoning takes some more strange turns. “From humans we can expect that a failure is made every now and then, but from machines we may expect more.” I think the writer underestimates the humans in this case. Humans can recognise when something goes wrong and correct it, whereas a machine cannot think.

“All people involved seem to be stuck in suboptimal processes and don’t make use of modern technology.” Some people are indeed stuck in processes, and in some areas I think that is a good thing. But more and more development is Agile and will adapt. The next point actually felt insulting: “Instead of ‘fooling around’ IT people rather use the word ‘testing’.” So here we have a ‘manager’ disqualifying the work I love to do. I don’t have the feeling that he has any idea of what he is talking about.

“Let IT do what it’s good at: automate.” This is the only paragraph of the article where I found some recognition. Automation, more specifically test automation, is used too little in software development. I also see that in a lot of development processes people say that automation is too difficult or too expensive. By now we should understand that (test) automation requires an investment, but if you choose well what to automate, it will pay itself back.

I read the article just before I went away for a couple of days. I actually expected a lot of responses about how wrong the author was and how he made statements that are clearly his opinion, not based on any thorough research. Strangely enough, it got positive responses… So I just had to write about it.

Test automation in Agile

In Agile, and Scrum in particular, the whole team is responsible for delivering a high-quality product at the end of a sprint! At the end of a sprint, defects should not only have been found, but also fixed appropriately. It’s important to work efficiently, since creating software, testing it, fixing it and retesting it can be quite time consuming. Recently Martijn de Vrieze wrote two posts on test automation that provide relevant information on how to deal with test automation within Agile/Scrum (All my automated tests are passing, what’s wrong? and Throw-away test automation).

Martijn describes a couple of types of automation:

  • Test driven tests (supposed to fail until that part is developed)
  • Automated regression tests (supposed to pass)
  • Automation to support testing

Test driven tests

The test driven tests are an important part of the development. For this post I would like to stretch this definition a bit to “automated tests needed within the sprint”; to be more precise: unit tests, system tests and functional tests. When the team uses test driven development, developers are required to start with creating unit tests before creating functionality. Test driven development has been described a lot, so I won’t go into depth on this. Automating system tests can be done in the same style as the unit tests, dealing with “more than units”. It’s important to realise that automating these tests needs to be done with a risk-based approach: automating everything creates overhead, since not all of these tests need to be repeated often. Be aware that for automating functional tests, it is advisable to automate just below the user interface, since the user interface is the most likely part to change.
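
A minimal sketch of such a test driven test: the JUnit test below is written first and keeps failing until the (made-up) ShoppingCart is actually implemented during the sprint.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Written before the functionality exists; stays red until it is built.
    public class ShoppingCartTest {

        @Test
        public void totalIsSumOfItemPrices() {
            ShoppingCart cart = new ShoppingCart();
            cart.addItem("book", 12.50);
            cart.addItem("pen", 2.50);
            assertEquals(15.00, cart.total(), 0.001);
        }
    }

    // Skeleton the developer starts from during the sprint.
    class ShoppingCart {
        void addItem(String name, double price) {
            throw new UnsupportedOperationException("not implemented yet");
        }
        double total() {
            throw new UnsupportedOperationException("not implemented yet");
        }
    }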

Every test written within the sprint should be created to help achieve the Definition of Done. Therefore, at the end of the sprint, all these tests should pass! During the sprint, tests from previous sprints will start failing, since new functionality is added and the expected results change. If you have already created new tests for these situations, the old ones can be removed; make sure that everything the old test was checking is still checked in the new context. In other cases, you need to update the old test for the new context.

Automated regression tests

Regression tests are needed to check that the new software doesn’t break any existing software. Since these tests are run at least towards the end of every sprint (but preferably more often), they are a good candidate for automation. Of course the focus of the regression tests is different from that of the test driven tests: it is more on connectivity and interaction with other systems, and even on quality in an end-to-end context. When a regression test fails, there are a couple of options:

  • The context has changed and the test is no longer valid, then update or delete the test.
  • The newly created software has a defect which needs to be fixed this sprint.
  • The existing software shows a previously undiscovered defect (which needs to be fixed).
    • The product owner needs to decide who fixes it, when to fix it and whether the feature that shows the defect should go live.

Automation to support testing

To enable faster and more effective testing, testers may require automation. This type of automation does not need to be very robust. Think of a simple recorded script that gets you through the necessary steps of logging in to an application. Such a script only works with the credentials used while recording, but that might be just what you need: it can save you a lot of time and makes sure you can start testing the functionality that becomes available after logging in. Another helpful form of automation is extracting system information that is needed for defect logging. Quick and dirty automation solutions are good enough here, since the time invested in this automation needs to be earned back.
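
A sketch of such a throw-away login script, here with Selenium WebDriver; the URL, element ids and credentials are invented and hard-coded on purpose.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    // Quick and dirty login helper: gets a tester past the login screen so the
    // real testing can start. Deliberately not robust or configurable.
    public class LoginHelper {

        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            driver.get("https://app.example.com/login");
            driver.findElement(By.id("username")).sendKeys("testuser");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("loginButton")).click();
            // Leave the browser open so the tester can continue by hand.
        }
    }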

Agile will fail

What quite a few Agile consultants tend to do is implement Agile/Scrum by the book, without looking at the actual context. Most of these consultants define an approach that needs to involve all layers of an organisation. On all these layers, a change process is needed to make a good transition to Agile. However, guidance for the large changes that are needed is usually only supplied at the team level. This leaves middle and upper management, after a couple of awareness sessions, unable to deal efficiently with Agile/Scrum. After the Agile consultants have left (most of the time they are too expensive to stay for a long period), the company has to carry on working with the mess that is left behind: a process with little Agility due to the remains of the “old process”.

Teams know what they should do, but cannot act, since backlog items are not provided in the correct manner or upper management still keeps a very strict release calendar. Freedom of choice is not present and the team members are forced back into functions instead of roles. Management still thinks in these functions too and will judge people based on their function instead of on their performance and the value they add as a team. People who are not able to work in an Agile setting are kept on board for their “critical knowledge”, but they hinder the Agility of the process and the team by not sharing that knowledge.

Another mistake often made by companies is removing management (project management, test management, …). Especially for larger projects, management and a senior architect are still needed. Successful projects need people who keep a “helicopter view” to make sure that all the parts work together. You need to know all the connections with other (parts of the) systems to incorporate the risks involved. Large parts of risk mitigation need to happen at this level and will fail without proper management.

Yet another strange phenomenon is not adapting the organisation to the Agile manifesto. The four principles are so clear, yet during the transition to Agile the documentation required by various parts of the organisation is usually not re-evaluated. Most maintenance departments still demand exhaustive documentation. Yes, of course there needs to be some documentation, but in most cases the amount of documentation can be much lower while still delivering the essential knowledge.

For Agile to succeed properly, test automation should be an essential part of every sprint. An often heard claim about test automation is: “We don’t do it, because it is too difficult in our setting.” This usually proves to be false… The teams do not realise that the changed context also has impact on previous decisions about test automation. Often people simply do not want to invest in test automation, because the benefits are not made clear enough: in the short term the business only sees costs without benefits. If you want test automation to be supported, define a business case! Describe when test automation will start to pay off and what the initial costs will be. Above all, make a plan for how to introduce test automation. It is not just using a tool, but a selection process on what to automate and which tool is most suitable in your context.
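
A deliberately simple sketch of such a business case calculation; all numbers are invented placeholders and should come from your own context.

    public class AutomationBusinessCase {

        public static void main(String[] args) {
            double buildCostHours = 80.0;    // initial effort to build the automation
            double manualRunHours = 16.0;    // one manual regression run
            double automatedRunHours = 2.0;  // maintenance and analysis per automated run

            double savedPerRun = manualRunHours - automatedRunHours;
            double breakEvenRuns = buildCostHours / savedPerRun;  // 80 / 14 ≈ 5.7

            System.out.println("Test automation pays off after "
                    + (int) Math.ceil(breakEvenRuns) + " regression runs.");
        }
    }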

Of course I don’t think that Agile will fail, but this is merely a reflection of what I see happening in many companies: a rigid, by-the-book implementation with no room for context or Agility. Luckily I have also seen quite a few good implementations of Agile/Scrum.

Do you recognise these problems, or do you have problems to add?