Experiences at EuroSTAR 2012 (part 2)

In the previous post I described my experiences on the Tuesday of EuroSTAR 2012. In this post I will continue with the Wednesday.

Changing Management Thinking – John Seddon

A nice quote from this talk about changing management thinking: “The primary cause of failing is management”. Managers tend to make decisions that are not in the best interest of projects. For example, when you want to decrease costs, managers start managing on costs, which in most cases actually increases them. When you manage on value instead, costs more often go down. A very telling story that John told us was about chicken wings and spare ribs. Management of a large restaurant chain decided to replace the spare ribs starter with chicken wings, since chicken wings had a larger margin. Customers were disappointed and asked the waiters if they could get a small portion of ribs (still available as a main) as a starter. The waiters wanted to please their customers, so they said it was possible. Then the fun part started: the waiter had to enter the starter in the cash register, and since there was no spare ribs starter listed, they filed it under chicken wings. Management reads the registers, sees that chicken wings sell very well, and orders more chicken wings for their restaurants. A fine example of failing management.

Adventures in Test Automation – Breaking the Boundaries of Regression Testing – John Fodeh

John provided information on automated monkey testing. The presentation was supported by scenes from the IT Crowd to illustrate automation. Automated monkey testing proved to be an easy concept to understand: by randomizing each step, you are simulating a monkey operating the application. The problem, of course, is that it is easy to miss obvious defects, it does not effectively emulate real scenarios, and debugging long test runs can be quite a pain. They felt the need to create more intelligent monkeys by generating somewhat more predictable behavior via state tables with probabilities per action.
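
To make the idea concrete, here is a minimal sketch (in Java) of what such a probability-driven monkey could look like: each state has a weighted table of plausible next actions, and the monkey walks that table at random. All state names, actions and weights below are invented for illustration and are not from John's talk.

```java
import java.util.List;
import java.util.Map;
import java.util.Random;

// A "smarter monkey": instead of picking any action at random, each state has a
// weighted list of plausible next actions (a state table with probabilities per action).
public class MonkeyTester {

    record Transition(String action, String nextState, double weight) {}

    private static final Map<String, List<Transition>> STATE_TABLE = Map.of(
        "LoginScreen", List.of(
            new Transition("enterValidCredentials", "Dashboard", 0.7),
            new Transition("enterInvalidCredentials", "LoginScreen", 0.2),
            new Transition("clickForgotPassword", "ResetScreen", 0.1)),
        "Dashboard", List.of(
            new Transition("openReport", "ReportScreen", 0.5),
            new Transition("logout", "LoginScreen", 0.5)),
        "ReportScreen", List.of(
            new Transition("back", "Dashboard", 1.0)),
        "ResetScreen", List.of(
            new Transition("back", "LoginScreen", 1.0)));

    private final Random random = new Random();

    // Pick the next transition from the current state, respecting the weights.
    Transition pickNext(String state) {
        List<Transition> options = STATE_TABLE.get(state);
        double roll = random.nextDouble() * options.stream().mapToDouble(Transition::weight).sum();
        for (Transition t : options) {
            roll -= t.weight();
            if (roll <= 0) return t;
        }
        return options.get(options.size() - 1);
    }

    public static void main(String[] args) {
        MonkeyTester monkey = new MonkeyTester();
        String state = "LoginScreen";
        for (int step = 0; step < 20; step++) {
            Transition t = monkey.pickNext(state);
            // In a real run this is where the action would be executed against the
            // application and logged, so that long test runs can be debugged later.
            System.out.printf("step %d: %s --%s--> %s%n", step, state, t.action(), t.nextState());
            state = t.nextState();
        }
    }
}
```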

Evolving Agile Testing – Fran O’Hara

After a short introduction to Agile and Scrum, Fran started off with requirements. When we start to talk about user stories, we should try to find out the acceptance criteria for the story. This serves several goals:

  • Define the boundaries for a user story/feature
  • Help the product owner to find out what it is that delivers value
  • Help the team gain understanding of the story
  • Help developers and testers to derive tests
  • Help developers know when to stop adding functionality to a story

Fran reminds us to keep these acceptance criteria at a relatively high level: do not lose yourself in too much detail. Detailing will be done in, for example, wireframes, mockups or validation rules. Another place where we find detailing is in the automated acceptance tests. Try to find examples that support your acceptance criteria.
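
As an illustration of capturing such examples, here is a minimal sketch of automated acceptance tests in Java/JUnit, built around a hypothetical criterion "orders of 50 euro or more ship for free"; the Order class and the threshold are assumptions made up for the example, not from Fran's talk.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Each test pins down one concrete example that supports the acceptance criterion.
class FreeShippingAcceptanceTest {

    // Minimal implementation so the example is self-contained.
    static class Order {
        private final double total;
        Order(double total) { this.total = total; }
        double shippingCosts() { return total >= 50.00 ? 0.00 : 4.95; }
    }

    @Test
    void orderJustBelowFiftyEuroPaysShipping() {
        assertEquals(4.95, new Order(49.99).shippingCosts(), 0.001);
    }

    @Test
    void orderOfExactlyFiftyEuroShipsForFree() {
        assertEquals(0.00, new Order(50.00).shippingCosts(), 0.001);
    }
}
```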

Next Fran stresses the fact that we still need a test strategy in Agile. We need to think about the minimal tests in the sprints (automated unit, automated acceptance, manual exploratory) and sometimes need to do some additional testing, e.g. for non-functionals, feature integration or business processes. The testers themselves need to have broad knowledge (more than just testing) and deep knowledge of testing. This requires a ‘technical awareness’.

Testing of Cloud Services; The Approach: From Risks to Test Measures – Kees Blokland & Jeroen Mengerink

Kees and I presented Cloutest®, our approach to testing cloud services. We started off with an introduction to cloud computing to set the context, using the definition provided by NIST. After this we moved on to our approach. We identified 143 risks that arise when using cloud computing and grouped these risks into categories:

  • Performance
  • Security
  • Availability & Continuity
  • Functionality
  • Manageability
  • Legislation & Regulations
  • Suppliers & Outsourcing

Since 143 risks is quite a lot, we decided to give a limited set of examples and detail these. For instance, a performance risk arises because a cloud service usually has several customers: it is not only you putting load on the service, but also the users of other customers, and this will influence the performance of the service. Imagine your webshop hosted at the same hosting provider that hosts WikiLeaks… The huge amount of traffic that a new publication on WikiLeaks generates might make your webshop unavailable due to performance problems of the service.

With testing we provide measures to mitigate risks, so that is what we did too. The good news is that we can still use a lot of what we have learned over the years. Some techniques need to be tweaked to fit the cloud context, but they are very useful. Next to the tweaked measures, we also describe some new measures that we have used at our clients. We grouped the measures too:

  • Selection
  • Performance
  • Security
  • Manageability
  • Availability & Continuity
  • Functional
  • Migration
  • Legislation & Regulations
  • Production

How do you test the scalability of a cloud service? Providers promise scalable services and customers pay per use: if you need more, you will get more. With traditional load testing we gradually increase the load and see how the system responds. This can be applied to the cloud service too, but the service will scale. You can see the point where scaling starts in your response times: they will drop when more capacity is added to the service. Check around the boundary of the scaling point to see whether the billing scales as well.
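
A rough sketch of what such a stepwise load test could look like is shown below (Java; the endpoint and the step sizes are assumptions for illustration, not the tooling from our approach). The idea is to raise the number of concurrent virtual users step by step and watch where the average response time suddenly drops, because that is where the service scaled up and where the billing should be checked.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Stepwise load probe: increase concurrent users and record the average response
// time per step; a drop at a higher load level marks the scaling point.
public class ScalingPointProbe {

    private static final HttpClient CLIENT = HttpClient.newHttpClient();
    private static final URI SERVICE = URI.create("https://service.example.com/health");

    public static void main(String[] args) throws Exception {
        for (int users = 10; users <= 200; users += 10) {
            System.out.printf("%3d users -> avg %.0f ms%n", users, averageLatencyMillis(users));
        }
    }

    static double averageLatencyMillis(int users) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<Long>> results = new ArrayList<>();
        for (int i = 0; i < users; i++) {
            results.add(pool.submit(() -> {
                long start = System.nanoTime();
                HttpRequest request = HttpRequest.newBuilder(SERVICE).GET().build();
                CLIENT.send(request, HttpResponse.BodyHandlers.discarding());
                return (System.nanoTime() - start) / 1_000_000;
            }));
        }
        long total = 0;
        for (Future<Long> f : results) total += f.get();
        pool.shutdown();
        return (double) total / users;
    }
}
```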

We see that testing starts earlier, the scope is wider and testing does not stop once the service is in production.

Inspirational Talk: Sky is not the limit: Copenhagen Suborbitals – Peter Madsen

The inspirational talk was very nice, though not very test related. It showed us that with the right vision and perseverance you can reach goals that seem unreachable. Peter showed us how he built a homemade submarine and a homemade rocket.

Growing need for technical testers

This is a translation of my article for the “Testkrant Jaargang 2012 Editie 1”. The article is based on my experiences at Polteq.

The world is changing and this sets new requirements for testers, especially regarding the technical aspects of testing. Cloud computing enables us to use functionality from the cloud. For cloud computing we need to test whether the functionality of the cloud service correctly supports the business processes, but we also need technical tests to check whether the service integrates correctly with the existing systems. We also need to monitor the service in production using test automation, which requires technical skills.

The other big “hype” that requires technical testers is Agile. With multidisciplinary teams, the tester gets closer to the developer. Direct communication between these two roles will increase, since less documentation is used. To enable good communication, testers will need to know something about development and vice versa. The iterative and incremental character of Agile requires regression tests to be executed more often, which demands more test automation. This article describes some situations for cloud, Agile and test automation that clarify the need for technical testers.

Cloud

A growing number of companies uses services from the cloud (e.g. e-mail, CRM or environments). All services are provided via the internet and should be integrated into the current business processes. Testing the communication with the service is a technical task. The focus will be on the messages that are used to communicate with the service, which requires looking below the GUI.
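
As a small illustration, a test below the GUI could look like the sketch below: it sends the actual message to the service and checks the response that the business process relies on. The endpoint and the expected field are assumptions made up for the example.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import org.junit.jupiter.api.Test;

// Message-level test: no screens involved, only the HTTP traffic the integration uses.
class CrmServiceMessageTest {

    private final HttpClient client = HttpClient.newHttpClient();

    @Test
    void customerLookupReturnsWellFormedAnswer() throws Exception {
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://crm.example.com/api/customers/42")).GET().build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        assertEquals(200, response.statusCode());
        assertTrue(response.body().contains("\"customerId\""),
                "response should contain the customerId field used by the order process");
    }
}
```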

Using cloud services has an impact on internal development projects. Whenever we need to test chains of events after internal changes, a connection to the (live) service is often unwanted. So we need a stub or mock to simulate the cloud service, in order to test our processes without touching the production version of the cloud service.
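
A minimal sketch of such a stub follows, using the HTTP server that ships with the JDK; the port, path and canned response are invented for the example.

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Stand-in for the cloud service during chain tests, so the internal systems can be
// exercised without touching the production service.
public class CloudServiceStub {

    public static void main(String[] args) throws Exception {
        HttpServer stub = HttpServer.create(new InetSocketAddress(8089), 0);

        // Always answer the customer lookup with the same canned message, so the
        // chain test gets predictable data regardless of what the real service does.
        stub.createContext("/api/customers/42", exchange -> {
            byte[] body = "{\"customerId\":42,\"status\":\"active\"}"
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });

        stub.start();
        System.out.println("Cloud service stub listening on http://localhost:8089");
    }
}
```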

A service comes with agreements and a contract with the supplier about performance and other aspects of the service in production. We need to check in production whether the supplier lives up to the agreements. This creates a need for frequent (or even continuous) tests that monitor the service, which implies test automation; this is covered further on in this article.
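
As an illustration, a very small monitoring check could look like the following sketch; the endpoint, the 5-minute interval and the 2-second response time limit are assumptions, not real contract values.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Continuous production check: call the service on a schedule and compare the
// measured response time against the agreed limit from the contract.
public class ServiceLevelMonitor {

    private static final HttpClient CLIENT = HttpClient.newHttpClient();
    private static final URI SERVICE = URI.create("https://service.example.com/health");
    private static final Duration AGREED_LIMIT = Duration.ofSeconds(2);

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(ServiceLevelMonitor::checkOnce, 0, 5, TimeUnit.MINUTES);
    }

    static void checkOnce() {
        try {
            long start = System.nanoTime();
            HttpResponse<Void> response = CLIENT.send(
                    HttpRequest.newBuilder(SERVICE).GET().build(),
                    HttpResponse.BodyHandlers.discarding());
            Duration elapsed = Duration.ofNanos(System.nanoTime() - start);

            boolean withinAgreement = response.statusCode() == 200
                    && elapsed.compareTo(AGREED_LIMIT) <= 0;
            System.out.printf("status %s in %d ms -> %s%n", response.statusCode(),
                    elapsed.toMillis(), withinAgreement ? "OK" : "AGREEMENT VIOLATED");
        } catch (Exception e) {
            System.out.println("Service unreachable: " + e.getMessage());
        }
    }
}
```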

Agile

Agile uses four simple basic principles:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

The importance of interaction can be found within the multidisciplinary teams. Less documentation can only work if the team members communicate well about what they are doing. To understand what the other team members are doing, you will need to know about the other disciplines. We expect our testers to have a basic knowledge of programming and some basic programming skills. This helps when discussing the software architecture and possible solutions, which will be defined by the team.

The short development cycles force us to test incomplete products. The tester will need to use drivers, stubs and mocks to be able to test. The need to create or set up these facilities to support testing should be signaled (and preferably also fulfilled) by the testers.
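
A small sketch of what this can look like with a mocking library such as Mockito is shown below; the PaymentService and CheckoutService names are hypothetical, invented for the example.

```java
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

// Testing an incomplete product: the checkout logic is ready, but the payment
// component is still being built, so it is replaced by a mock.
class CheckoutServiceTest {

    interface PaymentService {
        boolean authorise(double amount);
    }

    // The part of the product that is already finished and needs to be tested now.
    static class CheckoutService {
        private final PaymentService payments;
        CheckoutService(PaymentService payments) { this.payments = payments; }
        boolean placeOrder(double amount) { return payments.authorise(amount); }
    }

    @Test
    void orderIsPlacedWhenPaymentIsAuthorised() {
        PaymentService payments = mock(PaymentService.class);
        when(payments.authorise(25.00)).thenReturn(true);

        CheckoutService checkout = new CheckoutService(payments);
        assertTrue(checkout.placeOrder(25.00));
    }
}
```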

The working software that gets delivered every iteration will need to work together with the previously delivered items. Each iteration the functionality increases, which is also reflected in the size of the regression test set. This set grows, but the time to execute it remains limited. The frequent execution of the regression test set, together with the time pressure, makes a good business case for test automation. You should not think too lightly of test automation within Agile.

Another requirement as a result of the frequent deliveries is good version and configuration management. The testers will need to help make this a success. When and how do we integrate? Which demands do we have for the different environments? Technical knowledge will help to get a better and clearer view of what should happen.

Test automation

No need to explain that test automation has technical aspects. Since test automation is an important part of testing with cloud computing and within Agile, test automation deserves some special attention.

Test automation can occur at different levels. The foundation for test automation is laid with the unit tests. These unit tests are in general created by the developers, but this does not mean that testers are not involved. We can help the developers apply test design techniques, since we have more knowledge of testing. Pair programming of a tester with a developer can be of value here: the tester can assist the developer in thinking about which test cases need to be created. In addition to assisting, we can also review the unit tests, provided we have enough knowledge of and skills in the programming language.
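
As an example of bringing a test design technique into unit tests, here is a sketch of boundary value analysis in JUnit around a made-up rule “a discount applies from 10 items upward”; the Discount class is invented just to keep the example self-contained.

```java
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

// Boundary value analysis: test just below, on, and just above the boundary.
class DiscountBoundaryTest {

    // Minimal implementation so the example is self-contained.
    static class Discount {
        static boolean applies(int items) { return items >= 10; }
    }

    @Test void justBelowBoundary() { assertFalse(Discount.applies(9)); }
    @Test void onBoundary()        { assertTrue(Discount.applies(10)); }
    @Test void justAboveBoundary() { assertTrue(Discount.applies(11)); }
}
```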

Functional test automation is closer to the testers. With Agile development the graphical user interface keeps changing to meet the (changing) requirements, so successful automation needs to target a layer below the GUI. So once again we need to know how to work with the code to be able to test the functionality. When automation does use the graphical user interface, this GUI must be manipulated in some manner. When we use Fitnesse or Cucumber to achieve this, it seems like little or no technical work, but we do need some programming: both Fitnesse and Cucumber need an interpretation layer to talk to the test object. Although both come with a lot of standard functions and options, they almost never fit your specific situation completely. Therefore we need to add some functionality, which needs to be programmed. More and more, testers are expected to be able to do this.
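
To give an idea of that interpretation layer, here is a sketch of Cucumber glue code in Java; the step texts and the Basket class are invented, and the annotations shown are from the current cucumber-java packages (the package names were different back in 2012).

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

// Glue code that connects Gherkin steps such as "When I add 3 books to the basket"
// to the test object; this is the part that almost always needs to be programmed.
public class BasketStepDefinitions {

    // Minimal test object so the example is self-contained.
    static class Basket {
        private int items;
        void add(int count) { items += count; }
        int size() { return items; }
    }

    private Basket basket;

    @Given("an empty basket")
    public void anEmptyBasket() {
        basket = new Basket();
    }

    @When("I add {int} books to the basket")
    public void iAddBooks(int count) {
        basket.add(count);
    }

    @Then("the basket contains {int} items")
    public void theBasketContains(int expected) {
        assertEquals(expected, basket.size());
    }
}
```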

Multiple contexts require continuous testing: within Agile for integration and regression, with cloud for monitoring the supplier, etc. The only way to achieve continuous (or at least very frequent) testing is automation. Thinking about how to realise this and actually implementing it requires knowledge of architecture, programming and testing.

Conclusion

The software we create is getting more and more complex. This requires more technical knowledge not only from the developers, but also from the testers. In test job postings we see requests for programming skills (preferably in multiple programming languages), and people expect testers to be able to discuss architectural problems found during development. The integration of different systems, as seen with cloud computing, requires testers to be able to read technical logging, use drivers, stubs and mocks, etc. The close collaboration that we find in Agile teams implies more technical discussions and communication.

The tester needs to keep up in a quickly changing world, where Agile and cloud are becoming the standard. The only way to keep up with the speed of development is to use test automation. The tools needed for this should not only be used, but also understood and implemented. So yes, the need for technical testers is growing.

Keep learning and broaden your knowledge, so that we – as a test community – can fill this need!

Testing in China

For the last 9 days I have been in Shanghai together with Martin Pol for Polteq. Luckily the typhoon didn't reach the part of Shanghai we were in; we had some wind and rain, but nothing too extreme. We presented some tutorials and tracks at ChinaTest 2012, but also visited a couple of companies to talk about testing, so there was a lot to do in very little time. At one company you speak with a selected group of a few people, whereas at other companies you have a room filled with about a hundred people and a video connection to other offices where another hundred people are attending your presentation on testing. No matter what the setting was, we would get a lot of questions, since everyone wanted to learn as much as they could from our visit. This was also the case at the conference.

Together we presented a full-day tutorial on structured testing. It was great to notice how people were a bit hesitant to ask questions at first, but dared more and more during the day. Some concepts that require little explanation in Europe required more in China, and the other way around too. The next day of the conference Martin had one half-day tutorial on compact test process improvement and I had two half-day tutorials, both dealing with Agile testing. The first one was on how to work as a tester in Scrum teams. At some points in the tutorial I had groups of people discuss a topic together and then present their findings to the whole group. Usually I can follow what people talk about and try to guide them a bit in the direction I want them to think, but my Chinese is not good enough to understand the discussions 😉 So I just had to trust that they were talking about the topics I wanted them to. When we started discussing the topics in English, it became clear that they did. The other tutorial was on how to improve testing in Agile (mostly Scrum) settings. What struck me was that most questions dealt with automation and tools and not the actual job of testing.

The third day of the conference started with keynotes from Martin Pol and Richard Bender. After these keynotes we had to do some more company visits. On the last day of the conference we both gave some presentations. I presented on how to test mobile apps and on the Polteq approach for testing cloud computing: Cloutest. Both of these topics proved to be very popular! Martin presented on outsourcing for a crowd that seemed to grow during the presentation. The conference ended with a set of lightning talks, in which Martin argued that we need to preserve testing skills and I provided some quick insights into Agile.

Usually I try to provide some information on the different presentations I attended at a conference, but since these were all in Chinese there is not much that I can say about them.

I’d like to thank the organizing committee for a great conference and hope to be able to come back for the 2013 edition!

Test conferences

The number of events that deal with software testing seems to increase. This globalisation is great news for the testing community: more and more opportunities to share knowledge and learn from others. Last year my presentation on how to test mobile apps made it to several stages, and this year the release of the book Cloutest provides some opportunities for presentation slots. A presentation on Cloutest has been accepted for EuroSTAR and for ChinaTest. Maybe some more events will follow.

Next to Cloutest, my stage appearances will focus on Agile. China is starting to go Agile, and ChinaTest has accepted two Agile proposals next to my Cloutest presentation. These will deal with how to work as a tester in Agile teams and how to improve testing in Agile settings. The Testnet fall event also focuses on Agile; the proposal that got accepted for Testnet deals with Agile and outsourcing.

It’s great to have so many proposals accepted; now to prepare each presentation! For the Agile part, there are some basics that need to be covered in all the presentations, so let’s think about reusability 🙂 Working at Polteq provides a great setting to discuss various testing topics, and it’s great to get a lot of support and help. For Agile we recently started a discussion group, which helps us see common problems and different solutions in Agile settings. Every meeting is an eye-opener for me and I hope we will have many more informative meetings to come.

Monitor or test?

The webinar that Ruud Teunissen presented was based on the architecture of Cloutest. It was great to hear how Ruud talked about the approach! I feel very proud to have contributed to this approach and to be one of the authors of the book. The cloud is hot, which was proven by the large number of questions that were asked at the end of the webinar. I want to compliment Ruud on how he handled the questions! As a result of this webinar, I hope that people realise that testing really needs to adapt to the way software is used nowadays. Much uncertainty is introduced with the use of cloud computing, but by selecting the right service and supplier you can already mitigate a lot of risks. Don’t forget to keep testing in production, to check whether “everything still works”. One of the questions mentioned that this is monitoring and not testing. I think we need to broaden our definition of testing and make monitoring a part of it. Realise that the maintenance department might not monitor the service, since it is maintained by the supplier… I think what needs to be done in production is testing! A continuous end-to-end test to check whether the processes still work: is that monitoring or testing? Doesn’t the word test in E2E test imply a testing activity?

Edit: This topic is also active at the Cloutest site.