Issue 28

Software Testing World Cup

Lavinia Cazacu
Quality Assurance Manager
@Hewlett Packard



Irina Savescu
Quality Assurance Engineer
@Hewlett Packard



TESTING

The Software Testing World Cup is an international software testing competition, now in its second year. It is run in two stages: the continental preliminaries (Africa, Asia, Europe, North America, South America and Oceania), where anyone can sign up, and the world finals, held during the Agile Testing Days conference in Berlin. This year the preliminaries were held between April and July, and the finals will take place in mid-November.

Irina was the first to hear about the competition, and we all jumped at the idea. None of us had ever participated in a testing competition, and we gladly took on the challenge: it sounded fun and exciting, and a good personal and team experience. So, on June 13, 2014, we (Irina, Ileana, Sanda and Lavinia) took part in this unusual contest, in which software testing teams from across Europe had to test an application for three hours.

Each team competing in the European round had the same application to test. The organizers set up a dedicated bug tracker project for each team to submit the bugs they found, and at the end of the testing session we had to send out a test report in the format of our choice. Out of more than 100 teams attending the event, ours was the winning team.

Even if winning isn’t everything, it surely made our day, as we didn’t really expect it. Once we registered, the hard teeth of reality took a good bite out of our excitement. There wasn’t any time to meet, there were just too many things to polish, and the contest hours were too late for us to be efficient (7 PM on a Friday?). In the end, all of us need to do our day jobs and get on with our lives, don’t we? But we did meet a couple of times, we did look around a bit, and we even planned a few aspects. Not too many, just enough to feel that we were serious about it. But not too serious, so it wouldn’t be too painful if we had no results.

The day of the contest is never as you wish it to be. Still, we managed to draft the report we were all happy to use, and had a lovely dinner right before the contest started (having fun is still the best energizer we know). After that, it was an intense three-hour struggle, at the end of which we felt like anything but a winning team. Throughout those hours, all we really wanted was to do more and more. In our eyes, we had failed. We had trouble getting the right grip on the application, and we were too absorbed in our own ideas, always chasing the thing we had just noticed.

The application under test was a sales tool demo, a concept new to us, and the business logic was not very intuitive. What’s more, it had dependencies on real data, and there was no way to generate test data on the spot. We had to google for real companies in specific countries, understand the formats specific to those countries (legal entity names, contact details etc.) and integrate those rules into our testing scenarios. Once you entered the details of a real business and the tool recognized it, it offered the possibility to generate a profile for that business, incorporating details like the business’s online footprint, competitors and weak areas.

The profile generation took around 10 minutes, and in a time-boxed testing session you just can’t afford to wait for a process to complete. Many times we forgot to check it right away, and many of our testing flows died without our being able to close the loop and assess the profile data. There was just no time to wait for results, and we sacrificed aspects that we would normally never ignore. The whole application was buggy, and some of the bugs were so obvious that you didn’t want to waste time reporting them, but rather continue the hunt. Even if it was a real, production application, it felt like it had been specially prepared for the event, spiced up with all sorts of small issues.

There were a few areas where we could have gone deeper than the surface and explored the business logic, but time, again, was not on our side. The usability could also be improved: the application targeted dedicated salespeople as its main users, but the ramp-up time could be significantly reduced with a few standard improvements (hints and help, fluent screen navigation).

A key aspect of the contest was the test report we needed to submit at the end of the session. This could be prepared in advance, but how much can you anticipate upfront? We were looking at three hours of exploratory testing on an application we did not know. The content of a test report is always debatable; there are templates that capture what you did along with what you achieved, but in the end you just need to know your audience and tailor your data to their expectations. We chose to focus explicitly on the deliverables of the session, our feedback and bug statistics, and kept it as simple as possible. We produced four slides for our report: one with our overall feedback on the application; one with the results for the features we tested (number of bugs, time spent on each area and explicit feedback for each section); one dedicated to bugs, with a graph highlighting their severity and type (functional, performance, UI); and a last one with our test environment, because even though it was not requested, we thought it relevant to reveal which browsers and devices we used in our testing. We did not give more details than this, and if there is one thing we were happy with after the event, it was the report.

Looking back, and ignoring that confident voice that keeps saying “you’re the best”, we wonder what really made us win. We are no testing experts, and will never pretend to be. The testing community is overwhelming: you always find people more passionate than you, more creative and sharper than you’ll ever be. It’s a job where quick thinking makes a difference and where you need to differentiate yourself continuously. As software testing professionals, you just need to do it better, smarter and faster than the rest.

In the heat of the moment, the time pressure made us doubt our results. We felt uncomfortable and insecure about our output. We missed the security of being structured, of being aware of our coverage, and of being able to take calculated decisions. Every now and then, we would breathe a little and bounce an idea around. We kept a few flexible boundaries and did quick two-second checks to see whether we were OK with what we were doing. There isn’t much you can put in place in such a short time; it’s important to trust each other and follow what you feel is right.

Everything was fast-paced that day, and even though testing traditionally requires extensive work, we needed to find new ways to deliver solid results in a very short time span.

Coming back to the contest, and reading what others shared online about their experience, we discovered that we were not the team with the highest number of defects, nor the one with the most critical ones. We didn’t even win any of the special prizes. So how did we really win? It is a question that wanders through our minds from time to time, and one answer we all believe in is that when you are into something, it shows!

We come out of this first competition having learned two things. The first is that we are spoiled in our day-to-day jobs and don’t even know it: we have plenty of time to build our testing strategy and analyze the product in enough depth. The second is the importance of being able to trust and easily communicate with your teammates in critical situations; without that, we would have wasted precious time and never met our deadline.

Most importantly, the Software Testing World Cup preliminaries whetted our appetite for this kind of exercise; they made us reflect on the areas where we would like to deepen our knowledge and most certainly enriched our testing style.

Our team will take part in the finals of this competition in November. Regardless of the result, we feel like we’ve already won, and we would like to encourage all testing professionals to enter this type of contest, where the experience gained is far more valuable than the prizes or the ranking!
