TSM - Performance testing from Waterfall to Agile

Claudiu Gorgan - Senior Delivery Service Engineer

Like any other success story, our Performance Testing story mixes people and processes. Different companies are built upon different cultures, so they implement different mixtures of people and processes: experienced teams with dedicated performance testers might get by with guidelines alone as a process, while more agile teams that rotate the performance tester role among team members will probably need more detailed processes and checklists, so that the entire performance testing flow stays consistent from one sprint to another and the results offer the same level of confidence.

The next few lines will focus mostly on the new, Agile-like workflow, but will also highlight the strong base of the entire process, one that has been built over many years and has involved many skilled and experienced people.

The waterfall bit

In many cases, the performance testing process would have looked like the one below: a performance testing team (the perfqa team) giving the final sign-off before a product went live.

Even though the perfqa team would have been involved in the early stages of a product, through the perfqa kick-off, we were not part of the sprints, not sitting next to the developers, and not even led by the same delivery manager who was driving the product implementation and the development team. All of the above generated a few inconsistencies with the agile workflow that was guiding the development teams.

The agile wonder

At some point, the perfqa process started to change in order to solve the above issues. As such, the performance champions concept was born. This whole new thing was not something extraordinary; it was more like a community whose members were developers or testers from different delivery teams. These performance champions were now directly involved in the perfqa workflow, bringing performance testing and the responsibility for it closer to the delivery teams, while the original perfqa team mentored, coached and trained the performance champions.

As a supporter – and to some extent a driver – of the performance champions concept, I will present a few of my thoughts the way Clint Eastwood would have presented them:

The good

As development, functional testing, project management, business analysis and DevOps are all part of the team, adding performance testing to the mix closes the loop and makes the delivery team fully responsible for the product it delivers – which is also empowering for people.

Performance testing can now be part of sprint planning and can be managed in whatever way suits each team best.

New challenges become available, which help people expand their field of expertise – performance testing strategies, tools, tuning and monitoring.

The bad

It is worth considering the difference between an expert in a certain field (i.e. a performance testing expert) and a more versatile person who is responsible for several areas (i.e. development and performance testing, or functional testing and performance testing).

The ugly

As performance testing now resides within each team’s responsibility, teams will eventually adapt it to their own needs and beliefs – which might not sound like an ugly thing, but at the end of the day we would all like a consistent understanding of a component’s performance (100 transactions per second – tps – should mean the same thing to all of us), and we would all want an integrated, well-performing product, not just a bunch of great-performing components.
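
To make the point about a shared definition concrete, here is a minimal sketch in Python (the TransactionResult and throughput_tps names are purely illustrative, not part of any real perfqa tooling) of how a common helper could pin down one meaning of throughput – successful transactions only, measured over an agreed steady-state window – so that 100 tps is computed the same way no matter which delivery team runs the test:

from dataclasses import dataclass
from typing import Iterable

@dataclass
class TransactionResult:
    timestamp: float   # seconds since the start of the test run
    success: bool      # whether the transaction completed correctly

def throughput_tps(results: Iterable[TransactionResult],
                   warmup_s: float, steady_s: float) -> float:
    """Transactions per second, counting only successful transactions
    that fall inside the agreed steady-state window."""
    window_end = warmup_s + steady_s
    passed = sum(1 for r in results
                 if r.success and warmup_s <= r.timestamp < window_end)
    return passed / steady_s

# Example: every team agrees on a 30 s warm-up and a 300 s steady state,
# so a reported "100 tps" means the same thing regardless of who runs the test.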

Performance testing environments will now need to be maintained by different parts of the organization.

The supporters of the performance champions concept/community will need to at least try to change the above – of course, we won’t try to change what is already working fine, The good.

The expertise can consist of the following: