In an article here on TSM earlier this year, I tried to explain what Quality Assurance is and how it fits into an agile environment.
Today I will move forward and describe the first steps in defining and implementing a Quality Management process in a software development project, be it agile or not.
As with any other managerial activity, it should start with the goal and a plan detailing:
every activity to be taken in order to reach the goal;
the resources that are needed to fulfill the activities;
the time frame;
monitoring measures - to make sure everything goes in the right direction and stays on the agreed track.
Success criteria and return on investment are also recommended to be part of this plan.
In very few words, the quality of a product or service is what differentiates your work from your competitors', the one thing that makes your product or service stand out from the crowd. Quality should be the ultimate goal when building anything - everything else is just a means (or support tool) to achieve it. Quality is a sum of various quality objectives or criteria that must be clearly stated and agreed upfront - within the team and, if possible, together with the customer - as early as possible in the project's lifecycle. It goes without saying that these objectives should be SMART (specific, measurable, achievable, relevant and time-bound) in order to be useful.
Quality objectives are useful tools for the project team to evaluate and improve its own way of working and its work products before they are completed and before any defects or inconsistencies are delivered to the customer.
They are not to be confused with the larger group of non-functional requirements, also referred to as quality attributes. NFRs are supposed to address (whether explicitly stated or not!) a user need, such as uptime or the number of concurrent users supported by a web application - and that is the subject of the project's architecture and infrastructure management.
Some examples of such quality objectives are:
"reduce the number of bugs by 25% during the next quarter" - assuming that the number of bugs reported during the previous quarter is known;
"keep re-work time below 25% of the total time spent on development in every sprint";
An example of a bad quality objective would be something like "Reduce the number of bugs by 25% in every sprint", as it is not really achievable or time-bound - compare it with the first example above!
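Objectives phrased this way can be checked mechanically at the end of each quarter or sprint. Here is a minimal sketch; the function names and all figures are hypothetical, for illustration only:

```python
def bug_reduction(previous_quarter: int, current_quarter: int) -> float:
    """Percentage reduction in reported bugs versus the previous quarter."""
    return (previous_quarter - current_quarter) / previous_quarter * 100

def rework_ratio(rework_hours: float, total_dev_hours: float) -> float:
    """Share of a sprint's development time spent on re-work, in percent."""
    return rework_hours / total_dev_hours * 100

# "reduce the number of bugs by 25% during the next quarter"
# (hypothetical figures: 80 bugs last quarter, 56 this quarter)
print(bug_reduction(previous_quarter=80, current_quarter=56) >= 25)  # True

# "keep re-work time below 25% of the total time spent on development"
# (hypothetical figures: 30 h of re-work out of 160 h in the sprint)
print(rework_ratio(rework_hours=30, total_dev_hours=160) < 25)       # True
```

Because the baseline (last quarter's bug count, the sprint's total hours) is known, both objectives are measurable and time-bound - exactly what the per-sprint bug-reduction objective above lacks.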
What activities are to be taken in order to reach the goal(s) defined by the quality objectives? They can be grouped into three categories: preventive actions, appraisal activities and fix/repair activities.
Preventive actions include training sessions - so your staff will perform better, making the most of their tools and knowledge; R&D activities - so your team will stay at the cutting edge of their field; and planning, monitoring and measurement activities - so you will be able to detect deviations, observe trends and take corrective actions before inconsistencies in your work products or way of working turn into issues.
Appraisal is an objective evaluation or check of something against an agreed set of standards, expectations or model. Appraisal activities include reviews, tests and audits.
Any work product of a team can be appraised: requirements, the product itself, any plans or documents created during the project's lifecycle, as well as the activities performed by the team in its daily routine (which result in the actual product). Requirement documents, architecture, configuration management - all can be the subject of an appraisal! This way the team can be confident they are building a correct and complete product, and it assures the customer or the relevant stakeholders that their business needs are addressed and well implemented. The work processes themselves can benefit from such objective evaluations too!
While reviews and tests are well-known activities, audits are a bit different: they must be performed by a trained, independent auditor from outside the project team (e.g. a peer or senior software architect for the project's software architecture; an external consultant if you want to evaluate your work against an industry standard, such as an ISO standard, and so on). Audits are to be carefully planned, and their scope and objectives should be known and agreed upfront. The objectives can be internally defined (either following an existing model - such as CMMI - or not) or come from a set of industry standards. The audit findings must be formalized in a report at the end of the process and acknowledged by the relevant stakeholders. The findings - inconsistencies, non-compliances (with respect to the agreed standard or set of rules) or good practices - can be transformed into action points for the team (each with an owner and a deadline assigned) and re-evaluated in a follow-up session if necessary.
Fixing activities include all the effort needed to repair the defects found in the product, before or after delivery. In our industry this is better known as "bug fixing", but it should NOT be limited to that! The way of working can be improved if necessary; the team structure may need to be changed, and so on. Of course, any change should be preceded by a thorough analysis - you don't want the cost of the change to exceed your budget, or to find out it was not as useful as expected or, even worse, not necessary at all.
There are three "actors" within a team that coexist, and each one influences the dynamics of the body that is the team: people (their availability, skills and knowledge), technology (which supports and enables people to do their work efficiently) and work processes (how people use their knowledge and the available technology in their day-to-day activities). At any given moment in a project's lifecycle there should be a fine balance between these three "ingredients". Of course, as with any resource, they come with a cost that should be taken into consideration - at least to make sure the expenses do not exceed the estimated gain from the project or the product being built.
When it comes to quality, its "cost" (the CoQ) can be summarized by this formula:
CoQ = CoP + CoN + CoA
Where:
CoP is the cost of prevention resources and activities: hardware or software upgrades, training sessions, planning and analysis activities etc.;
CoN is the cost of non-conformities: the effort needed to fix defects, re-test and rework / refactor your product; the effort needed to deliver the fixes to your customer; perhaps a 24/7 support team is needed in case your product is critical for the customer's business, and so on;
CoA is the cost of appraisal activities: the reviews, tests and audits performed to find defects and non-conformities before they reach the customer.
A successful quality assurance plan will aim to keep the CoQ within an acceptable range of the overall project budget, by keeping an optimal balance between CoP and CoA so that CoN stays as low as possible.
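The trade-off is easy to illustrate with the formula itself. The numbers below are entirely made up (think person-days per quarter); they only show how under-investing in prevention and appraisal tends to inflate the non-conformity term:

```python
def cost_of_quality(cop: float, con: float, coa: float) -> float:
    """CoQ = CoP + CoN + CoA (all values in the same unit, e.g. person-days)."""
    return cop + con + coa

# Hypothetical scenario 1: little prevention or appraisal, so defects slip
# through and CoN dominates.
low_investment = cost_of_quality(cop=5, con=60, coa=10)   # CoQ = 75

# Hypothetical scenario 2: a balanced CoP/CoA catches problems early,
# keeping CoN (and often the total CoQ) lower.
balanced = cost_of_quality(cop=15, con=20, coa=20)        # CoQ = 55

print(low_investment, balanced)
```

The point is not the specific figures but the shape of the trade-off: CoP and CoA are spent deliberately, while CoN is paid whether you planned for it or not.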
The cost of quality assurance activities can be supported internally by your organization or shared with the customer - assuming you've done your homework and the sales department can convince the client it is in their best interest. Having a proven track record of successful QA activities can help the sales department win a bid (i.e. investing time or money in appraisal activities can lead to a more reliable product and a lower total project cost than without the appraisal). More on measurement and monitoring activities will be detailed later in this article.
Some of the activities described above are "one-off" or time-bound, some are to be continuously performed during the project's lifecycle.
One-off activities include contract negotiation, architecture design or initial (high-level) planning and estimation. From the quality assurance perspective, defining project quality objectives, defining measurement procedures and their collecting frequency, and identifying the (technical) skills needed to complete the work versus what's available (in order to plan some training, if necessary) are examples of one-off activities. Once such documents are available and approved by the relevant stakeholders, they are unlikely to be changed. Or, to be more accurate, they are not changed very often. Yet they should be reviewed! The frequency and the trigger of a review session should be clearly described: a contract can be reviewed / renewed yearly or in case of force majeure. The software architecture or the important milestones described in the high-level planning can be updated as well, and this must trigger at least a review (if not an update) of all managerial plans, making sure they are still aligned with the business or project objectives.
Review, verification, validation and monitoring activities should be performed continuously, on all work products, be they requirements, software packages, user manuals or working processes. Weekly or per sprint (in Scrum teams), it does not make any difference, but they should always be in the focus of the entire team.
Audit sessions should be described as well - not the exact date and time, but the frequency (yearly, quarterly, monthly, once per major milestone...) and the set of rules and procedures that apply.
Once the quality objectives are defined and the draft plans are available, the team and upper management (or any other relevant stakeholders) commit to them and they become goals for the project team. It is the responsibility of the project manager, or of another team member specifically assigned, to make sure they are reached within the defined time frame.
To properly monitor the activities - and make sure the team stays on the agreed track - a set of measurements is to be defined, specific to each quality objective, together with the frequency of collecting and analyzing the measurement data. The plethora of monitoring and reporting tools available is outstanding: starting with Jira or TFS for time and issue management and going all the way to the good old Excel worksheet for consolidation and graphs.
Fig. 1 - An extended Jira workflow scheme
The issue type schema and the workflow - the statuses and possible transitions of an issue - in (e.g.) Jira have to be adjusted to reflect the quality objectives defined and agreed upon. You will then be able to quickly identify, count and filter the quality-related activities and their associated costs! Encourage logging all estimations upfront and monitor the work in progress throughout the project lifecycle to identify deviations and trends, and use this to improve your predictability. Make sure that issues are logged with the correct issue type so that the source of any issue can be traced: was it found during review, during testing or, in the worst-case scenario, by the client?

The number of non-conformities (or quality issues) and the time spent analyzing and fixing them can be monitored apart from the "normal" project activities, and this can tell a lot about the overall project quality. This information can be displayed in a nice graph so that trends can be easily observed. Based on these trends, observed good practices can be used by the team to set future objectives and can also be promoted to other projects or to the organizational level, while practices that cause an increased number of non-conformities can be addressed in a timely manner and their usage stopped before they cause real issues.
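The consolidation step described above can be sketched in a few lines. The records below stand in for an export from a tracker such as Jira; the field names (`type`, `found_in`, `hours`) are assumptions for illustration, not an actual Jira schema:

```python
from collections import Counter

# Hypothetical issue records, as they might be exported from a tracker.
issues = [
    {"type": "Bug",  "found_in": "review", "hours": 2},
    {"type": "Bug",  "found_in": "test",   "hours": 5},
    {"type": "Bug",  "found_in": "client", "hours": 12},
    {"type": "Task", "found_in": None,     "hours": 40},
]

# Where were the non-conformities caught? The later the source, the costlier.
sources = Counter(i["found_in"] for i in issues if i["type"] == "Bug")

# Share of total logged effort spent on quality issues vs. "normal" work.
bug_hours = sum(i["hours"] for i in issues if i["type"] == "Bug")
total_hours = sum(i["hours"] for i in issues)

print(dict(sources))                            # {'review': 1, 'test': 1, 'client': 1}
print(round(bug_hours / total_hours * 100, 1))  # 32.2
```

Note how the one bug caught by the client costs more than the review and test findings combined - exactly the kind of trend worth plotting per sprint.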
Fig. 2 - Different graphical representation of measurements data
Furthermore, every document and activity described in this article can be reused in a new project, and any good practice identified can become the norm or an internal standard used by anyone across the organization. The measurements collected will make a strong argument in favor of any observed good practice when it comes to implementing it in another project, or they'll make a good case study to present to a potential client. The reverse is also true: measurements will make it easier to convince any person or team to get rid of bad habits if the outcome is not as good as expected! Just keep in mind that the measurement data collected is to be used as a tool or baseline to improve the way of working of the entire team, and not as a tool to punish individuals!
The retrospective is the moment of reflection, when one looks back and evaluates the journey so far. It must be planned at regular intervals or after reaching a major milestone; waiting for the project's end is not recommended, as there is little to be done at that point in case of a "malfunction". Lessons learned since the previous retrospective must be captured and can be used to improve the quality management plan in all its aspects: quality objectives; resources; measurement collecting procedures and thresholds; activities and when/how they are performed.
The quality of any work product is of paramount importance nowadays. The ultimate goal of a software development project must be a quality product, not just finishing the agreed functionality within the agreed time frame and budget. In order to keep everyone focused on product quality, a team must clearly identify the quality objectives and, of course, plan all the related activities. It is an investment that, when completely and correctly done, returns great value, in terms of both team and customer satisfaction, by reducing the number of possible defects and thus the effort to fix an already completed / shipped product. It should go without saying that a good plan available upfront will save a lot of effort later in the project lifecycle - the team will be proactive, acting towards a goal, not reacting to stimuli, sometimes without having the luxury to properly analyze all the input and the possible consequences.