Welcome to the last section of our software testing course based on a live project.
During this session, we will examine defects and delve further into remaining topics to finalize the Test Execution phase of the Software Testing Life Cycle (STLC).
As you may remember from the prior article, during Test Execution we encountered test cases that did not produce the expected results, and we observed unforeseen behavior during Exploratory Testing.
How do we react to deviations?
Documenting and tracking these deviations is crucial to ensure they are resolved in the application under test (AUT).
#1) Such deviations are commonly referred to as defects, bugs, issues, incidents, errors, or faults.
#2) Defects can include the following cases:
- Missing specifications
- Incorrectly implemented specifications
- Additional specifications
- Inconsistencies in reference documents
- Environment-related problems
- Improvement suggestions
#3) Defect documentation is generally done using Excel spreadsheets or specialized Defect Management software/tools. For details on defect handling using specific tools, visit the following links:
- HP ALM
- Atlassian JIRA
- Also check this post for a list of the most widely used Bug Tracking tools on the market.
Logging Defects Effectively
In this section, we will discuss how to log the defects found in the prior article using an Excel spreadsheet. It is important to use a standard format or template for this.
The Defect Report typically consists of the following columns:
- Defect ID: A unique identifier for tracking the defect.
- Defect Description: A concise title describing the problem.
- Module/Section of the AUT: An optional field indicating the area of the AUT where the defect occurred.
- Steps to Reproduce: The precise sequence of actions required to recreate the defect on the AUT, including any specific input data related to the problem.
- Severity: Quantifies the problem's magnitude and its potential effect on the AUT's operation. Guidelines for assigning this field can be found in the test plan document. For more information, please see Article 3: Writing a Test Plan Document.
- Status: Discussed in depth later in this article.
- Screenshot: A snapshot of the application displaying the error.
Though these are the critical fields to include, the template can be expanded or trimmed as needed. For instance, it could include the name of the tester reporting the problem, or exclude the module name if it is not essential.
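To make the template concrete, the report columns above can be sketched as a simple record in code. This is an illustrative sketch only; the field names and sample values are assumptions, not a standard schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Defect:
    """One row of the defect report, mirroring the columns described above."""
    defect_id: str                          # unique identifier, e.g. "DEF-001"
    description: str                        # concise title of the problem
    module: str                             # area of the AUT (optional in practice)
    steps_to_reproduce: List[str] = field(default_factory=list)
    severity: str = "Minor"                 # e.g. "Critical", "Major", "Minor"
    status: str = "New"                     # updated through the defect life cycle
    screenshot: str = ""                    # path/link to an attached screenshot

# Hypothetical example defect for illustration
bug = Defect(
    defect_id="DEF-001",
    description="Login fails with valid credentials",
    module="Login",
    steps_to_reproduce=["Open login page", "Enter valid credentials", "Click Login"],
    severity="Critical",
)
print(bug.status)  # → New
```

A structured record like this is what a spreadsheet row or a tracking-tool entry ultimately represents, which makes it easy to export, filter, or consolidate later.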
Using the guidelines and template above, a sample Defect Log/Report could look like this:
Sample Defect Report for the OrangeHRM live project:
=> To download the live project Defect Report, click here
Here's a sample Defect Report created using the qTest Test Management tool:
Merely logging defects is not sufficient; we also need to assign them to the appropriate teams for action. The assignment process, including whom to assign defects to and in what order, is typically defined in the test plan document.
Defect Life Cycle:
As shown above, defects go through different stages and involve multiple stakeholders during identification and resolution. To maintain transparency and track the status of each defect, a "Status" field is included in the defect report. The entire process is known as the "Defect Life Cycle." For a detailed explanation of the different statuses and their definitions, kindly check this Bug Life Cycle tutorial.
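As a rough illustration, the defect life cycle can be modeled as a table of allowed status transitions. The states and transitions below are assumptions based on the statuses discussed in this article; real tools such as JIRA or HP ALM let teams configure their own workflows:

```python
# Illustrative defect life cycle as a transition table (not a tool's actual workflow).
TRANSITIONS = {
    "New": {"Assigned", "Rejected", "Duplicate"},
    "Assigned": {"Work in Progress"},
    "Work in Progress": {"Resolved", "Unable to Reproduce", "Deferred"},
    "Resolved": {"Closed", "Reopen"},
    "Reopen": {"Assigned"},
}

def move(current: str, target: str) -> str:
    """Return the new status, or raise if the transition is not allowed."""
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"Cannot move defect from {current!r} to {target!r}")
    return target

# Walk a defect through the happy path: New → Assigned → Work in Progress → Resolved → Closed
status = "New"
for nxt in ("Assigned", "Work in Progress", "Resolved", "Closed"):
    status = move(status, nxt)
print(status)  # → Closed
```

Encoding the workflow as data makes invalid moves (e.g. jumping straight from "New" to "Closed") fail loudly, which is essentially what a defect tracking tool enforces behind the scenes.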
Recommendations for Defect Tracking
- When joining a new Team/Project/AUT, it is preferable to discuss detected issues with colleagues to better understand what constitutes a defect.
- Ensure all the information required to reproduce the problem is included. It does not create a good impression when a defect is returned to the testing team with a status of "Not sufficient information." Check this post on How to Have All Your Issues Resolved Without Any "Invalid Issue" Label.
- Check if the same problem has already been reported before raising a new one. Duplicate problems also negatively affect the QA team.
- If an issue arises randomly and the exact operations or conditions to reproduce it are unknown, still report it. It is better to surface every possible failure, even if the issue ends up tagged as "Irreproducible" or "Not enough information."
- It's common for a QA team to log defects in Excel sheets during the day and consolidate them at the end of the day.
The Full Defect Life Cycle
Let's follow the defect life cycle for Defect 1 in our live project:
- As the tester, when I create the defect, its status indicates “New.” When I allocate it to the QA team lead, the status remains “New,” but now the owner is the QA lead.
- The QA lead reviews the issue, and upon confirming it as a valid one, assigns it to the Dev lead. In this phase, the status becomes “Assigned,” and the owner is the Dev lead.
- The Dev lead then entrusts the defect to a developer who will address and rectify it. The status changes to “Work in Progress” or something similar, and the owner becomes the developer.
- For Defect 1, if the developer fails to reproduce the error, they send it back to the QA team with a status of "Unable to Reproduce."
- On the other hand, if the developer can rectify the problem, the status turns to “Resolved,” and now the QA team deals with it.
- The QA team picks up the issue, retests it, and if it’s resolved, the status is set to “Closed.” If the problem still exists, the status is set to “Reopen,” and the process repeats.
- Depending on the situation, the developer may set the status as “Deferred,” “Insufficient Information,” “Duplicate,” “Working Correctly,” or other.
- During the test execution phase, the QA team members make it a daily practice to record, report, assign, and manage defects.
- Once Cycle 1 is complete, the dev team takes a day or two to compile all the fixes and prepare the next build of the software for the next test cycle.
- This process repeats for Cycle 2, and at the end of this cycle, there may still be open or unaddressed issues in the software.
- Now at this point, we analyze the “Exit Criteria” to conclude whether the testing process should be terminated after Cycle 2 or be continued with another cycle. The Exit Criteria are predetermined in the Test Plan document. They consist of a checklist to help with the decision. Here is an example of a checklist for the OrangeHRM project:
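To illustrate how such a checklist might be evaluated, here is a minimal sketch. The specific criteria and thresholds (a 95% pass rate, zero open critical defects, full execution coverage) are assumed examples, not the actual OrangeHRM exit criteria:

```python
# Hedged sketch: evaluating Exit Criteria as a simple checklist.
# The criteria and thresholds are illustrative assumptions, not real test plan values.
def exit_criteria_met(pass_rate: float, open_critical: int, coverage: float) -> bool:
    """Return True only if every exit criterion in the checklist is satisfied."""
    checks = [
        pass_rate >= 0.95,     # e.g. at least 95% of executed test cases passed
        open_critical == 0,    # no open critical-severity defects remain
        coverage >= 1.0,       # all planned test cases were executed
    ]
    return all(checks)

print(exit_criteria_met(pass_rate=0.97, open_critical=0, coverage=1.0))  # → True
print(exit_criteria_met(pass_rate=0.90, open_critical=2, coverage=1.0))  # → False
```

In practice the QA lead walks through the checklist from the test plan; if any item fails, another test cycle is scheduled instead of closing out testing.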
Test Metrics
During the Test Execution phase, it’s essential to provide updates on the progress of QA activities to all concerned stakeholders. Plain numbers, like the number of test cases passed or executed, might not paint a complete picture of the overall quality.
Hence, we turn to metrics to bridge this gap. Metrics are meaningful numbers collected and maintained by the testing team. For instance, instead of reporting that 150 test cases passed, it is more informative to say that 90% of test cases passed.
Several types of metrics need to be obtained in the test execution phase. The specific metrics required and their maintenance periods are outlined in the test plan document.
Here are some standard test metrics collected:
- Percentage of Test Case Success
- Defect Density
- Percentage of Critical Defects
- Number of Defects by Severity
Refer to the enclosed Status Report to understand how these metrics are utilized.
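As a quick illustration of how these metrics are derived from raw counts, here is a minimal sketch. The formulas are the commonly used definitions; the sample numbers and the KLOC-based density measure are assumptions for demonstration, since teams may also compute density per module or per requirement:

```python
# Hedged sketch of standard test metrics; sample inputs are illustrative.
def pass_percentage(passed: int, executed: int) -> float:
    """Percentage of executed test cases that passed."""
    return 100.0 * passed / executed

def defect_density(defects: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC); teams may use other size units."""
    return defects / size_kloc

def critical_defect_percentage(critical: int, total_defects: int) -> float:
    """Share of all logged defects that are critical severity."""
    return 100.0 * critical / total_defects

print(pass_percentage(135, 150))          # → 90.0
print(defect_density(30, 15.0))           # → 2.0
print(critical_defect_percentage(6, 30))  # → 20.0
```

Reporting "90% passed, 2 defects/KLOC, 20% critical" conveys far more about product quality than the raw counts alone, which is exactly the point made above.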
Test Sign-off/Completion Report
Just as it was important to notify all stakeholders when testing began, it is also the QA team's responsibility to announce when testing is complete and to share the results. Commonly, the QA team lead or QA manager sends an email on behalf of the QA team signing off on the product. The email contains the test results and a list of open or known issues.
Sample Test Sign-off Email:
To: Client, Project Manager, Development Team, Database Team, Business Analyst, QA Team, and Environment Team (and any other relevant recipients)
Email: Dear Team,
The QA team signs off on the OrangeHRM version 3.0 software after successfully completing 2 cycles of functional testing on the website.
The test cases and their execution results are attached in this email or can be located at the specified location. (If using test management software, include relevant details.)
The list of known issues is attached in this email. (Add any extra references if applicable.)
Regards,
QA Team Lead
Attachments: Final Execution Report, Final Issue/Defect Report, Known Issues List
Upon sending the test sign-off email, we have formally completed the STLC process. However, this does not mark the end of the overall "Test" phase of the Software Development Life Cycle (SDLC); User Acceptance Testing (UAT) is yet to be carried out. For more information on UAT, kindly refer to this article.
After the UAT stage is over, the SDLC moves to the deployment stage, during which the application goes live and becomes available to users and customers.
And it’s a wrap!
We hope this series has given you a realistic experience of a QA project. We encourage you to share your comments, questions, or any feedback about this free online software testing QA training series.