During QAI’s Quest conference last week, we participated in a workshop aimed at helping managers improve their test planning process. The workshop noted that there are several important areas of test planning, including:
For starters, the workshop facilitator discussed the importance of test planning, even in lean/agile development shops. Although test planning is imperfect because it involves assumptions and estimates in areas where too many variables lie outside our control, the act of planning provides teams with insight into their needs, constraints, and capabilities, thus defining a road map of activities we can manage to.
One of the early points made was that test teams tend to develop the objectives of a test effort without truly considering or understanding the needs of the business. That is, we tend to focus on what we think we can/should test and don’t account for how well those items might align with the business’ objectives. For example, some common test objectives might include the following:
Honestly, there is nothing wrong with any of these test objectives, but the question is, are they important to the business goals? Or, by focusing on these objectives, do we miss some level of testing that the business would find extremely helpful?
A lack of alignment between test objectives and business needs tends to occur because test managers aren’t invited to, or don’t attend, early discussions with the business or its representatives, so they must guess what is important, often falling back on “default” or boilerplate objectives. But wouldn’t it be better to know what the business wants out of the testing and ensure that our test effort emphasizes those areas? Of course it would. That said, we are the test experts, so our opinions count; still, it’s better to align our testing focus with the business needs, taking into account, or at least discussing with the business, the items we think should be included in the testing to help them meet their goals.
When planning a test effort, the topic of risks and contingencies always comes up. Sometimes we answer the call to identify risks by listing boilerplate issues we’ve encountered, or have read about, rather than truly considering which risks are likely to impact the project and, more specifically, which are likely to negatively impact our test effort. For example, some common risks you find in test plans include:
When identifying risks and determining contingencies for inclusion in a test plan, it’s important to go beyond the standard issues and evaluate the ones likely to occur. To do this, it may be helpful to view risks in testing as falling into one or more of the following categories:
Technical risks in testing relate to the technologies used, the experience of the testing resources (and other project resources) with those technologies, and even the testers’ level of business knowledge. Essentially, they are risks derived from a deficiency in any technical area of the project that could influence the test effort.
Environmental risks are common in test efforts where QA environments are limited. For example, there may be no separate QA environment, forcing testers to share an environment with developers. Or the environment may have insufficient sample data to execute the necessary test scenarios. Another environmental risk arises when test automation or other projects use the same environment, which can pose data integrity issues for every test effort being executed, since data modified by one set of tests may have been intended for use in another project’s tests.
Schedule risks arise when test teams don’t have sufficient input on the overall delivery schedule and, as a result, testing tends to become squeezed as other development activities go over their allotted time; this occurs in both waterfall and iterative methodologies.
As risks are identified, regardless of which category they fall into, it is helpful to develop contingencies for each. However, any contingency plan developed for a particular risk needs to align with the probability that the risk will actually occur. In other words, it isn’t necessary to develop an elaborate, extensive, or expensive contingency for every risk, especially if the risk has a low chance of occurring. This is extremely important, since our goal in identifying risks and contingencies as part of the test planning process isn’t merely to make ourselves and others aware of possible concerns about the project, but to inform our team, other members of the project team, and management of our plan of action in case a risk becomes reality.
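One lightweight way to make that alignment concrete is to score each risk by likelihood and impact and let the product of the two suggest how much contingency effort is warranted. The following is a minimal sketch, not something prescribed in the workshop; the 1-to-5 scales, thresholds, and example risks are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    category: str      # e.g., "technical", "environmental", "schedule"
    likelihood: int    # 1 = rare ... 5 = almost certain
    impact: int        # 1 = negligible ... 5 = severe

    @property
    def exposure(self) -> int:
        # Simple exposure score: higher means more contingency effort is justified.
        return self.likelihood * self.impact

def contingency_effort(risk: Risk) -> str:
    """Map a risk's exposure score to a rough level of contingency planning effort."""
    if risk.exposure >= 15:
        return "full contingency plan, budgeted and assigned an owner"
    if risk.exposure >= 8:
        return "documented workaround with an identified owner"
    return "note the risk and revisit if its likelihood changes"

# Hypothetical risks for illustration only.
risks = [
    Risk("Shared QA environment corrupts test data", "environmental", 2, 3),
    Risk("No tooling to validate re-architected performance", "technical", 4, 5),
    Risk("Test window squeezed by late development", "schedule", 4, 4),
]

for r in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"exposure={r.exposure:>2}  {r.description}: {contingency_effort(r)}")
```

Sorting by exposure also gives a rough first cut at the prioritized list discussed later in this post.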
So, for example, we may believe there is a slight chance that our test data will be negatively impacted by other tests executed at the same time. Since this is a small risk, the contingency for it should likewise be limited. In this case, we may identify a “quick and dirty” method whereby data can be assigned to each project thereby reducing the risk of interference. This type of contingency would be preferable to one that calls for the purchase, installation, and setup of an entirely separate QA environment for every project.
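As one possible shape for that “quick and dirty” data assignment, here is a minimal sketch in which each project stamps its test records with its own prefix and a run identifier so concurrent test efforts don’t interfere with each other’s data; the helper names and record layout are assumptions for illustration, not anything from the workshop:

```python
import uuid

def tag_test_record(project_code: str, record: dict) -> dict:
    """Return a copy of a test record namespaced to a single project.

    A short run-scoped suffix also keeps repeated runs of the same project apart.
    """
    run_id = uuid.uuid4().hex[:8]
    tagged = dict(record)
    tagged["id"] = f"{project_code}-{run_id}-{record['id']}"
    tagged["owner_project"] = project_code
    return tagged

def belongs_to(project_code: str, record: dict) -> bool:
    """Used during setup/teardown so a project only touches its own records."""
    return record.get("owner_project") == project_code

# Two projects can seed the same logical customer without colliding.
base_customer = {"id": "CUST-001", "name": "Sample Customer"}
print(tag_test_record("PRJ_A", base_customer))
print(tag_test_record("PRJ_B", base_customer))
```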
On the other hand, if our customers have complained about the performance of our application, prompting the development team to re-architect it to boost performance, but we have neither the tools nor the capabilities to validate those changes, then this is a huge risk with a significant likelihood of occurring. In this case, the contingency should fully address the risk even if the cost of the plan is higher than normal.
In the end, we need to ensure that we’ve identified not only risks particular to this project (not just boilerplate or “standard” risks), but also suitable contingencies for each of those risks. It will also be helpful to develop a prioritized list of different risk areas so we can ensure we address the highest priority areas early in our testing cycles. We will discuss prioritization techniques in a future blog.
A couple of points were made about identifying business objectives: make every effort to meet with the business stakeholders early in the project to discuss their long-term and short-term goals, then show them how various test objectives might align with those goals, providing them more value from the test effort.
For example, in speaking to the business stakeholders you discover their primary reason for moving forward with this project: the company has taken a hit on customer satisfaction surveys, with the two primary complaints being the look and feel of the user interface and the overall response times customers experience when interacting with the system. As a result, you might be inclined to include usability testing and performance testing among your test objectives for this effort, alongside the general objectives of finding defects, ensuring a “high-quality” product, and so on. Now the objectives of the test effort support the needs of the business, and we can focus on what’s important.
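If a performance objective like this were adopted, even a small automated check could make it measurable early on. The sketch below is illustrative only: the endpoint URL, the 2-second target, and the sample size are assumptions, and a real effort would use a proper load-testing tool with targets agreed with the business:

```python
import statistics
import time

import requests  # any HTTP client would do; assumed available

# Hypothetical endpoint and target; real numbers should come from the business.
URL = "https://example.com/api/customer/search?q=smith"
TARGET_SECONDS = 2.0
SAMPLES = 10

def measure_response_times(url: str, samples: int) -> list[float]:
    """Time a simple GET request a number of times."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        timings.append(time.perf_counter() - start)
    return timings

timings = measure_response_times(URL, SAMPLES)
p95 = statistics.quantiles(timings, n=20)[-1]  # rough 95th percentile
print(f"median={statistics.median(timings):.2f}s  p95={p95:.2f}s")
assert p95 <= TARGET_SECONDS, "response-time objective not met"
```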
By doing this, we also have an opportunity to evaluate our ability to deliver the identified test objectives. For example, maybe in the past we didn’t include objectives related to usability or performance testing because we didn’t have a resource who specializes in, or is capable of, developing and executing those types of tests. Now we have a picture not only of our deficiencies, but also the realization that those deficiencies may impact the success of the project, from the business’ perspective, if they are not addressed. By documenting these objectives and the subsequent risks, with contingencies identified, we have an improved chance of meeting the needs of the business and providing real value from our test effort. And isn’t that the reason we’re all here in the first place?
Up next:
Poor reports often lead to disastrous scenarios, such as loss of revenue and trust between testers, developers, and even stakeholders. Watch this free webinar for useful tips on how to better communicate risk with developers and stakeholders to build and maintain trust.