Testing software with poor requirements is clearly undesirable. It is often the sign of an immature software development process, especially with small start-ups. If it is an ongoing situation, it represents a serious risk to the organization’s viability. Testers, however, are in a position to improve this situation by applying best practices, working to understand what the requirements lack, and seeing that test cases and documentation fill in the gaps.
When Requirements Go Bad
If there is still time for review of the requirements, use the following criteria to analyze possible defects:
- Missing requirements are the hardest to nail down because there is nothing already on paper with which to work. Obvious functionality is rarely omitted, but missing error conditions, unspecified data types, and the absence of any allowance for extensibility are typical issues that get overlooked.
- Incomplete or ambiguous requirements are a common occurrence. Most of these happen for two reasons:
- Expressing precise requirements in any natural language is difficult at best.
- The author makes unshared assumptions about the details behind the requirement.
- Conflicting requirements often occur because the requirements were written by multiple authors who failed to fully communicate with one another. For example, this situation can lead to related software components reacting to a single user action, such as updating a record, in ways that produce inconsistent results.
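As a sketch of that last scenario, suppose two teams each implement the same vaguely worded requirement ("normalize the username when a record is updated") for different entry points. Everything below, including the two update functions and the normalization rules, is hypothetical, invented only to show how a simple cross-component consistency check can expose the conflict:

```python
# Hypothetical illustration: two components interpret the same vague
# requirement differently, so one user action (updating a record)
# produces inconsistent results depending on the entry point used.

def update_via_web_form(record: dict, username: str) -> dict:
    # Web team's reading of the requirement: trim whitespace AND lowercase.
    record = dict(record)
    record["username"] = username.strip().lower()
    return record

def update_via_import_job(record: dict, username: str) -> dict:
    # Batch team's reading of the requirement: trim whitespace only.
    record = dict(record)
    record["username"] = username.strip()
    return record

def check_consistency(username: str) -> bool:
    """A cross-component check a tester might add: do both paths agree?"""
    a = update_via_web_form({}, username)
    b = update_via_import_job({}, username)
    return a["username"] == b["username"]

print(check_consistency("alice"))      # True: already-normalized input hides the conflict
print(check_consistency("  Alice  "))  # False: mixed-case input exposes it
```

A check like this makes the conflict concrete and testable, which is far easier to escalate to the requirement authors than a vague suspicion that two components "behave differently."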
Doubling Down on Testing Best Practices
Naturally, you should always apply appropriate best practices to any testing task, but in the context of ill-formed requirements, key tasks are even more important:
- Shift testing leftward in the development process as much as possible. Doing so places you closer to the people who wrote the requirements, or who at least understand the unwritten ones better than you do. This is especially true if the project is adding functionality to an existing product, where the domain knowledge is fresh and based on experience.
- Especially if there is an existing version of the system being built, lean heavily on exploratory testing to understand how the system works and to assist you in developing a test strategy. This strategy should include both what to test and what not to test. Document your effort thoroughly and share the findings often with other staff.
- Write detailed test cases and an overall test plan. Done well, these documents can enhance the actual requirements, if any, by specifying expected functionality and behavior. Wire-framing is an excellent way both to collect data from others on expected behavior and to document how things should work. Wire-frames can elicit many “Aha!” moments from architects, designers, developers and other stakeholders.
- Seek out domain experts within the company, or outside it if you must, who can shed light on the performance and functional requirements that the market would expect of your particular application or system under test. Your business analysts may already have a good handle on these expectations but have yet to document them thoroughly.
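One minimal sketch of how a detailed test case can stand in for a missing requirement follows. The function under test and its validation rules are entirely hypothetical; the point is that each test's docstring records the assumption it encodes, so architects and analysts can confirm or correct it during review:

```python
# Sketch: test cases as requirement documentation. parse_quantity and its
# 1-999 range are invented for illustration, not taken from any real system.
import unittest

def parse_quantity(text: str) -> int:
    """Hypothetical system-under-test helper: parse an order quantity."""
    value = int(text.strip())
    if value < 1 or value > 999:
        raise ValueError("quantity out of range")
    return value

class TestParseQuantity(unittest.TestCase):
    def test_accepts_plain_integer(self):
        """Assumption: quantities are whole numbers between 1 and 999."""
        self.assertEqual(parse_quantity("12"), 12)

    def test_rejects_zero(self):
        """Assumption (unwritten anywhere): zero-quantity orders are invalid."""
        with self.assertRaises(ValueError):
            parse_quantity("0")

    def test_tolerates_surrounding_whitespace(self):
        """Assumption (unwritten anywhere): surrounding spaces are trimmed."""
        self.assertEqual(parse_quantity(" 7 "), 7)

# Run the suite programmatically so the example is self-contained.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestParseQuantity)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

When a stakeholder disputes one of these docstrings, that is exactly the “Aha!” moment you want: the disagreement surfaces an unwritten requirement while it is still cheap to fix.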
Staying in the Loop
As early as practical, proactively insert yourself into the face-to-face, e-mail and other electronic discussions related to the product’s development. This is probably already happening for you if the team is using an agile methodology.
Even in that situation, however, you should not hesitate to ask the “dumb questions” of others in the scrum: these often surface unstated assumptions, revealing additional details or flaws that you, as a tester, need to know about. Be especially alert for what appear to be conflicting or ambiguous requirements on application behavior.
Avoid a Double Whammy
Organizations that are producing poor requirements often exhibit other bad software development behaviors. Perhaps the most important of these is failing to implement competent change management.
If your organization’s change management practices or CM software are inadequate, take up the banner to implement a more useful system. Once you have change management in place, it will be far easier to grasp the project requirements as the development progresses and to tie specific test documents and scripts to specific revisions. Without good CM, your testing effort may insert even more ambiguity and conflict into the understanding of the project requirements.
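As a small illustration of the traceability that change management buys you, the sketch below records which product revision a test run exercised. The function and record format are hypothetical; in practice the revision identifier would come from your version-control system (for example, the output of `git rev-parse HEAD`), but here it is passed in explicitly to keep the example self-contained:

```python
# Hypothetical sketch: tagging each test run with the product revision it
# exercised, so results can later be traced back through change management.
import json
from datetime import datetime, timezone

def record_test_run(revision: str, suite: str, passed: bool) -> str:
    """Return a JSON record linking one test run to a specific revision."""
    return json.dumps({
        "revision": revision,       # e.g. a VCS commit hash or release tag
        "suite": suite,             # which test document/script was executed
        "passed": passed,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    })

entry = record_test_run("a1b2c3d", "login-smoke", True)
print(entry)
```

Even a lightweight convention like this lets you answer “which build did that failure occur against?” without guesswork, which is precisely the ambiguity the paragraph above warns about.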
Working within a project or organization that spends too little effort developing sound software requirements can be stressful, but it need not cause you to throw in the towel. While it is a situation best rectified earlier in the development process, making improvements at the testing phase is possible and can be quite effective. View it as an opportunity to exercise your analytical and problem-solving skills as well as a chance to finesse your communication talents.