After defining and organizing requirements, managers need to track compliance. This post describes a method used in engineering, but the method helps with any kind of project. The Verification Cross-reference Matrix (VCRM) addresses, or puts you on the path to addressing, many needs.
- Requirements documented through surveys, interviews, analyses, contracts, Statements of Work, standards, or regulations often come mixed with preferences and background information. Before work begins, you need to isolate the essential information so that customers and the planners or designers can approve a mutual understanding.
- To determine or reflect the steps of the service or the components of a product (for example, in a Work Breakdown Structure), you need to organize requirements.
- Before starting work, you need the team's agreement that the requirements make sense. For example, are they SMART -- Specific, Measurable, Achievable, Realistic, and Timely? (There are many variations on SMART, and it is only the beginning of the types of tests one can apply to requirements. For example, is the requirement written in active voice? Does each sentence have exactly one requirement? Is each requirement verifiable?)
- The planners or designers need the requirements in a format that allows them to check off each requirement as they put it into effect in the plan, solution architecture, or detailed design.
- Risk analysts need a list of requirements for early what-if analysis.
- Test engineers need to identify parameters that will require measurement. They will need to add requirements to build measuring points and measuring devices into the system, or they will need to obtain test or monitoring equipment.
- Verification requires planning the conduct of testing so that it occurs in a logical order and in coordination with other processes such as project phases, start-up, or burn-in.
- Validation and customer acceptance require relating the results of verification back to the original requirements as evidence of compliance with the terms of the contract and fulfillment of the customer's needs.
Once general requirements are decomposed into details, you also need to:
- identify every detail that may be affected when a more general requirement changes
- prevent creation of details that the general requirements do not authorize
- ensure that the details fulfill every general requirement
- re-use portions of the product's or service's organization in future projects for cost and schedule estimating and for planning or design.
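The decomposition checks above can be sketched as simple queries over a parent-child trace. This is a minimal illustration, not any particular tool's data model; the requirement IDs are hypothetical:

```python
# Minimal sketch of requirement decomposition traces.
# Requirement IDs and links are hypothetical examples.
parents = {
    # detail requirement -> general requirement it derives from
    "SYS-10.1": "SYS-10",
    "SYS-10.2": "SYS-10",
    "SYS-20.1": "SYS-20",
}

def affected_details(general_id):
    """Details to revisit when a general requirement changes."""
    return sorted(d for d, g in parents.items() if g == general_id)

def unauthorized_details(approved_generals):
    """Details whose parent is not an approved general requirement."""
    return sorted(d for d, g in parents.items() if g not in approved_generals)

def unfulfilled_generals(approved_generals):
    """General requirements with no detail implementing them."""
    return sorted(set(approved_generals) - set(parents.values()))

print(affected_details("SYS-10"))                   # details tied to SYS-10
print(unauthorized_details({"SYS-10"}))             # SYS-20.1 lacks an approved parent
print(unfulfilled_generals({"SYS-10", "SYS-30"}))   # SYS-30 has no details
```

A real requirements database performs the same lookups across many layers and keeps the history of each link.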
The VCRM
A VCRM lists the requirements of a specification and identifies the method(s) for verifying them. The details included in a VCRM vary with the customer, the phase of the contract, the nature of the contract, and the relationships within the contract.
Simple VCRMs, such as the one used by NASA in figure 1, list requirements and the appropriate quality control methods for each. This would more properly be called a "Requirements Verification Matrix." Detailed VCRMs may include considerably more, such as test ownership, verification requirements, and verification results.
|Figure 1. Simple VCRM from a NASA specification.|
The standard verification methods are Inspection, Analysis, Demonstration, and Test:
- Inspection includes qualitative observation. "The fruit basket shall include three Granny Smith apples." (You don't need to run DNA tests to verify that.)
- Analysis includes computation or comparison to historical or experimental data.
- Computation: One might have to calculate the power of munitions, since using them destroys them.
- Comparison/Similarity: Since Project A used Widgets, Project B can re-use the historical data to show that the Widgets meet its requirements.
- Modeling and simulation: We have all seen photos and videos of smoke streaming over the wings of model airplanes or cars in wind tunnels. Did you know that scientists can run computerized simulations of galaxies colliding? They get the results a lot quicker than when they wait around billions of years to see what happens.
- Demonstration verifies performance of a function that does not require quantitative measurement.
- Test verifies that a function executes within specified parameters.
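A simple VCRM like the one in figure 1 amounts to a table mapping each requirement to one or more of these methods. A hypothetical sketch (the requirement IDs and text are invented for illustration):

```python
# Hypothetical mini-VCRM: requirement ID -> text and verification method(s).
# I = Inspection, A = Analysis, D = Demonstration, T = Test.
vcrm = {
    "FRT-001": {"text": "The basket shall include three Granny Smith apples.",
                "methods": {"I"}},
    "PWR-010": {"text": "Output power shall exceed 50 W.",
                "methods": {"T"}},
    "STR-022": {"text": "The frame shall survive a 2 m drop.",
                "methods": {"A", "T"}},
}

def requirements_by_method(method):
    """List requirements verified (at least in part) by a given method."""
    return sorted(r for r, row in vcrm.items() if method in row["methods"])

print(requirements_by_method("T"))  # requirements needing formal test
print(requirements_by_method("I"))  # requirements closed out by inspection
```

Grouping requirements by method this way is what lets test engineers plan equipment and measuring points early.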
Variations
Figure 2 shows an example VCRM from the Department of Transportation that includes the complete text of each requirement, added information about the verifications, and the party responsible for each requirement's verification.
|Figure 2. Example VCRM from the DOT.|
The DOT adds Certification of Compliance as a verification method. They don't define it, but it sounds like a type of Inspection where you inspect a certificate rather than inspecting the product. Since the contract has multiple sellers, Certification of Compliance might refer to using certificates provided by an equipment provider. For example, if the Government furnishes its own equipment for the contractor to install, it may already have verified the equipment's compliance. Other examples would include calibration certificates or certifications by independent testing labs.
Figure 3 shows an example of a procurement specification that allocates verifications to contractual phases and cross-references performance requirements to verification requirements; figure 4 shows the verification requirements it points to.
|Figure 3. VCRM used by U.S. Army Corps of Engineers.|
|Figure 4. Verification requirements used by U.S. Army Corps of Engineers.|
In the world of paper requirements documents, VCRMs occupy an appendix or the beginning of the Quality Assurance section; each document contains its own VCRM, and simple tables hold the data. However, Requirements and Test engineering should control VCRMs centrally. Explaining why requires a tangent into requirements control.
Tools
Many projects now use a database such as DOORS to control requirements. A database's storage tends to isolate each sentence from its context, forcing each requirement to be stated explicitly and independently. The repetition is a pain, but it improves design quality by forcing exhaustive identification of requirements and by reducing or exposing ambiguity in the scope.
At one company, a project manager employed a distributed requirements database. He allocated requirements using spreadsheets and distributed them to the various design teams. He reassembled the spreadsheets later.
During the project, some requirements ended up owned by multiple teams while others had no owner at all, especially when one team tried to transfer ownership of a requirement to another. Project management had to play catch-up with ownership changes, and the teams had to take corrective action to address dropped requirements.
The lack of centralized control also resulted in the loss of rationales and lessons learned. A robust requirements database would have retained such information. Even a requirement labeled "not a requirement" would have been retained in previous baselines.
Instead, teams kept records at inconsistent levels of detail and often discarded records after deeming them irrelevant to their own efforts.
A requirements database allows you to trace requirements up and down the layers as they are decomposed. Excel spreadsheets can handle a simple set of requirements; a larger operation needs tools such as DOORS, RTM, or a database developed in house.
Such tools allow one to verify product scope by ensuring that lower-level specifications satisfy each top-level requirement. They help prevent requirements creep by verifying the contractual authority of each bottom-level requirement. This is called traceability.
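Both traceability checks reduce to set operations over the trace links. A minimal sketch with two hypothetical specification layers:

```python
# Hypothetical two-layer specification.
top_level = {"A-1", "A-2", "A-3"}
lower_level = {
    # lower-level requirement -> top-level requirement(s) it traces to
    "B-1": {"A-1"},
    "B-2": {"A-1", "A-2"},
    "B-3": set(),   # traces to nothing: possible requirements creep
}

# Scope check: every top-level requirement must be satisfied below.
uncovered = top_level - set().union(*lower_level.values())

# Creep check: every lower-level requirement must have contractual authority.
creep = {b for b, tops in lower_level.items() if not tops & top_level}

print(sorted(uncovered))  # top-level requirements nobody satisfies
print(sorted(creep))      # lower-level requirements nobody authorized
```

Here A-3 would surface as an unsatisfied top-level requirement and B-3 as unauthorized detail, which is exactly what a traceability report flags.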
One can also take the requirements database to the next level by adding (or linking to it) a Verification layer. The Verification layer identifies the Test plan that verifies each requirement. A robust database could even contain the test documents and results.
Case Study
A friend, Jim Pannunzio, once worked on a contract for a first-of-its-kind product. Because many requirements carried a high risk of being unachievable, the contract provided incentive fees for accomplishing the high-risk requirements.
Because the company had not used the right tools, it had failed to trace some successful tests back to their requirements and so had never billed the customer for them, which cost money. When Jim cross-referenced the verifications to the requirements, he found a number of such requirements. He also found requirements that the company had failed to verify at all, allowing him to recommend further verification.
By identifying untested requirements and untraced successful verifications, Jim’s attention to detail brought hundreds of thousands of dollars of added revenue to his company and turned the project from an economic failure into a success.
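The reconciliation Jim performed can be sketched as two queries against the verification records. The requirement IDs and records below are invented for illustration, not the actual project data:

```python
# Hypothetical reconciliation of incentive requirements vs. test records.
incentive_reqs = {"R-101", "R-102", "R-103", "R-104"}

# Test records: (requirement the test verifies, passed?). A requirement of
# None means the test was never traced back to any requirement.
test_records = [
    ("R-101", True),
    ("R-102", True),
    (None, True),       # successful test never traced to a requirement
    ("R-103", False),
]

verified = {req for req, passed in test_records if req and passed}
unverified = incentive_reqs - {req for req, _ in test_records if req}
untraced_passes = sum(1 for req, passed in test_records if req is None and passed)

print(sorted(verified))     # requirements eligible for incentive billing
print(sorted(unverified))   # requirements still needing verification
print(untraced_passes)      # successful tests to cross-reference before billing
```

The two output lists correspond to the two findings in the case study: verifications never billed, and requirements never verified.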
Conclusion
The tight integration between requirements and verification that a VCRM provides forms a vital link between design and thorough quality control, between contractual provisions and acceptance, and between the buyer's and seller's bank accounts.
For more information see:
Scukanec, Stephen. A Day in the Life of a Verification Requirement. Northrop Grumman. 28 January 2008. http://sstc-online.org/2009/pdfs/SJS2408.pdf
Scukanec, Stephen, and James van Gaasbeek. A Day in the Life of a Verification Requirement. Northrop Grumman. 24 October 2007. http://www.dtic.mil/ndia/2007systems/Wednesday/AM/Track1/5536_1004_1042.pdf
Figure 1: Goddard Space Flight Center, Greenbelt, Maryland. National Polar-Orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP), Science Data Segment Requirements Specification, GSFC 429-05-11-01. April 7, 2005. http://nppwww.gsfc.nasa.gov/PEATE/NPP_SDS_Lev3Req.doc. Downloaded 1 June 2011.
Figure 2: Office of Operations, Federal Highway Administration, Department of Transportation. Testing Programs for Transportation Management Systems: A Technical Handbook. Appendix A, Example Verification Cross Reference Matrix. June 20, 2007. http://ops.fhwa.dot.gov/publications/tptms/handbook/app_a.htm. Downloaded 1 June 2011.
Figure 3: Engineer Research and Development Center, Corps of Engineers, U.S. Army Topographic Engineering Center, Alexandria, VA. Performance Specification: Improved Position and Azimuth Determining System (IPADS), MIL-PRF-52955C. 8 November 2002. https://aais.ria.army.mil/AAIS/Award_web_03/DAAE2003D01500000/DAAE2002R0177/attach_exhib/spec_attachment_2.pdf. Downloaded 1 June 2011.
Figure 4: Baun, Rich. GLAST Large Area Telescope: LAT Pre-Shipment Review: LAT Level Test Verification Process. Gamma-ray Large Area Space Telescope. 27 April 2006. http://www.slac.stanford.edu/exp/glast/flight/rdm/livepreship/04_Test_Verification_Process.pdf. Downloaded 1 June 2011.
Copyright 2011, 2013, Richard Wheeler -- Permission granted for non-profit or personal use.