19 June 2011

Gaming HR's Job Site Limits

Multi-talented people who qualify for multiple positions within large companies have a new reason to bypass HR and use networking to reach hiring managers. Large companies with online application systems create this situation by limiting the number of cover letter and resume versions applicants can use. This severely limits the number of positions to which one can apply.

For example, I applied for an Industrial Engineer (IE) position with a major Defense contractor. Their system stores only one version of an applicant's resume. When I applied for a Systems Engineer position, the system could only use the resume and cover letter I had crafted for the IE position.
  • Sidebar: That's Systems Engineer, as in multi-disciplined, end-to-end technical managers of projects. IT Systems Engineers and specialists in other disciplines are generally not Systems Engineers, even though many companies give them the title.
This practice allegedly limits the ability of resume-spammers to apply for jobs for which they do not qualify. Counter-productively, it also limits companies' access to high value, multi-talented people. An applicant could apply for multiple jobs under the old mail-in paper filing system, but poorly implemented technology places a governor on the flow of applications.

When HR looks at my application for the Systems Engineering job in two weeks, they'll see the cover letter and resume for the IE job and say I'm unqualified. In fact, they'll ask, "Why's this IE applying for a Systems Engineering position?"

Had I replaced the IE version with my Systems Engineering version, the same thing would have happened in reverse. Worse, when the IE hiring manager went to retrieve the IE versions of my cover letter and resume, they would no longer exist in the database.

Even worse, if I applied for all the positions I have the skills for, HR might consider me a job-spammer and ban me from the whole system.

Using a generic version does not solve the problem. For many reasons, generic resumes do not compete with tailored, targeted versions.

The systems severely limit how often you can apply for jobs, even if you're fully qualified for each one.

I may have found a way around this limitation for some job sites. Some offer a way to upload supporting documents such as certificates, transcripts, copies of licenses, etc. One can upload a file containing a tailored cover letter and resume. My hypothesis: If the primary resume in the database gets past HR's screeners, the hiring manager will see your "supporting documentation."

Game HR's web application system if you must. Better yet, bypass HR through networking.

15 June 2011

Systems Engineers, IT, and Sanitation Engineers

I feel a strong kinship to the civil engineer who specializes in garbage removal and landfills when PC bureaucrats call garbage collectors Sanitation Engineers.

The IT industry labels just about anybody a Systems Engineer. They demean the title by giving it to people who have skills in computer networks, software coding, or server administration, but may have little or no education in engineering, no multidisciplinary background, and no high-level perspective of systems development. I respect the IT folks' knowledge and the hard work they put in to acquire their expertise, but that doesn't make them engineers, let alone systems engineers.

Systems Engineers work both at detail and at higher design levels. They consider business needs, requirements analysis, risk management, cost estimation, project planning, use cases, life cycle analysis, design, integration, and verification and testing. Their broader perspective requires formal, multidisciplined education and extensive experience, so they can bring together business, management, and engineers from different disciplines, communicate with the customer, and deliver a complete and coherent design or service.

Two guilty parties who lead the title inflation include Microsoft, with its Microsoft Certified Systems Engineer certification, and Cisco Systems, with its Cisco Certified Network Engineer certification. MCSEs and CCNEs know their subjects, but passing a test qualifies them as technicians or technologists, not as engineers who devoted four or more years to passing hundreds of tests in dozens of subjects.

Some say the usurpation of the SE title is a non-issue. As a job-hunter, I disagree strongly.

The mis-labeling has wasted hundreds of hours of my life over the past 20 months. I have to access and scan dozens of job descriptions for each systems engineering position I find.

Example:  Sherlock Tech knowingly inflates a Systems Administrator into a Systems Engineer. You're full of you-know-what, Sherlock.

Multiply that by the dozens of systems engineering job descriptions I must read to find one position for which I qualify, and you have a very large, tedious, and discouraging task.

Many disciplines such as hydraulics, controls, mechanical, optical, power, and manufacturing engineering -- not just IT -- usurp the term. In their case, the mis-labeling constitutes a lateral misplacement. The problem seems due not merely to inflation, but also to laziness. This compounds the challenge of job searching not only for SEs, but also for job hunters in those other disciplines.

The Department of Labor and state employment departments have contributed by cataloging some technical occupations by discipline and others by industry. For example, aerospace and IT are not disciplines; they are industries. Computer systems, networks, software, and cross-functional systems engineering are disciplines.

When I was in school, we already had a term and major for students of IT. It was Computer Systems Engineering. (Some schools lagged in separating out the computer majors from Electronic Engineering, or Electronic Engineering from Electrical Engineering, too.) Software Engineering had just started to break out as a separate discipline.

A little specificity would go a long way in the job market.

03 June 2011

The Verification Cross-reference Matrix (VCRM)


After defining and organizing requirements, managers need to track compliance. This post describes a method used in engineering, but the method helps with any kind of project. The Verification Cross-reference Matrix (VCRM) addresses, or puts you on the path to addressing, many needs:
  1. Requirements, as documented through surveys, interviews, analyses, contracts, Statements of Work, standards, or regulations, often mix requirements with preferences and background information. Before you begin work, you need to isolate the binding requirements and confirm a mutual understanding between the customers and the planners or designers.
  2. To determine or reflect the steps of the service or the components of a product (for example, in a Work Breakdown Structure), you need to organize requirements.
  3. Before starting work, you need the team's agreement that the requirements make sense. For example, are they SMART -- Specific, Measurable, Achievable, Realistic, and Timely? (There are many variations on SMART, and it is only the beginning of the types of tests one can apply to requirements. For example, is the requirement written in active voice? Does each sentence have exactly one requirement? Is each requirement verifiable?)
  4. The planners or designers need the requirements in a format that allows them to check off each requirement as they put it into effect in the plan, solution architecture, or detailed design.
  5. Risk analysts need a list of requirements for early what-if analysis.
  6. Test engineers need to identify parameters that will require measurement. They will need to add requirements to build measuring points and measuring devices into the system, or they will need to obtain test or monitoring equipment.
  7. Verification requires planning the conduct of testing so that it occurs in a logical order and in coordination with other processes such as project phases, start-up, or burn-in.
  8. Validation and customer acceptance require relating the results of verification back to the original requirements as evidence of compliance with the terms of the contract and fulfillment of the customer's needs.
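The quality checks described in item 3 above can be partially automated. The sketch below applies a few simple heuristics -- a binding "shall," one requirement per statement, a measurable quantity -- to hypothetical requirement texts; the IDs, texts, and rules are invented for illustration and only scratch the surface of real requirements analysis.

```python
import re

# Hypothetical requirement texts -- not from any real specification.
requirements = {
    "R-001": "The pump shall deliver 40 L/min at 3 bar.",
    "R-002": "The enclosure should be corrosion resistant and easy to open.",
    "R-003": "Alarms shall be logged. Logs shall be archived monthly.",
}

def check_requirement(text):
    """Apply a few simple, heuristic quality checks to one requirement."""
    problems = []
    # A binding requirement conventionally uses "shall", not "should"/"will".
    if "shall" not in text.lower():
        problems.append("no 'shall' -- may be a preference, not a requirement")
    # One requirement per statement: flag multiple sentences or 'and' joins.
    if text.count(".") > 1 or " and " in text.lower():
        problems.append("may contain more than one requirement")
    # Verifiable: look for at least one number to measure against.
    if not re.search(r"\d", text):
        problems.append("no measurable quantity -- hard to verify")
    return problems

for req_id, text in requirements.items():
    for problem in check_requirement(text):
        print(f"{req_id}: {problem}")
```

A real tool would also check for active voice, ambiguity, and consistency across documents, but even crude screens like these catch many weak requirements before they reach the design team.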
This post provides a summary of traditional methods that support the above needs. Each topic above could lead to another post, or even books. When exploring this area, one should also consider a related topic, requirements traceability. Traceability creates threads that link detailed requirements back to the general requirements. This allows management to
  • identify every detail that may be affected when a more general requirement changes
  • prevent creation of details that the general requirements do not authorize
  • ensure that the details fulfill every general requirement
  • re-use portions of the product's or service's organization in future projects for cost and schedule estimating and for planning or design.


A VCRM lists the requirements of a specification and identifies the method(s) for verifying them. The details included in a VCRM vary with the customer, the phase of the contract, the nature of the contract, and the relationships within the contract.

Simple VCRMs, such as the one used by NASA in Figure 1, list requirements and the appropriate quality control methods for each requirement. Such a table would more properly be called a “Requirements Verification Matrix.” Detailed VCRMs may include considerably more, such as test ownership, verification requirements, and verification results.
Figure 1. Simple VCRM from a NASA specification. (Click for larger version.)
The QC methods normally include Inspection, Analysis, Demonstration, and Test (IADT).
  • Inspection includes qualitative observation. "The fruit basket shall include three Granny Smith apples." (You don't need to run DNA tests to verify that.)
  • Analysis includes computation or comparison to historical or experimental data.
    • Computation: One might have to calculate the power of munitions, since using them destroys them.
    • Comparison/Similarity: Since Project A used Widgets, Project B can re-use the historical data to show that the Widgets meet its requirements.
    • Modeling and simulation: We have all seen photos and videos of smoke streaming over the wings of model airplanes or cars in wind tunnels. Did you know that scientists can run computerized simulations of galaxies colliding? They get the results a lot quicker than when they wait around billions of years to see what happens.
  • Demonstration verifies performance of a function that does not require qualitative measurement.
  • Test verifies that a function executes within specified parameters.
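At its simplest, a VCRM is a table mapping each requirement to one or more of the IADT methods above. The sketch below represents such a table as a list of rows; all requirement IDs and texts are invented for illustration.

```python
# A minimal VCRM sketch: each row maps a hypothetical requirement
# to one or more IADT verification methods.
vcrm = [
    {"id": "R-001", "text": "The basket shall hold three apples.",
     "methods": ["Inspection"]},
    {"id": "R-002", "text": "The charge shall yield 5 kJ.",
     "methods": ["Analysis"]},
    {"id": "R-003", "text": "The pump shall start within 2 s.",
     "methods": ["Demonstration", "Test"]},
]

def requirements_by_method(vcrm, method):
    """Return the IDs of all requirements verified by the given method."""
    return [row["id"] for row in vcrm if method in row["methods"]]

print(requirements_by_method(vcrm, "Test"))  # prints ['R-003']
```

A detailed VCRM would add columns for test ownership, verification requirements, and results, as the DOT and Army Corps examples below show.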


Figure 2 shows an example VCRM from the Department of Transportation that includes the complete text of each requirement, added information about the verifications, and the party responsible for each requirement's verification.
Figure 2. Example VCRM from the DOT. (Click for larger version.)
The DOT calls what NASA used (Figure 1) a Requirements Verification Matrix. The simpler table does not really cross-reference the requirements to anything else. The DOT adds part of the Test Plan (columns 4-6) before calling it a VCRM.

The DOT adds Certification of Compliance as a test method. They don’t define it, but it sounds like a type of Inspection where you inspect a certificate rather than inspecting the product. Since the contract has multiple sellers, Certification of Compliance might refer to using certificates provided by an equipment provider. For example, if the Government furnishes its own equipment for the contractor to install, it may already have verified the equipment’s compliance. Other examples would include calibration certificates or certifications by independent testing labs.

Figure 3 shows an example of a procurement specification that allocates verifications to contractual phases and cross-references performance requirements to verification requirements. Figure 3A shows the table and figure 3B shows verification requirements.
Figure 3A. VCRM used by U.S. Army Corps of Engineers. (Click for larger version.)
Figure 3B. Verification requirements used by U.S. Army Corps of Engineers. (Click for larger version.)

Figure 4 includes a “VCRM” that documents not only the verification plan, but also the data that determines compliance. This goes beyond what the term “VCRM” implies. It should, perhaps, be called a Verification Report.
Figure 4. Verification data from the GLAST Large Area Telescope pre-shipment review. (Click for larger version.)
Forcing design and requirements engineers to identify verification methods protects both Buyer and Seller by ensuring that the requirements are verifiable. By definition, one cannot verify an unverifiable requirement. Without objective verification criteria, a Seller can falsely claim to have fulfilled the contract or a Buyer can claim the Seller did not fulfill the contract. Including VCRMs in specifications allows general agreement about requirements verification methods and prevents problems during product acceptance and at contract closure.

In the world of paper requirements documents, VCRMs occupy an appendix or the beginning of the Quality Assurance section. Each document contains its own VCRM. Simple tables contain the data. However, Requirements and Test engineering should control VCRMs centrally. The reason requires taking a tangent into requirements control.


Many projects now use a database such as DOORS to control requirements. Databases' storage tends to isolate a sentence from its context, forcing statement of each requirement explicitly and independently. The repetition is a pain, but it ensures design quality by forcing exhaustive identification of requirements and by reducing or identifying ambiguity in the scope.

At one company, a project manager employed a distributed requirements database. He allocated requirements using spreadsheets and distributed them to the various design teams. He reassembled the spreadsheets later.

During the project, multiple teams took ownership of some requirements, while nobody took ownership of others, especially when one team would try to transfer ownership of a requirement to another team. Project management had to play catch-up with ownership changes and the teams had to take corrective actions to address dropped requirements.

The lack of centralized control also resulted in the loss of rationales and lessons learned. A robust requirements database would have retained such information. Even a requirement labeled "not a requirement" would have been retained in previous baselines.

Instead, teams kept records at inconsistent levels of detail and often discarded records after deeming them irrelevant to their own efforts.

A requirements database allows you to trace requirements up and down the layers as they are decomposed. One can use Excel spreadsheets for a simple set of requirements. A larger operation needs tools such as DOORS, RTM, or a database developed in house.

Such tools allow one to verify product scope by ensuring that lower-level specifications satisfy each top-level requirement. They help prevent requirements creep by verifying the contractual authority of each bottom-level requirement. This is called traceability.
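Both traceability checks reduce to simple set operations over the parent-child links. The sketch below uses an invented two-level requirement tree to find children with no contractual authority (creep) and top-level requirements that nothing decomposes (scope gaps).

```python
# Hypothetical requirement tree. Top-level (contractual) requirements
# have no parent; each child records the parent it traces to.
top_level = {"A-1", "A-2"}
trace = {          # child: parent
    "B-1": "A-1",
    "B-2": "A-1",
    "B-3": "A-9",  # claimed parent does not exist -- no contractual authority
}

# Requirements creep: children whose claimed parent is not a real requirement.
orphans = [child for child, parent in trace.items() if parent not in top_level]

# Scope gaps: top-level requirements that no child decomposes.
uncovered = [parent for parent in sorted(top_level) if parent not in trace.values()]

print("orphans:", orphans)      # prints orphans: ['B-3']
print("uncovered:", uncovered)  # prints uncovered: ['A-2']
```

Real tools such as DOORS maintain these links across many layers and document versions, but the underlying checks are the same.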

One can also take the requirements database to the next level by adding (or linking to it) a Verification layer. The Verification layer identifies the Test plan that verifies each requirement. A robust database could even contain the test documents and results.

Case Study

A friend, Jim Pannunzio, once worked on a contract for a first-of-its-kind product. Since many requirements had high risk of being unachievable, the contract provided incentive fees for accomplishing the high-risk requirements.

The company had not used the right tools: it had not traced some successful tests back to the requirements, and so had failed to bill the customer for them. This cost the company money. When Jim cross-referenced the verifications to the requirements, he found a number of such requirements. Jim also found requirements that the company had failed to verify, which allowed him to suggest further verification.

By identifying untested requirements and untraced successful verifications, Jim’s attention to detail brought hundreds of thousands of dollars of added revenue to his company and turned the project from an economic failure into a success.
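A cross-reference of the kind Jim performed can be sketched as a comparison between the set of incentive-fee requirements and the set of requirements that completed tests actually verified. All IDs below are invented for illustration.

```python
# Hypothetical data: requirements carrying incentive fees, and the
# requirements each completed test verified.
incentive_reqs = {"R-101", "R-102", "R-103"}
verified_by_test = {
    "T-01": {"R-101"},
    "T-02": {"R-104"},
}

# All requirements covered by at least one completed test.
verified = set().union(*verified_by_test.values())

# Verified incentive requirements: candidates for unbilled revenue.
billable = incentive_reqs & verified
# Unverified incentive requirements: candidates for further verification.
unverified = incentive_reqs - verified

print("billable:", sorted(billable))      # prints billable: ['R-101']
print("unverified:", sorted(unverified))  # prints unverified: ['R-102', 'R-103']
```

The arithmetic is trivial; the value lies in having the trace data recorded in the first place, which is exactly what the project in the case study lacked.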


The tight integration between requirements and verification that a VCRM provides forms a vital link between design and thorough quality control, between contractual provisions and acceptance, and between the buyer’s and seller’s bank accounts.

For more information see:

Scukanec, Stephen. A Day in the Life of a Verification Requirement. Northrop Grumman. 28 January 2008.
Scukanec, Stephen, and James van Gaasbeek. A Day in the Life of a Verification Requirement. Northrop Grumman. 24 October 2007.


Figure 1: Goddard Space Flight Center, Greenbelt, Maryland. National Polar-Orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP), Science Data Segment Requirements Specification, GSFC 429-05-11-01. April 7, 2005. Downloaded 1 June 2011.

Figure 2: Office of Operations, Federal Highway Administration, Department of Transportation. Testing Programs for Transportation Management Systems: A Technical Handbook. Appendix A, Example Verification Cross Reference Matrix. June 20, 2007. Downloaded 1 June 2011.

Figure 3: Engineer Research and Development Center, Corps of Engineers, U.S. Army Topographic Engineering Center, Alexandria, VA. Performance Specification: Improved Position and Azimuth Determining System (IPADS), MIL-PRF-52955C. 8 November 2002. Downloaded 1 June 2011.

Figure 4: Baun, Rich. GLAST Large Area Telescope: LAT Pre-Shipment Review: LAT Level Test Verification Process. Gamma-ray Large Area Space Telescope. 27 April 2006. Downloaded 1 June 2011.

Copyright 2011, 2013, Richard Wheeler -- Permission granted for non-profit or personal use with a link to this post.
