Sometime in the 1990s, I drafted a frequently asked questions list for NASA’s IV&V facility. Here’s what I wrote on the meaning of the terms “validation” and “verification”:
The terms Verification and Validation are commonly used in software engineering to mean two different types of analysis. The usual definitions are:
- Validation: Are we building the right system?
- Verification: Are we building the system right?
In other words, validation is concerned with checking that the system will meet the customer’s actual needs, while verification is concerned with whether the system is well-engineered, error-free, and so on. Verification will help to determine whether the software is of high quality, but it will not ensure that the system is useful.
The distinction between the two terms is largely to do with the role of specifications. Validation is the process of checking whether the specification captures the customer’s needs, while verification is the process of checking that the software meets the specification.
Verification includes all the activities associated with producing high-quality software: testing, inspection, design analysis, specification analysis, and so on. It is a relatively objective process, in that if the various products and documents are expressed precisely enough, no subjective judgements should be needed in order to verify software.
In contrast, validation is an extremely subjective process. It involves making subjective assessments of how well the (proposed) system addresses a real-world need. Validation includes activities such as requirements modelling, prototyping and user evaluation.
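To make the contrast concrete, here is a minimal sketch in Python (my own illustrative example; the function and its “spec” are invented for the purpose, not taken from the original post). Because the specification is precise, verification reduces to objective, automatable checks; the validation question has no such oracle.

```python
def dedupe_keep_order(items):
    """Spec: return the elements of `items` with duplicates removed,
    preserving the order of first occurrence."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:       # first occurrence wins
            seen.add(item)
            result.append(item)
    return result

# Verification: objective checks derived directly from the spec.
assert dedupe_keep_order([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe_keep_order([]) == []

# Validation has no such oracle: "is de-duplication what the user
# actually needed here?" is answered by prototyping and user
# evaluation, not by an assert statement.
```

Note that every line of the test code follows mechanically from the docstring; that mechanical relationship to a precise specification is what makes verification comparatively objective.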
In a traditional phased software lifecycle, verification is often taken to mean checking that the products of each phase satisfy the requirements of the previous phase. Validation is relegated to just the beginning and ending of the project: requirements analysis and acceptance testing. This view is common in many software engineering textbooks, but it is misguided. It assumes that the customer’s requirements can be captured completely at the start of a project, and that those requirements will not change while the software is being developed. In practice, the requirements change throughout a project, partly in reaction to the project itself: the development of new software makes new things possible. Therefore both validation and verification are needed throughout the lifecycle.
Finally, V&V is now regarded as a coherent discipline: “Software V&V is a systems engineering discipline which evaluates the software in a systems context, relative to all system elements of hardware, users, and other software” (from Software Verification and Validation: Its Role in Computer Assurance and Its Relationship with Software Project Management Standards, by Dolores R. Wallace and Roger U. Fujii, NIST Special Publication 500-165).
Having thus carefully distinguished the two terms, my advice to V&V practitioners was then to forget about the distinction, and to think of V&V instead as a toolbox that provides a wide range of tools for asking different kinds of questions about software. The practitioner’s job is to master each tool and figure out when and how to use it. Here’s one of my attempts to visualize the space of tools in the toolbox:
For climate models, definitions that focus on specifications don’t make much sense, because there are no detailed specifications of climate models (nor can there be – they’re built by iterative refinement, much as in agile software development). But no matter – the toolbox approach still works; it just means some of the tools are applied a little differently. An appropriate toolbox for climate modelling looks a little different from my picture above, because some of these tools are better suited to real-time control systems, applications software, and so on, while some tools particular to simulation software are missing from the picture. I’ll draw a better picture when I’ve finished analyzing the data from my field studies of practices used at climate labs.
Many different V&V tools are already in use at most climate modelling labs, but there is room for adding more tools to the toolbox, and for sharpening the existing tools (what and how are the subjects of my current research). But the question of how best to do this must proceed from a detailed analysis of current practices and how effective they are. There seem to be plenty of people wandering into this space, claiming that the models are insufficiently verified, validated, or both. Such people like to pontificate about what climate modellers ought to do differently. But anyone who pontificates in this way, yet is unable to give a detailed account of which V&V techniques climate modellers currently use, is just blowing smoke. If you don’t know what’s in the toolbox already, you can’t really make constructive comments about what’s missing.
Hi there,
Thanks for this very helpful article.
The question I have is: why is there such an emphasis on validation at all? As you said, validation is a very subjective process. What one person considers a “valid” feature may be useless to another, and the perception of “user needs” can vary widely. Companies often release new types of products hoping they will meet consumers’ needs, but no one can predict whether those products will succeed. Whether a product meets users’ needs seems like a criterion for marketing departments to weigh, not something to enforce through a design-oriented process.
To give a concrete example: when the personal computer was created, many ‘experts’ said it was foolish to have a computer in people’s homes. From a validation point of view, you would conclude that the product did not meet end-users’ needs. However, a number of innovative companies such as IBM and Apple released PCs anyway, and the rest is history. A strict V&V assessment might have killed the concept of the personal computer. So why is validation considered so important in the design process?
Mohan: Good question. The short answer is that it depends on what kind of software you’re building. For a lot of consumer-oriented software, as you suggest, validation can be left to the marketplace. Or for a higher success rate, work closely with (potential) users, using an iterative approach where you build and deliver in small increments, and get feedback on what users like and don’t like about it.
But if you’re building safety-critical software, validation is crucial. It turns out the root cause of most software-related accidents is failure to validate: the software did the wrong thing in a situation that the software developers should have anticipated. In other words, they didn’t validate their understanding of the problem domain.