Validating the as-is process model
Best practice indicates that institutions should apply risk-based data validation, which enables the reviewer to consider risks unique to the organisation and the model.
To establish a robust framework for data validation, guidance indicates that the accuracy of source data should be assessed.
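As a concrete illustration of assessing source-data accuracy, the sketch below applies simple rule-based completeness and plausibility checks to incoming records. The field names, value ranges, and rules are illustrative assumptions, not taken from any specific guidance.

```python
# A minimal sketch of rule-based source-data accuracy checks.
# Field names, ranges, and rating categories are hypothetical.

def validate_record(record):
    """Return a list of issues found in a single source record."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for field in ("customer_id", "exposure", "rating"):
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    # Plausibility: values must fall in expected ranges.
    exposure = record.get("exposure")
    if isinstance(exposure, (int, float)) and exposure < 0:
        issues.append("negative exposure")
    rating = record.get("rating")
    if rating not in (None, "") and rating not in {"A", "B", "C", "D"}:
        issues.append(f"unknown rating {rating!r}")
    return issues

def validate_dataset(records):
    """Map record index -> issues, for records failing any check."""
    report = {}
    for i, rec in enumerate(records):
        issues = validate_record(rec)
        if issues:
            report[i] = issues
    return report
```

In a risk-based approach, the set of rules and the severity attached to each failure would be calibrated to the risks the data feeds into, rather than applied uniformly.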
The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all safety-related nuclear facility design, analyses, and operations.
In fact, DNFSB 2002-1 recommends that the DOE and National Nuclear Security Administration (NNSA) perform a V&V process for all safety-related software and analysis.
This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. The Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations.
There has recently been a significant increase in activity to define V&V methods and procedures. The American Society of Mechanical Engineers (ASME) has formed a Standards Committee for the development of V&V procedures for computational solid mechanics models.
A review of the processes also determines whether the models are producing output that is accurate, managed effectively, and subject to the appropriate controls.
An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing.
Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model.
Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use.
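The idea that validation accumulates evidence of sufficient accuracy for an intended use can be sketched as a comparison of model predictions against reference observations, tested against a use-specific error tolerance. The metric (worst-case absolute error) and the tolerance values are illustrative assumptions.

```python
# A minimal sketch of scenario-specific validation: evidence of
# adequacy for one intended use, not proof of general correctness.
# Metric choice and tolerances are illustrative assumptions.

def max_abs_error(predictions, observations):
    """Worst-case absolute disagreement between model and reality."""
    return max(abs(p - o) for p, o in zip(predictions, observations))

def is_valid_for_use(predictions, observations, tolerance):
    """Deem the model adequate *for this scenario* if the worst-case
    error stays within the tolerance set by the intended use."""
    return max_abs_error(predictions, observations) <= tolerance
```

Note that passing this check for one scenario and tolerance says nothing about other scenarios; a different intended use would warrant its own reference data and its own tolerance.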
For example, a poorly designed risk assessment model may result in a bank establishing relationships with clients that present a risk that is greater than its risk appetite, thus exposing the bank to regulatory scrutiny and reputation damage.
A validation should independently challenge the underlying conceptual design and ensure that documentation is adequate to support the model's logic and its ability to achieve the regulatory and business outcomes for which it is designed. Where gaps or limitations are observed, controls should be evaluated to enable the model to function effectively.

Data Validation and Quality Assessment

Data errors or irregularities impair results and might lead to an organisation's failure to identify and respond to risks. This report reviews the concepts involved in such a program.
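One common way to screen a numeric data series for the kind of irregularities that impair results is a simple dispersion-based outlier check, sketched below. The z-score threshold is a widely used convention rather than a prescribed standard, and the choice of threshold is an assumption to be tuned per dataset.

```python
# A minimal sketch of screening a numeric series for irregular
# values before it feeds a model. The threshold is an assumption.

import statistics

def flag_irregular(values, z_threshold=3.0):
    """Return indices of values more than z_threshold population
    standard deviations away from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant series: nothing stands out
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]
```

A caveat worth noting: in small samples a single extreme value inflates the standard deviation and can mask itself, so in practice robust alternatives (e.g. median-based measures) are often preferred.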