Checking Model Validity and Verification of Models
Subject: Modeling and Simulation
Presented by: Najam

Validation vs Verification
• Validation: Ensures that a product or system meets the needs of its users and stakeholders. Validation is a dynamic process that involves testing and user feedback to confirm that the product serves its intended purpose.
• Verification: Ensures that a product or system complies with requirements, specifications, or regulations. Verification is a static process that involves reviewing and analyzing documentation and design without executing any code.

Model Validity
• Checking model validity in simulation is the process of confirming that a simulation model accurately represents a real system and its behavior. This is done by comparing the model's simulations to an independent data set that was not used to estimate the model's parameters.

Steps in Model Validation
• 1. Define objectives: What is the model intended to achieve?
• 2. Data collection: Gather data for comparison.
• 3. Develop metrics: Define quantitative measures for validation.
• 4. Testing: Compare model outputs against real-world data.
• 5. Iterative refinement: Update the model based on discrepancies.

Steps for Validating a Simulation Model
• Define the model's output variables: Identify the variables the model will output.
• Set an acceptable range of accuracy: Determine the range of accuracy that is acceptable for each variable.
• Compare to experimental data: Compare the model's simulated data to observed data points.
• Use graphical comparisons: Graph the behavior of the model and the system under different experimental conditions.
• Use confidence intervals: Calculate confidence intervals for the differences between the model's and the system's output variables.
• Use hypothesis tests: Statistically compare the model's and the system's output variables to determine whether the model's accuracy is acceptable.

Types of Model Validation
• 1. Face validation: Subject matter experts review the model.
• 2. Predictive validation: Test the model's ability to predict future outcomes.
• 3. Historical data validation: Compare outputs with past data.
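The confidence-interval and hypothesis-style checks listed in the validation steps can be sketched as follows. This is a minimal illustration using only the Python standard library; the observed and simulated samples and the ±0.5 tolerance are assumed, illustrative values, not data from a real study.

```python
# Hedged sketch: validating a simulation model's output against observed
# system data via a confidence interval on the difference of means.
# All numbers below are illustrative assumptions.
import math
import statistics

observed = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]   # real-system measurements
simulated = [10.0, 10.4, 9.7, 10.2, 10.1, 9.8]  # model outputs

n1, n2 = len(observed), len(simulated)
mean_diff = statistics.mean(observed) - statistics.mean(simulated)

# Standard error of the difference between the two sample means
se = math.sqrt(statistics.variance(observed) / n1 +
               statistics.variance(simulated) / n2)

# Approximate 95% confidence interval for the difference (z ~ 1.96)
ci = (mean_diff - 1.96 * se, mean_diff + 1.96 * se)

# If the whole interval lies inside the acceptable-accuracy range
# (here an assumed tolerance of +/-0.5 units), the model passes this check.
tolerance = 0.5
passes = -tolerance <= ci[0] and ci[1] <= tolerance
print(f"mean difference: {mean_diff:.3f}, "
      f"95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
print("within tolerance" if passes else "outside tolerance")
```

With larger samples and a formal two-sample t-test (e.g. `scipy.stats.ttest_ind`), the same comparison doubles as the hypothesis-test step.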
• Model validity is different from model verification, which is the process of checking that a model is free of errors and works as intended. Both validation and verification are ongoing activities that should be performed throughout the model development and analysis stages.

Techniques for Model Validation
• Sensitivity analysis: Assess how changes in inputs affect outputs.
• Comparison with benchmarks: Use established models as a reference.
• Field testing: Test the model in real-world scenarios.
• Statistical testing: Use statistical methods to confirm accuracy.

Importance of Model Verification
• Model verification and validation (V&V) help ensure that models are correct and reliable and can be used to make accurate predictions.
• Accuracy: V&V ensures that the model's outputs are accurate.
• Reliability: V&V helps assess the model's robustness and reliability.
• Predictive accuracy: V&V helps quantify the predictive accuracy of model calculations.
• Decision-making: V&V gives decision-makers the information they need to make high-consequence decisions.
• Risk reduction: V&V can reduce the risk of errors by identifying issues with the model or data.

Steps in Model Verification
• Conceptual model validation: Determines whether the model's assumptions and theories are correct, and whether the model's structure, logic, and mathematical relationships are reasonable for its intended purpose.
• Operational validation: Ensures that all connections between the distinct parts of the model are correctly defined.
• Test set verification: An essential step in model building; models trained without verification are unreliable for use with real-world data.
• Model checking: A formal method for software and hardware system verification that checks whether a model of a system satisfies given specifications.

Common Verification Techniques
• Walkthroughs and inspections: Manual code reviews.
• Debugging tools: Identify logical and runtime errors.
• Comparison with analytical solutions: Verify model outputs against known solutions.
• Formal verification: A mathematically rigorous technique that uses static analysis algorithms to prove the correctness of chip designs.
• Theorem proving: A technique that models a system mathematically, specifies the desired properties to be proven, and then performs the verification.
• Sequential equivalence checking (SEC): A technique that verifies that two designs are functionally identical and produce the same outputs when given the same inputs.

Challenges in Validation and Verification
• Lack of accurate real-world data.
• Complexity of real-world systems.
• Subjectivity in validation metrics.
• Computational limitations for detailed testing.

Tools for Validation and Verification
• MATLAB/Simulink: Mathematical modeling.
• Arena: Discrete-event simulation.
• AnyLogic: Multi-method modeling.
• Python/R: Custom statistical and analytical testing.
• Data validation platforms: Astera, Informatica, Talend, Datameer, Alteryx, Data Ladder, Ataccama One.

Conclusion
• Summary: Validation and verification are critical for reliable modeling and simulation.
• Takeaways: Always validate with real-world data and verify implementation accuracy.
• Questions and discussion: Open floor for audience queries.
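As an appendix to the verification techniques discussed earlier, the "comparison with analytical solutions" check can be sketched with a minimal example: a fixed-step Euler simulation of exponential decay compared against its known closed-form solution. The model, parameter values, and tolerance below are illustrative assumptions chosen for the sketch.

```python
# Hedged sketch: verifying a simple simulation against a known analytical
# solution. We simulate exponential decay dy/dt = -k*y with fixed Euler
# steps and compare to the closed form y(t) = y0 * exp(-k*t).
import math

def simulate_decay(y0, k, t_end, dt):
    """Fixed-step Euler simulation of dy/dt = -k*y from t=0 to t=t_end."""
    steps = round(t_end / dt)
    y = y0
    for _ in range(steps):
        y += dt * (-k * y)  # Euler update
    return y

y0, k, t_end = 100.0, 0.5, 2.0          # assumed, illustrative parameters
simulated = simulate_decay(y0, k, t_end, dt=0.0001)
analytical = y0 * math.exp(-k * t_end)   # known exact solution

# Verification check: the simulated output should agree with the
# analytical solution within a small tolerance (Euler error shrinks
# as dt decreases).
error = abs(simulated - analytical)
print(f"simulated={simulated:.4f}, analytical={analytical:.4f}, "
      f"error={error:.5f}")
```

A persistent gap between the simulated and analytical values at small `dt` would point to an implementation error, which is exactly what this verification technique is meant to catch.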