DO-178C Workflow With Qualified Code Generation
Requirements from which the Model is developed may be a textual description, a Specification Model, or a Design Model. A Design Model is a model that defines any software design data such as low-level requirements, software architecture, algorithms, component internal data structures, data flow and/or control flow; a model used to generate Source Code is a Design Model. The Design Model feeds the Software Coding Process, which produces the Source Code.

Traceability: Simulation, Processor-in-the-Loop and Hardware-in-the-Loop test cases trace to the High-Level Requirements (HLR); the Low-Level Requirements (LLR) and the Source Code trace to the HLR.

Design constraints (equivalence classes, boundary values, derived requirements) are verified on the Design Model with Simulink Design Verifier:
• (FM-A) Automatic Test Case Generation
• (FM-B) Design Error Detection
• (FM-C) Property Proving

Objectives addressed:
• DO-331: Table MB.A-3 and MB.C-3, Verification of Requirements Process (Obj 2, 4 and 7)
• DO-331: Table MB.A-4 and MB.C-4, Verification of Design Process (Obj 2, 4, 7, 9, 11)
• DO-333: Table FM.A-3 and FM.C-3, Verification of Requirements Process (Obj 8 to 11)
• DO-333: Table FM.A-4 and FM.C-4, Verification of Design Process (Obj 14 to 17)

Note: Formal methods make it possible to detect errors in the Model, including dead logic, integer overflow, division by zero, out-of-bound access, and violations of design properties and assertions; the sketch below illustrates these error classes.
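The following C sketch is illustrative only (the function, its types and its bounds are hypothetical, not taken from the workflow); it shows the kinds of defects that design error detection and property proving can find without test vectors:

    /* Illustrative sketch: hypothetical function containing the error
       classes named above (dead logic, division by zero, overflow). */
    #include <stdint.h>

    int32_t scale(int32_t num, int32_t den, uint8_t mode)
    {
        int32_t out;

        if (mode > 255U) {
            out = -1;          /* dead logic: a uint8_t value can never exceed 255 */
        } else if (den != 0) {
            out = num / den;   /* division by zero ruled out by the guard          */
        } else {
            out = num / den;   /* division by zero: reachable whenever den == 0    */
        }

        return out * 2;        /* integer overflow: 2 * out can exceed INT32_MAX   */
    }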
Simulation testing of the Model (with Simulation Test Cases Traceability and Testing Environment Settings):
• DO-331: Table MB.A-3 and MB.C-3, Verification of Requirements Process (Obj 10)
• DO-331: Table MB.A-4 and MB.C-4, Verification of Design Process (Obj 16)

Automatic Code Generation:
• DO-331: Table MB.A-5 and MB.C-5, Verification of Coding and Integration Process (Obj 3 to 6)
• DO-332: Table OO.A-5 and OO.C-5, Verification of Coding and Integration Process (Obj 3 to 6)

Automatic Code Inspection:
• DO-331: Table MB.A-5 and MB.C-5, Verification of Coding and Integration Process (Obj 2, 3, 6)
• DO-332: Table OO.A-5 and OO.C-5, Verification of Coding and Integration Process (Obj 2, 3, 6)

Model verification activities additionally address DO-331 Table MB.A-3, Verification of Requirements Process (Obj 1 to 7), and Table MB.A-4, Verification of Design Process (Obj 1 to 6, 8 to 12).

Traceability: SIL and PIL test cases trace to the LLR and HLR.

Generated artifacts:
• DO-331 MB.A-4 Simulation Results Report
• DO-331 MB.A-4 Model Coverage Report
• DO-331 MB.A-5 Source Code Report (Simulink Report Generator)
• DO-331 MB.A-5 Coding Standards Report
• DO-331 MB.A-5 Code Inspection Report
• DO-331 MB.A-6 Code Coverage Report
• DO-333 FM.A-6 Run-Time Error Report
• DO-331 MB.A-7 Low Level Test Cases
• DO-331 MB.A-7 EOC Test Results Report
Prove Absence of Run-Time Errors on the Source Code with Polyspace Code Prover (Verification Objectives Settings). Formal analysis of the Source Code addresses:
• DO-333: Table FM.A-5 and FM.C-5, Verification of Coding and Integration Process (Obj 2, 3, 6 and 10 to 13)
• DO-333: Table FM.A-6 and FM.C-6, Testing of Outputs of Integration Process (Obj 1 to 4)
• DO-333: Table FM.A-7 and FM.C-7, Verification of Verification Process Results (Obj 1, 2 and 5 to 10)

Results are color coded on the Source Code, and a range-data tool tip shows the computed value ranges (for example: variable 'i' (int32): [0 .. 99]; assignment of 'i' (int32): [1 .. 100]). Orange marks an unproven operation that may be unsafe for some conditions; purple marks a violation of the MISRA C code rules. For example:

    *p = 0;
    p++;

    i = get_bus_status();
    if (i >= 0) {
        *(p - i) = 10;   /* orange: unproven, may be unsafe for some conditions */
    }
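As a minimal self-contained sketch (the buffer, the surrounding loop, and the declaration of get_bus_status() are assumptions added here so the fragment compiles; they are not part of the workflow), the analysis above can be read as follows:

    /* Hedged sketch: a compilable version of the fragment above.
       buffer, its size, and get_bus_status() are assumptions. */
    extern int get_bus_status(void);   /* assumed: may return any int value */

    static int buffer[100];

    void update_bus_buffer(void)
    {
        int  i;
        int *p = buffer;

        for (i = 0; i < 100; i++) {    /* tool tip: 'i' is [0 .. 99] here, i++ yields [1 .. 100] */
            *p = 0;                    /* in-bounds write: p stays within buffer[0..99]          */
            p++;
        }

        i = get_bus_status();
        if (i >= 0) {                  /* after the loop, p points one past the end of buffer    */
            *(p - i) = 10;             /* orange: in bounds only when 1 <= i <= 100, unproven    */
        }
    }

Unless the return value of get_bus_status() is constrained in the analysis, the write stays orange: i == 0 or i > 100 would put p - i outside the buffer.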
Software in the Loop (SIL) Testing:
• Reuse of the Simulation Test Cases (Simulink Test)
• Adding EOC (Executable Object Code) specific Test Cases
• Code Coverage with Simulink Coverage
• DO-178C: Table A-7, Verification of Verification Process Results (Obj 5 to 7)
• DO-331: Table MB.A-7, Verification of Verification Process Results (Obj 5 to 7)
The Compiler builds the Source Code into the Executable Object Code, on which Processor and Hardware in the Loop (PIL and HIL) Testing is performed.
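In this workflow the reuse of simulation test cases is automated by Simulink Test; purely as an illustration of the underlying idea, a back-to-back (model vs. generated code) low-level test case conceptually does something like the following, where model_initialize(), model_step() and the root I/O variables are hypothetical names for the generated interfaces:

    /* Hypothetical SIL back-to-back check: replay the simulation inputs
       through the generated step function and compare against the logged
       simulation outputs within a tolerance. All names are assumptions. */
    #include <math.h>
    #include <stdio.h>

    #define N_STEPS 100
    #define TOL     1e-6

    extern void   model_initialize(void);
    extern void   model_step(void);
    extern double model_U_in;    /* root inport of the generated code  */
    extern double model_Y_out;   /* root outport of the generated code */

    int run_sil_case(const double sim_in[N_STEPS], const double sim_out[N_STEPS])
    {
        int failures = 0;

        model_initialize();
        for (int k = 0; k < N_STEPS; k++) {
            model_U_in = sim_in[k];                      /* same stimulus as in simulation */
            model_step();                                /* one step of the generated code */
            if (fabs(model_Y_out - sim_out[k]) > TOL) {  /* compare against simulation log */
                printf("step %d: got %g, expected %g\n", k, model_Y_out, sim_out[k]);
                failures++;
            }
        }
        return failures;   /* 0 means the generated code matches the simulation results */
    }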
Structural coverage objectives (DO-178C: Table A-6, Testing of Outputs of Integration Process, Obj 1 to 4; Table A-7, Verification of Verification Process Results, Obj 3 and 4):

SW DAL | Statement coverage | Decision coverage | MC/DC      | Data and control coupling
A      | 100% (Ind)         | 100% (Ind)        | 100% (Ind) | 100% (Ind)
B      | 100% (Ind)         | 100% (Ind)        |            | 100% (Ind)
C      | 100%               |                   |            | 100%

(Ind) = achieved with independence.
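For context (an illustration, not part of the table above): for a decision built from two conditions, decision coverage only needs the decision to take both outcomes, while MC/DC additionally needs each condition to be shown to independently change the outcome, as in this hypothetical guard:

    /* Hypothetical two-condition decision used to contrast the coverage criteria. */
    #include <stdbool.h>

    bool brake_allowed(bool on_ground, bool above_speed_threshold)
    {
        return on_ground && above_speed_threshold;   /* the decision under test */
    }

    /* Decision coverage: two vectors suffice, e.g.
         (true,  true)  -> true
         (false, true)  -> false
       MC/DC: three vectors, each condition independently toggling the outcome:
         (true,  true)  -> true
         (false, true)  -> false    on_ground shown to affect the outcome
         (true,  false) -> false    above_speed_threshold shown to affect the outcome */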
SW Tool Qualification Criteria:

SW Level | Criteria 1 | Criteria 2 | Criteria 3
A        | TQL-1      | TQL-4      | TQL-5
B        | TQL-2      | TQL-4      | TQL-5
C        | TQL-3      | TQL-5      | TQL-5
D        | TQL-4      | TQL-5      | TQL-5

Tool Criteria Definition:
1: A development tool whose output is part of the resulting software and thus could insert an error.
2: A verification tool that automates verification process(es) and thus could fail to detect an error, and whose output is used to justify the elimination or reduction of:
   - verification process(es) other than that automated by the tool, or
   - development process(es) that could have an impact on the airborne (or CNS/ATM) software.
3: A verification tool that automates verification process(es) and thus could fail to detect an error.

When a generation tool is not qualified, its outputs can be checked by a qualified verification tool: the unqualified generation tool takes its tool setup and input files and produces output files (the data under verification); the qualified verification tool, with its own tool setup and inputs (*), verifies those output files and produces the verification results. (*) Data needed to verify the unqualified tool output files; these data may include the unqualified tool input files.

DO-178C supplements:
• DO-330 for Tool Qualification
• DO-331 for Model-Based Design
• DO-332 for Object Oriented Techniques
• DO-333 for Formal Methods

DO Qualification Kit:
• Tool Requirements, User Manual and other MathWorks documentation
• Workflow documentation and Tool Qualification Plan templates
• Verification inputs: Test Cases and Expected Results
• Configuration Inputs and Artifacts
Effort Distribution in Traditional Development Workflows vs Model-Based Design Workflows: both charts break the effort into Specifications, Requirements Validation, Unit Design & Implementation (C, C++, HDL, …), and Unit Verification. Customer quotes claim a total effort reduction of around 30% with Model-Based Design.