Certification of Safety-Critical Software Under DO-178C and DO-278A
Stephen A. Jacklin1
NASA Ames Research Center, Moffett Field, CA, 94035
The RTCA has recently released DO-178C and DO-278A as new certification guidance for
the production of airborne and ground-based air traffic management software, respectively.
In addition, RTCA Special Committee SC-205 produced five companion documents at the same
time: RTCA DO-248C, DO-330, DO-331, DO-332, and DO-333. These supplements address frequently asked questions about software
certification, provide guidance on tool qualification requirements, and describe the
modifications recommended to DO-178C when using model-based software design, object-oriented
programming, and formal methods. The objective of this paper is first to explain
the relationship of DO-178C to the former DO-178B in order to give those familiar with DO-
178B an indication of what has been changed and what has not been changed. With this
background, the relationship of DO-178C and DO-278 to the new DO-278A document for
ground-based software development is shown. Last, an overview of the new guidance
contained in the tool qualification document and the three new supplements to DO-178C and
DO-278A is presented. For those unfamiliar with DO-178B, this paper serves to provide an
entry point to this new certification guidance for airborne and ground-based CNS/ATM
software certification.
I. Introduction
RTCA DO-178B1 has long been regarded as the premier means of obtaining FAA
certification of software to be used in airborne systems. DO-178B was not intended to be a process guide for
software certification, but rather a description of the high-quality software development processes that should be put
in place to create airborne software that performs its intended function. In principle, if life cycle evidence can be
produced to demonstrate that these processes have been correctly and appropriately implemented, then such
software should be certifiable. The document is maintained by the RTCA (Radio Technical Commission for
Aeronautics, established 1935), which is a private association of over 250 aeronautical organizations, including the
FAA, NASA, DOD, other government agencies, aircraft manufacturers, airline operators, aircraft equipment suppliers,
and various pilot associations.
Seven years ago, the RTCA created special committee 205 (SC-205) to produce a revision of DO-178B to
account for new software development and verification technologies that were deemed immature at the time DO-
178B was written. The new version, DO-178C “Software Considerations in Airborne Systems and Equipment
Certification”2, was released in December 2011. Rather than placing all of the new guidance in DO-178C, the
special committee decided to place the vast majority of the new guidance in six other documents. These documents
were released together with DO-178C. They are:
• RTCA DO-278A3: Software Integrity Assurance Considerations for Communication, Navigation,
Surveillance and Air Traffic Management (CNS/ATM) Systems
• RTCA DO-248C4: Supporting Information for DO-178C and DO-278A
• RTCA DO-3305: Software Tool Qualification Considerations
• RTCA DO-3316: Model-Based Development and Verification Supplement to DO-178C and DO-278A
• RTCA DO-3327: Object-Oriented Technology and Related Techniques Supplement to DO-178C and
DO-278A
• RTCA DO-3338: Formal Methods Supplement to DO-178C and DO-278A
1Aerospace Engineer, Intelligent Systems Division, Mail Stop 269-2, Senior Member AIAA.
American Institute of Aeronautics and Astronautics
[Figure 1: DO-178 and DO-178A shown as predecessor documents to DO-178B (airborne software certification guidance) and DO-278 (ground-based CNS/ATM software certification guidance).]
Figure 1. RTCA airborne and CNS/ATM software certification-related documents introduced in December
of 1992. Dashed lines indicate supplements.
Figure 1 illustrates the functional relationship of airborne and CNS/ATM software certification-related documents
published by RTCA prior to December 2011. DO-178B was a derivative product of DO-178A, DO-178, and other
documents and was released in December 1992. The guidance contained in DO-178B was intended to be applicable
to both airborne and ground-based software development. DO-278 was intended to be a supplemental document to
modify the guidance of DO-178B for CNS/ATM software. Hence, both DO-178B and DO-278 together were to be
referenced for the ground side. DO-248B was an additional supplement that provided no additional certification
guidance, but contained an appendix of frequently asked software certification questions, several discussion papers
of key DO-178B concepts, and the rationale used to create the DO-178B and DO-278 documents. The boxes in
dashed lines indicate supplemental documents that were not intended to be complete in themselves.
Figure 2 illustrates how the new documents introduced by RTCA in December 2011 for DO-178C and DO-278A
relate to each other. In this diagram, the dashed boxes indicate supplemental documents that are not intended to be
used on their own. The supplemental documents modify the guidance contained in DO-178C and DO-278A. On the
airborne side, DO-178C is the key document and it is a direct derivative of DO-178B. On the ground side, DO-278A
is the key document, but it is not a direct derivative of DO-278. Rather, DO-278A combines the guidance of DO-
178C and DO-278 to produce a stand-alone reference for ground-based software verification. For both airborne and
ground-based software, DO-331, DO-332, and DO-333 provide additional guidance for software using model-based
development, object-oriented programming, and formal methods, respectively. These supplements provide
additional guidance for both DO-178C and DO-278A, but need not be used if not applicable. DO-330 is a stand-
alone document containing guidance for tool qualification and is intended to be used not only by persons using tools
to verify software or auto-generate code, but also by persons developing the software tools. The tool developers
need not read DO-178C or DO-278A because the guidance contained in those documents that is relevant for tool
software development is repeated in DO-330.
The purpose of this paper is to provide an overview of the new guidance for safety-critical airborne and ground-
based CNS/ATM software contained in DO-178C, DO-278A, and the other documents. In section II, the similarities
of DO-178C to DO-178B will be presented by reviewing the basics of the DO-178B verification philosophy. In
section III, an overview of the major new guidance contained in DO-178C is presented to highlight what has been
changed. Section IV discusses the relationship of DO-278A developed for ground-based CNS/ATM software to the
guidance presented in DO-178C for airborne software. The remaining sections of the paper provide a discussion of
the new guidance contained in the other documents: section V for DO-248C, section VI for DO-330, section VII for
DO-331, section VIII for DO-332, and section IX for DO-333.
Within the scope of this paper it is not possible to cite all or even most of the guidance contained in the new DO-
178C document set from RTCA. Taken as a whole, the new documents comprise over 1000 pages of new
documentation. The interested reader must download these documents from RTCA9 in order to fully appreciate and
apply the new guidance. This paper provides an entry point for those interested in understanding the scope of these
publications.
[Figure 2: DO-178B and DO-278 (supported by DO-248B) shown superseded by DO-178C (airborne software certification guidance) and DO-278A (ground-based CNS/ATM software certification guidance), supported by DO-248C (FAQ and discussion papers), DO-330 (tool qualification), and the DO-331, DO-332, and DO-333 supplements.]
Figure 2. Relationship of the new RTCA document set for airborne and CNS/ATM software certification
introduced in December 2011.
B. Software Development Plan.
The Software Development Plan identifies the method of software development. It specifies the coding
standards, programming languages, software testing, debugging tools, software development procedures, and the
hardware used to develop and execute the software. As stated by DO-178C (section 4), the purpose is “to choose
requirements development and design methods, tools, and programming languages that limit the opportunity for
introducing errors, and verification methods that ensure that errors introduced are detected”. Tools include things
like compilers, linkers, and V&V tools. Reviews are to be regularly conducted throughout the software development
process to ensure that the Software Development Plan is being followed.
E. Certification.
The certification process described in DO-178C (section 10) is the same as that presented in DO-178B. Software
certification (technically, “approval”) is obtained as the result of the certification authority agreeing that the Plan for
Software Aspects of Certification (PSAC) has been fulfilled. In the United States, the authority responsible for
certifying aircraft flight software is the Federal Aviation Administration (FAA). The PSAC is developed in
collaboration between the software developer’s Designated Engineering Representative (DER) and the FAA. The
same certification liaison process presented in DO-178B is also contained in DO-178C (section 9).
F. Other Similarities.
The sections contained in DO-178C describing the software life cycle (section 3), the software configuration
management plan (section 7), the software quality assurance plan (section 8), the software development standards
(section 4.5), the software design standards (section 11), and the overall verification activities to be performed are
generally the same as those presented in DO-178B.
E. Tool Qualification Levels.
DO-178C states that the tools used to generate software or to verify software must themselves be verified to be
correct. This tool verification process is called qualification. Moreover, a tool such as a compiler qualified for one
project is not necessarily qualified for a different project. DO-178C distinguishes between tools used to
automatically generate software from tools used to automate some portion of the verification process. In general,
DO-178C requires greater scrutiny of software generation tools. DO-178C sets forth five tool qualification levels
(TQL-1 through TQL-5), determined by the software level and by whether the tool generates software, automates
verification in a way that reduces other life cycle activities, or could merely fail to detect software errors. DO-178C
refers the reader to the tool qualification supplement (DO-330) for
specific guidance.
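The TQL determination just described can be sketched as a simple lookup. The table and criteria paraphrases below reflect one reading of the level-and-criteria mapping in DO-178C (Table 12-1) and should be checked against the document itself; they are a reading aid, not normative guidance.

```python
# Sketch of DO-178C tool qualification level selection (Table 12-1 as the
# author reads it -- verify against the document before relying on it).
#
# Criteria 1: the tool's output becomes part of the airborne software
#             (e.g., a code generator), so a tool error could insert a fault.
# Criteria 2: the tool automates verification AND its output is used to
#             justify eliminating or reducing other life cycle activities.
# Criteria 3: the tool could only fail to detect an error within its scope.
TQL_TABLE = {
    "A": {1: "TQL-1", 2: "TQL-4", 3: "TQL-5"},
    "B": {1: "TQL-2", 2: "TQL-4", 3: "TQL-5"},
    "C": {1: "TQL-3", 2: "TQL-5", 3: "TQL-5"},
    "D": {1: "TQL-4", 2: "TQL-5", 3: "TQL-5"},
}

def tool_qualification_level(software_level: str, criteria: int) -> str:
    """Look up the TQL required for a software level (A-D) and tool criteria (1-3)."""
    return TQL_TABLE[software_level][criteria]

# A code generator on a Level A project demands the most rigor; a tool that
# can only fail to find errors demands the least.
print(tool_qualification_level("A", 1))  # TQL-1
print(tool_qualification_level("A", 3))  # TQL-5
```

Note the asymmetry the text describes: development tools (Criteria 1) face greater scrutiny than verification tools (Criteria 3) at every software level.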
IV. RTCA DO-278A: Software Integrity Assurance Considerations for CNS/ATM Systems
DO-278A provides guidance for the production of ground-based, Communication, Navigation, Surveillance, and
Air Traffic Management (CNS/ATM) software, just as DO-178C provides guidance for the production of airborne
software. Because the former DO-278 was intended to be used as a supplement to DO-178B (see Fig. 1), CNS/ATM
software developers were required to be familiar with DO-178B. DO-278 described the additions, deletions, and
modifications to DO-178B that applied to the verification of ground-side software.
In contrast, DO-278A was created by combining DO-178C and DO-278 to make a single, stand-alone document.
As a result, DO-278A may be used without reference to DO-178C. Both DO-178C and DO-278A have the same
section names and use the same section numbers, though some subsections have been added to DO-278A. Many of
the differences between DO-178C and DO-278A are simply changes in terminology,
for example:
• “Software level” in DO-178C was replaced with “assurance level” in DO-278A
• “Certification” authority in DO-178C was replaced with “approval” authority in DO-278A
• “Aircraft” or “airborne system” in DO-178C was replaced with “CNS/ATM system” in DO-278A
• “Adaptation” data in DO-178C was replaced with “parameter” data in DO-278A
• The “Plan for Software Aspects of Certification (PSAC)” in DO-178C is referred to as “Plan for Software
Aspects of Approval (PSAA)” in DO-278A
B. Tool Qualification.
DO-278A contains essentially the same tool qualification guidance as DO-178C. Like DO-178C, DO-
278A requires that software development and verification tools be qualified when the processes described in DO-278A are
eliminated, reduced, or automated by the use of software tools. The main difference is that DO-278A takes into
account the additional assurance level used for CNS/ATM systems. DO-278A refers the reader to DO-330 for an
in-depth discussion of the activities and guidance for tool qualification.
C. Service Experience.
The Product Service History section from DO-178C was expanded and added to DO-278A as the Service
Experience section. This section describes what previous usage of a software product can be counted toward
Table 1. Comparison of DO-178C software levels (airborne software) and DO-278A assurance levels (CNS/ATM software) by software failure effect category.
assurance credit. DO-278A specifies the requirements for receiving credit for product service history. The main
objective is to verify that an equivalent level of safety is provided by the service experience history as would be
otherwise obtained by following the standard DO-278A guidance. Whereas DO-178C identifies flight-hours as a
useful metric for airborne software service experience, DO-278A cites in-service hours as an appropriate metric for
CNS/ATM systems. DO-278A also provides guidance for systems having deactivated code and for those systems
using recovery methods to recover from software or system errors.
D. COTS Software.
DO-278A includes an extensive section on the use of Commercial Off-The-Shelf (COTS) software in CNS/ATM
systems. This section expands the COTS material presented in DO-278. In DO-278A, COTS software is software
sold by commercial vendors and used without modification or further development. Any software
needed to integrate COTS software into a CNS/ATM system (e.g., wrapper code) is approvable only if it is
developed in a manner that fulfills all the objectives of DO-278A.
The guidance provided by DO-278A for COTS software aims to ensure that the level of confidence in COTS
software is the same as that for software developed according to the standard guidance provided in DO-278A. In
order to identify any software development weaknesses of COTS software, DO-278A recommends that a gap
analysis be performed to identify the extent to which the objectives of DO-278A can be demonstrated to have been
achieved by the COTS software. An assurance plan should be developed to specify how the gaps will be satisfied for
the assurance level sought. DO-278A recommends that a COTS software integrity assurance case be developed that
provides a rationale for demonstrating that the software meets its requirements through a rigorous presentation of
claims, arguments, evidence, assumptions, justifications, and strategies. As such, COTS software must essentially be
shown to meet all the objectives of DO-278A. DO-278A presents an extensive explanation of the software planning,
objectives, activities, acquisition, verification, configuration management and quality assurance processes and
objectives in Section 12 and in the tables of Annex A.
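As a reading aid, the gap analysis described above amounts to a set difference between the objectives DO-278A requires for the assurance level sought and those the COTS evidence already demonstrates. The objective IDs below are invented for illustration:

```python
# Hypothetical COTS gap analysis sketch. The objective IDs are invented; a
# real analysis would enumerate the DO-278A Annex A objectives applicable to
# the assurance level sought.
required_objectives = {"A-1.1", "A-3.2", "A-5.4", "A-6.3", "A-7.1"}
demonstrated_by_cots_evidence = {"A-1.1", "A-5.4"}

# Each remaining gap must be closed in the assurance plan (for example, by
# additional verification, service experience credit, or wrapper code).
gaps = sorted(required_objectives - demonstrated_by_cots_evidence)
print(gaps)  # ['A-3.2', 'A-6.3', 'A-7.1']
```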
DO-248C provides a wealth of discussion papers that contain explanatory information supporting the guidance
found in DO-178C and DO-278A. Those who worked on SC-205 will recognize that these discussion papers
encapsulate the great debates held during the formulation of DO-178B and DO-178C. Discussion papers were the
primary means SC-205 members used to facilitate discussion of proposed changes to DO-178B. Most are short (1-2
page) documents that describe the supporting rationale for a proposed change. Literally hundreds of discussion
papers were written over the course of the project.
DO-248C also presents an appendix of 84 frequently asked questions and answers. Examples are:
• Does the DO-178C/DO-278A definition of COTS software include option-selectable software?
• What needs to be considered when performing structural coverage at the object code level?
• How can all Level D (AL-5) objectives be met if low-level requirements and source code are not required?
The last section of DO-248C presents 11 rationale arguments, one discussing the intended use of each section in
DO-178C (2 through 12) plus a rationale for the creation of the supplements to DO-178C and DO-278A.
It is important to note that while DO-248C is interesting and useful, it does not provide any additional
certification or approval guidance for airborne or ground-based CNS/ATM software. It provides a large quantity of
explanatory material and a record of the great arguments and rationale developed while writing the new guidance.
A. Model Requirements.
DO-331 makes a distinction between requirements for specification models and requirements for design models.
Specification models use high-level software requirements to state model functional, performance, interface, and
safety characteristics. Design models use primarily low-level requirements and software architecture specifications
to represent internal data structures, data flow, and control flow. In either case, DO-331 requires that the models
specify the configuration items, modeling techniques, model element libraries, interface descriptions, and model
development environment. Traceability between the design model, low-level requirements, and the
model code is required. There should be no model code that cannot be traced back to low-level requirements.
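The trace-completeness requirement above can be sketched as a simple check: collect every code unit reachable from some low-level requirement, then flag the rest. The requirement IDs and unit names below are invented:

```python
# Hypothetical traceability check: every model code unit must trace back to
# at least one low-level requirement (LLR). IDs and names are invented.
trace_links = {
    "LLR-101": {"speed_filter", "speed_limiter"},
    "LLR-102": {"alt_hold_loop"},
}
code_units = {"speed_filter", "speed_limiter", "alt_hold_loop", "debug_dump"}

traced_units = set().union(*trace_links.values())
untraced = code_units - traced_units
# An untraced unit is a finding: either extraneous code to be removed, or a
# missing low-level requirement to be written.
print(sorted(untraced))  # ['debug_dump']
```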
C. Model Simulation.
In order to obtain certification credit for simulation, DO-331 requires that the applicant clearly show what
reviews and analyses are needed to satisfy the model verification objectives. The analyses must show that simulation
cases exist for each model requirement and that these simulations address both normal range and robustness test
inputs. DO-331 encourages verification of the executable object code to be performed primarily by testing in the target
computer environment. The objective of model simulation is to verify that the model satisfies the requirements used
to create it and to gather evidence that the model is accurate, consistent, and compatible with all system-level, high-
level, and low-level requirements.
VIII. RTCA DO-332: Object Oriented Technology and Related Techniques Supplement
RTCA DO-332 was written to provide guidance on the use of object oriented programming languages that use
concepts such as inheritance, polymorphism, overloading, type conversions, exception management, dynamic
memory management, virtualization, and other concepts that were not in common usage at the time DO-178B was
written. The DO-332 supplement is very well written and includes much explanatory text concerning the basic
features of object-oriented programming.
DO-332 identifies the additions, modifications, and deletions to be made to the DO-178C (or DO-278A)
objectives and activities when object-oriented techniques are used in airborne or ground-based software. Annex A
of DO-332 contains modifications to the verification activities specified in DO-178C, while Annex C presents
modifications to the verification activities specified in DO-278A. Annex D provides a discussion of vulnerabilities
associated with the use of object-oriented technologies. The highlights of the new guidance are presented below.
C. Vulnerability Analysis.
DO-332 presents a vulnerability analysis discussion in Annex D. The purpose is to describe the complications
that may arise with the use of object-oriented technologies. These special problems are associated with inheritance,
parametric polymorphism, overloading, type conversion, exception management, dynamic memory management,
and virtualization. Examples of the extensive verification guidance provided by the annex include:
• With regard to inheritance, demonstration of type consistency by verifying that each subclass is
substitutable for its superclass is recommended. Every verification activity performed on the superclass should
also be performed on the subclass.
• For software having parametric polymorphism, verification activities should show that operations acting on
substituted parameters implement the intended semantic behavior. Each unique instantiation of a parameterized
type or combination of types should be verified.
• To minimize the problems associated with overloading, explicit type conversion should be used to
reduce overloading ambiguity. Verification activities should ensure that type conversions (implicit and explicit)
are safe and that all implications are understood.
• It is recommended that a strategy be developed to handle all exceptions, such as range checks, bounds checks,
divide-by-zero checks, or checks on post-conditions. It is desired that all code modules handle exceptions in the
same way.
• It is recommended that an automatic method of memory reclamation be provided instead of relying on the
correct use of malloc() and free() for dynamic memory management.
• It is advised that worst-case execution timing analysis be performed considering all in-code dynamic memory
operations. Separate threads used for memory management (e.g., garbage collection) should be considered
part of the task load.
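The inheritance guidance above (each subclass substitutable for its superclass, with superclass verification activities repeated on the subclass) can be illustrated with a short, invented sketch: a verification case written against the superclass contract is re-executed against each subclass.

```python
# Invented illustration of subclass substitutability: verification cases
# written for the superclass are re-executed against each subclass.
class Sensor:
    """Superclass with a simple behavioral contract on read()."""
    def read(self) -> float:
        return 0.0

class CalibratedSensor(Sensor):
    """Subclass that must still honor the Sensor contract."""
    def __init__(self, offset: float):
        self.offset = offset
    def read(self) -> float:
        return 0.0 + self.offset

def sensor_verification_case(sensor: Sensor) -> bool:
    """A verification case expressing Sensor's contract: read() returns a
    float within the assumed operating range."""
    value = sensor.read()
    return isinstance(value, float) and -100.0 <= value <= 100.0

# Run the superclass case on the superclass AND on every subclass.
for unit in (Sensor(), CalibratedSensor(offset=1.5)):
    assert sensor_verification_case(unit), type(unit).__name__
print("substitutability cases passed")
```

A subclass that, say, returned out-of-range values would fail the inherited case, flagging a substitutability violation before deployment.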
Annex D also discusses activities for verification of traceability, structural coverage, component-based
development, memory management, and timing for object-oriented programs. Though procedural programming
techniques require verification of these as well, object-oriented programming requires additional analyses. DO-332
recommends verifying traceability between the requirements of a subclass and the requirements of all of its
superclasses to ensure type substitutability. Traceability should be shown between object code and the source code if
multiple dynamic dispatches are possible through a call point. Detailed discussion of these points and many others
are presented in the annex of DO-332.
A. Formal Models.
DO-333 considers a formal model to be an abstract representation of certain aspects of a system (or code) for
which the model notation uses precise, unambiguous, and mathematically defined syntax and semantics. The models
may be graphical (e.g., state machine diagrams), differential equations, or computer languages. Because formal
notations are precise and unambiguous, they can be used to assist verification by helping show accuracy and
consistency in the representation of requirements and life cycle data. DO-333 does not require all of the
requirements of a formal model to be formally expressed. However, if the high-level requirements and low-level
requirements are both formally modeled, then formal analysis can be used to show compliance. DO-333 defines
formal analysis as the use of mathematical reasoning to guarantee that properties are always satisfied by a formal
model.
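To make the distinction from testing concrete, here is a small, invented example of exhaustive property checking over a finite state-machine model: because every transition is enumerated, the property is shown to hold for the whole model rather than for sampled inputs. (Industrial formal analysis uses model checkers or proof tools; this enumeration only illustrates the idea.)

```python
# Invented finite state-machine model: (state, event) -> next state.
transitions = {
    ("idle", "arm"): "armed",
    ("armed", "fire"): "active",
    ("armed", "disarm"): "idle",
    ("active", "reset"): "idle",
}

def reachable_states(start="idle"):
    """Enumerate every state reachable from the start state."""
    seen, frontier = {start}, [start]
    while frontier:
        state = frontier.pop()
        for (src, _event), dst in transitions.items():
            if src == state and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

# Property: the system can only become "active" from the "armed" state.
# Checking every transition guarantees this for the model as a whole.
into_active = [src for (src, _e), dst in transitions.items() if dst == "active"]
assert all(src == "armed" for src in into_active)
print(sorted(reachable_states()))  # ['active', 'armed', 'idle']
```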
D. Coverage Analysis.
DO-333 discusses the ways in which formal analysis may be used to satisfy the coverage requirements of DO-
178C and DO-278A. This guidance states that when any low-level testing is used to verify that low-level
requirements for a software component are satisfied, then the DO-178C guidance for structural coverage analysis
should be followed. When only formal methods are used to verify that low-level requirements are satisfied, then the
guidance in DO-333 (section 6) applies. The supplement notes that although it is possible to use a mixture of testing
and formal analysis to show that verification evidence exists for all high-level and low-level requirements, no known
precedent for doing so exists. In this case, the supplement permits certification authorities to approve software coverage if it can
be demonstrated by a combination of methods that structural and requirements-based coverage have been achieved.
DO-333 requires that all assumptions made during the formal analysis are verified. It should be demonstrated
that for all input conditions, the required output has been specified; and likewise, for all outputs, the required input
conditions have been specified. Analysis test cases should provide evidence that the formal analysis achieves the
required coverage level. All code structures must be shown to be covered by either formal or procedural analyses.
DO-333 states that when a combination of formal methods and testing are used to assess coverage, functional
(requirements-based) tests executed on the target hardware should always be done to ensure that the software in the
target computer will satisfy the high-level requirements.
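A minimal sketch of the testing side of this combined-evidence argument: run requirements-based tests on a unit while recording which lines execute, then examine the structural coverage achieved. The function and the "requirements" are invented; real structural coverage analysis is performed with qualified tools at the statement, decision, or MC/DC level the software level demands.

```python
# Invented sketch: measure line-level structural coverage achieved while
# running requirements-based tests, using Python's tracing hook.
import sys

executed = set()

def tracer(frame, event, arg):
    # Record (function name, line number) for every executed line.
    if event == "line":
        executed.add((frame.f_code.co_name, frame.f_lineno))
    return tracer

def clamp(value, lo, hi):  # unit under test
    if value < lo:
        return lo
    if value > hi:
        return hi
    return value

sys.settrace(tracer)
assert clamp(-5, 0, 10) == 0    # "requirement": below-range input is clamped up
assert clamp(15, 0, 10) == 10   # "requirement": above-range input is clamped down
assert clamp(5, 0, 10) == 5     # "requirement": in-range input passes through
sys.settrace(None)

# Structural coverage evidence: which lines of clamp() did the tests reach?
clamp_lines = {ln for (fn, ln) in executed if fn == "clamp"}
print(f"distinct executed lines in clamp: {len(clamp_lines)}")
```

Uncovered lines would indicate either missing requirements-based tests or code with no requirement, exactly the kind of finding structural coverage analysis exists to surface.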
X. Conclusion
This paper provided an overview of the new certification guidance contained in RTCA DO-178C, DO-278A,
DO-330, and the related supplemental documents created by RTCA SC-205 for the development of safety-critical
airborne and CNS/ATM ground-based software. The objective of this paper was to help those not familiar with the
new DO-178C documentation set to gain an appreciation for the scope of the information contained in the nearly
1000 pages of new guidance material from RTCA. A review of the DO-178B software verification guidance was
presented prior to discussing the new material introduced in DO-178C and DO-278A. Following this, an overview
of the new content contained in DO-178C for airborne software verification and in DO-278A for ground-based
CNS/ATM software verification was discussed. Then the highlights of new guidance contained in the other
documents supporting DO-178C and DO-278A were presented in subsequent sections. These other documents are:
• RTCA DO-248C4: Supporting Information for DO-178C and DO-278A
• RTCA DO-3305: Software Tool Qualification Considerations
• RTCA DO-3316: Model-Based Development and Verification Supplement to DO-178C and DO-278A
• RTCA DO-3327: Object-Oriented Technology and Related Techniques Supplement to DO-178C and
DO-278A, and
• RTCA DO-3338: Formal Methods Supplement to DO-178C and DO-278A.
Although within the scope of this paper it was not possible to present every detail of the new guidance, it is
hoped that the summary information contained herein will stimulate interest in these publications. The reader
requiring specific information must download these documents from RTCA9 in order to fully appreciate and apply
the new guidance.
Acknowledgments
The new DO-178C, DO-278A, and companion documents are the work of RTCA Special Committee 205 (SC-
205). Although the first credit goes to the RTCA staff, the organizing work of the SC-205 leadership team also
deserves special recognition. The author especially wishes to acknowledge the effort and devotion of SC-205 chairs,
Jim Krodel (Pratt & Whitney) and Gerard Ladier (Airbus), and executive committee members Barbara Lingberg
(FAA-CAST Chair), Mike DeWalt (FAA), Leslie Alford (Boeing), Ross Hannan (Sigma Associates), Jean-Luc
Delamaide (EASA), John Coleman (Dawson Consulting), Matt Jaffe (ERAU), and Todd White (L-3
Communications/Qualtech). Also deserving special recognition are the leads and co-leads of the subgroups: Ron
Ashpole (Silver Atena), Ross Hannan (Sigma Associates), Frederic Pothon (ACG Solutions), Leanna Rierson
(Digital Safety Consulting), Pierre Lionne (EADS APSYS), Mark Lillis (Goodrich GPECS), Herve Delseny
(Airbus), Jim Chelinni (Verocel), Duncan Brown (Rolls-Royce), Kelly Hayhurst (NASA), David Hawkens
(NATS), and Don Heck (Boeing). Several others who chaired these committees prior to publication of DO-178C
and the SC-205 committee members themselves whose names are far too numerous to mention are cited in
Appendix A of DO-178C (including the author of this paper).
The author’s support of RTCA SC-205 was provided by the NASA Aviation Safety Program, the NASA
Intelligent Resilient Aircraft Control (IRAC) Project, and the NASA System-Wide Safety and Assurance
Technologies (SSAT) Project.
References
1RTCA DO-178B, “Software Considerations in Airborne Systems and Equipment Certification,” December 1992.
2RTCA DO-178C, “Software Considerations in Airborne Systems and Equipment Certification,” December 2011.
3RTCA DO-278A, “Software Integrity Assurance Considerations for Communication, Navigation, Surveillance and Air Traffic Management (CNS/ATM) Systems,” December 2011.
4RTCA DO-248C, “Supporting Information for DO-178C and DO-278A,” December 2011.
5RTCA DO-330, “Software Tool Qualification Considerations,” December 2011.
6RTCA DO-331, “Model-Based Development and Verification Supplement to DO-178C and DO-278A,” December 2011.
7RTCA DO-332, “Object-Oriented Technology and Related Techniques Supplement to DO-178C and DO-278A,” December 2011.
8RTCA DO-333, “Formal Methods Supplement to DO-178C and DO-278A,” December 2011.
9RTCA, www.rtca.org