Relation to safety standards

Safety standards like ISO 26262, DO-178B, DO-178C, IEC 61508, and EN 50128 require identifying functional and non-functional hazards and demonstrating that the software does not violate the relevant safety goals.

Some non-functional safety hazards are critical for the correct functioning of the system, notably violations of timing constraints in real-time software and software crashes due to runtime errors or stack overflows. Depending on the criticality level of the software, the absence of such hazards has to be demonstrated by formal methods or by testing with sufficient coverage.

DO-178B

Published in 1992, DO-178B (“Software Considerations in Airborne Systems and Equipment Certification”) was the primary document by which certification authorities such as the FAA, EASA, and Transport Canada approved all commercial software-based aerospace systems until the publication of DO-178C.

The purpose of DO-178B is “to provide guidelines for the production of software for airborne systems and equipment that performs its intended function with a level of confidence in safety that complies with airworthiness requirements.”

DO-178B emphasizes the importance of software verification, defined as a technical assessment of the results of both the software development processes and the software verification process.

Section 6.0 states that “verification is not simply testing. Testing, in general, cannot show the absence of errors.” The standard consequently uses the term verify instead of test whenever the verification objectives under discussion are typically satisfied by a combination of reviews, analyses, and tests.

The purpose of the verification process is to detect and report errors that may have been introduced during the software development processes. Removal of the errors is an activity of the software development processes.

The general objective of the verification process is to ensure that the requirements at the system level, the architecture level, the source-code level, and the executable-object-code level are satisfied, and that the means used to satisfy these objectives are technically correct and complete.

At the code level, the more specific objective is to detect and report errors that may have been introduced during the software coding process.

The non-functional safety properties are explicitly listed as a part of the accuracy and consistency verification objective at the code level. This includes stack usage, worst-case execution timing, and absence of runtime errors.

DO-178C and DO-333

Published in 2011, DO-178C is a revision of DO-178B that accounts for progress in software development and verification technologies.

In general, DO-178C aims at providing “guidance for determining, in a consistent manner and with an acceptable level of confidence, that the software aspects of airborne systems and equipment comply with airworthiness requirements.”

It specifically focuses on model-based software development, object-oriented software, the use and qualification of software tools and the use of formal methods to complement or replace dynamic testing (theorem proving, model checking, and abstract interpretation).

Each of these key aspects is addressed by a dedicated supplement which modifies, complements, and completes the DO-178C core document.

In this overview we will specifically focus on the DO-178C core document and DO-333 (Formal Methods Supplement to DO-178C and DO-278A).

aiT, StackAnalyzer and Astrée all include a Safety Manual that describes the qualification data required by DO-330 (Software Tool Qualification Considerations), and how our QSKs and QSCLCD cover these requirements.

System aspects relating to software development

System aspects relevant for software development include functional and operational requirements, performance requirements, and safety-related requirements, including design constraints and design methods such as partitioning (cf. Sec. 2.1). Timing and performance characteristics require special attention since they affect the system-software and the software-hardware boundaries and have to be included in the respective information flows.

The DO-178C defines five software levels (criticality levels) ranging from Level A (most critical) to Level E (least critical). According to Sec. 2.3, only partitioned software components can be assigned individual software levels by the system safety assessment process. Sec. 2.4, Architectural Considerations, states that “if partitioning and independence between software components cannot be demonstrated, all components are assigned the software level associated with the most severe failure condition to which the software can contribute”.

The standard defines partitioning as a “technique for providing isolation between software components to contain and/or isolate faults and potentially reduce the effort of the software verification process”. Among other things, a partitioned software component “should not be allowed to contaminate another partitioned software component's code, input/output, or data storage areas”, and it “should be allowed to consume shared processor resources only during its scheduled period of execution”. Thus, freedom from interference is an important architectural property.

Formal methods in the software life cycle (DO-333)

DO-333 defines formal methods as “mathematically based techniques for the specification, development, and verification of software aspects of digital systems”. It distinguishes three categories of formal analyses: deductive methods (such as theorem proving), model checking, and abstract interpretation. Abstract interpretation, in practice, is “used for constructing semantics-based analysis algorithms for the automatic, static, and sound determination of dynamic properties of infinite-state programs”. The computation of worst-case execution time bounds and of the maximal stack usage are listed as reference applications of abstract interpretation. The importance of soundness is emphasized: “an analysis method can only be regarded as formal analysis if its determination of a property is sound. Sound analysis means that the method never asserts a property to be true when it is not true.”

Software verification process

Like the DO-178B, the DO-178C addresses the incompleteness of testing techniques: “Verification is not simply testing. Testing, in general, cannot show the absence of errors”. Since formal methods are sound, they can completely satisfy some verification objectives, while for others additional verification such as complementary testing may be necessary. Purpose and objective of the software verification process are defined in the same way as in the DO-178B: the purpose is to detect and report errors that may have been introduced during the software development processes; removal of the errors is an activity of the software development processes. The general objectives are to verify that the requirements at the system level, the architecture level, the source-code level, and the executable-object-code level are satisfied, and that the means used to satisfy these objectives are technically correct and complete.

Non-functional software properties can affect the system and the software level, and consequently, they are addressed at all levels of the software verification process. The DO-178C distinguishes between high-level requirements and low-level requirements. High-level requirements are produced directly through analysis of system requirements and system architecture. Low-level requirements are software requirements from which source code can be directly implemented without further information. High-level and low-level requirements blend together when source code is generated directly from high-level requirements.

The software verification process comprises reviews and analyses of the high-level requirements, the low-level requirements, the software architecture, and the source code, and it requires testing or formal analysis of the executable object code. One common verification objective at all levels is to demonstrate compliance with the requirements of the parent level. As the system requirements include performance and safety-related requirements, non-functional aspects like timing or storage usage can impact all stages. Consequently, the “compatibility with the target computer” is a verification objective for the high-level requirements, for the low-level requirements, and at the software architecture level. In Sec. 6.3, the system response time is given as an example of target computer properties relevant for high-level and low-level requirements. Computing the response time requires the worst-case execution time to be known. At the source-code level, the objective accuracy and consistency explicitly includes determining the worst-case execution time, the stack usage, and runtime errors (memory usage, fixed-point arithmetic overflow and resolution, floating-point arithmetic, use of uninitialized variables). All these characteristics can be checked using formal analysis (cf. Sec. FM.6.3.4). Furthermore, data and control flow analysis is required at the software architecture level and the source code level to ensure implementation consistency.

Worst-case execution time and worst-case stack usage have to be considered at the Executable Object Code level, because the impact of the compiler, the linker, and of hardware features on execution time and stack usage has to be assessed. Both can be checked by formal analyses at the Executable Object Code level (cf. Sec. FM.6.7). Runtime errors also have to be addressed at the Executable Object Code level, e.g. to deal with robustness issues like out-of-range loop values and arithmetic overflows (cf. Sec. FM.6.7.b), or to verify the software component integration. The latter implies, e.g., detecting incorrect initialization of variables, parameter passing errors, and data corruption.

At the Executable Object Code level, complementary testing is still required, e.g. to address transient hardware faults or incorrect interrupt handling (cf. Sec. 6.4.3). Formal analysis performed at the source code level can be used for verification objectives at the Executable Object Code level if property preservation between source code and executable code can be demonstrated (cf. Sec. FM.6.7.f).

Considerations for Formal Methods

Regarding the applicability of formal methods, the DO-333 states that “formal methods provide comprehensive assurance of particular properties only for those aspects that are formalized in the formal model, so defining the limits of the model is essential.” Formal analyses at the source code level have to be based on a formal semantics of the source code. For formal analyses performed at the object code level, the object code itself becomes the formal model, with its semantics defined by the behavior of the target hardware. When formal analysis is used to meet a verification objective, it has to be ensured that each formal method used is correctly defined, justified, and appropriate for that verification objective (cf. Sec. FM.6.2.1).

At the Executable Object Code level, coverage analysis depends on whether formal methods are used to complement or to replace dynamic testing methods. When any low-level testing is used to verify low-level requirements of a software component, the entire component is subject to test coverage analysis, consisting of requirements-based coverage analysis and structural coverage analysis. Requirements-based coverage analysis establishes that verification evidence exists for all requirements of the system. Structural coverage analysis is necessary because exhaustive testing is not achievable; it ensures that the testing performed is rigorous and sufficient (cf. Sec. 6.4, Sec. FM.6.7.1 and FM.12.3.5).

When only formal methods are used to verify requirements for a software component, a different coverage analysis has to be performed for that component (cf. Sec. FM.6.7.2).

Verification Objectives

In general, aiT, StackAnalyzer, and Astrée contribute to all objectives related to the target environment. Stack usage, response times, and execution times are determined by the target computer. They can cause violations of high-level requirements, violations of low-level requirements, and incompatibility of the software architecture with the target computer, and they can affect the accuracy and consistency of the source code. The source code analysis of Astrée provides a formal data and control flow analysis, which is required to demonstrate consistency between the software architecture level and the source code level. The main goal of Astrée, detecting runtime errors, is required to deal with robustness issues like out-of-range loop values and arithmetic overflows, and to verify the software integration. The latter implies, e.g., detecting incorrect initialization of variables, parameter passing errors, and data corruption.

The following tables of the DO-178C and the DO-333 list the verification objectives to which aiT, StackAnalyzer, or Astrée can contribute:

Table / Objective       Verification objective                                              aiT / StackAnalyzer / Astrée
FM.A-3 Objective 3      High-level requirements are compatible with target computer         +++
FM.A-4 Objective 3      Low-level requirements are compatible with target computer          +++
FM.A-4 Objective 9      Software architecture is consistent                                 +
FM.A-4 Objective 10     Software architecture is compatible with target computer            +++
FM.A-5 Objective 6      Source code is accurate and consistent                              +++
FM.A-6 Objective 3      Executable Object Code complies with low-level requirements         ++
FM.A-6 Objective 4      Executable Object Code is robust with low-level requirements        ++
FM.A-6 Objective 5      Executable Object Code is compatible with target computer           ++

While the tables in Annex A regard the DO-178C, Annex C contains the equivalent tables regarding DO-278A. They are relevant when using formal methods in the production of software for Communication, Navigation, Surveillance, and Air Traffic Management (CNS/ATM) systems. The applicability of aiT, StackAnalyzer, and Astrée with respect to the tables in Annex C is identical to that for the corresponding tables in Annex A.