What is DO-178C testing?

DO-178C Guidance

DO-178C testing is part of the overall process of complying with the guidance in DO-178C, Software Considerations in Airborne Systems and Equipment Certification. DO-178C is the primary document by which certification authorities such as the FAA, EASA and Transport Canada approve all commercial software-based aerospace systems. More recently, it has also become the de facto approach for the use of software in military avionics systems worldwide.

DO-178C Verification

The overall DO-178C guidance consists of six key areas: planning, development, verification, configuration management, quality assurance and certification liaison. Testing forms a part – but not the whole – of verification. While testing follows development in the software life cycle, verification is really a concurrent process that carries on throughout. The planning stage of DO-178C, for example, requires development of a Software Verification Plan (SVP).

Verification includes:

  • Review - of plans, design artefacts and traceability
  • Testing - to software requirements
  • Analysis - where testing would be either inconclusive or too expensive to be conclusive

Design Assurance Levels

A key provision of DO-178C meant to facilitate cost-effective assurance is the definition of Design Assurance Levels (DAL). These five assurance levels are based on the consequences of potential software failure to the system as a whole and are determined by the system safety assessment process (which precedes the application of DO-178C). The five DALs, which are summarized in Table 1, determine the amount of rigour required in the development and testing of a specific piece of airborne software.

Level | Failure condition | Failure rate | Objectives | With independence
A     | Catastrophic      | ≤ 1x10^-9    | 71         | 33
B     | Hazardous         | ≤ 1x10^-7    | 69         | 21
C     | Major             | ≤ 1x10^-5    | 62         | 8
D     | Minor             | ≤ 1x10^-3    | 26         | 5
E     | No safety effect  | N/A          | 0          | 0

Table 1: DO-178C Design Assurance Levels (DAL)

The “Objectives” column of Table 1 lists the number of DO-178C objectives that must be met by the software in the overall DO-178C process. Some of these objectives are fulfilled in each of the six key areas, with verification fulfilling the greatest number.

The rightmost column of Table 1 indicates how many of these objectives must be met “with independence”, meaning that verification is performed by individuals who did not develop the software item under verification.[1]

The higher the DAL (Level A being the highest, Level E the lowest), the higher the amount of rigour, effort and documentation required when following the guidance in DO-178C.

Requirements management

Traceability forms the foundation of DO-178C development and verification. Each system requirement that will be realized in software must trace down to one or more high-level or derived software requirements, each of which in turn traces to one or more low-level requirements, which then trace to source code. This top-down traceability must be demonstrated in requirements analysis.

It is also necessary to demonstrate bottom-up traceability. All source code must trace to and correctly fulfil low-level software requirements. All low-level requirements must trace to high-level or derived software requirements, and so forth, up to the system requirements.

Likewise, in verification, you must demonstrate the traceability of your test cases to requirements via requirements-based coverage analysis, and to code structure through structural coverage analysis. Test results, traceability data and coverage data together show that all implemented functionality traces back to requirements and all dead code has been eliminated.
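The bidirectional trace checks described above reduce to simple set relations between artefact layers. As a rough illustration (the dictionaries, requirement IDs and file names below are invented for the example, not part of DO-178C), a tooling sketch might flag untraced items in both directions:

```python
# Hypothetical trace data: each mapping links a child artefact to its parent
# requirements. IDs are invented; real projects export these from an RM tool.
hlr_to_sys = {"HLR-1": {"SYS-1"}, "HLR-2": {"SYS-2"}}          # high-level reqs -> system reqs
llr_to_hlr = {"LLR-1": {"HLR-1"}, "LLR-2": {"HLR-2"}, "LLR-3": set()}
code_to_llr = {"uart.c:tx_frame": {"LLR-1"}, "uart.c:rx_frame": {"LLR-2"}}

def untraced(children):
    """Bottom-up check: child artefacts that trace to no parent requirement."""
    return sorted(k for k, parents in children.items() if not parents)

def uncovered(parents, children):
    """Top-down check: parent requirements never referenced by any child."""
    referenced = set().union(*children.values()) if children else set()
    return sorted(set(parents) - referenced)

print(untraced(llr_to_hlr))                # LLR-3 has no parent: broken bottom-up trace
print(uncovered(llr_to_hlr, code_to_llr))  # LLR-3 realized by no code: broken top-down trace
```

In a real project the same checks would run over every layer, from system requirements down to source code and test cases.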


Test coverage

DO-178C calls for significantly more software testing and, consequently, more test documentation as the criticality level of the software increases. No testing is required at Level E, since Level E software has no impact on safety. Testing to the software’s requirements forms the basis of DO-178C verification at Level D. Additional coverage requirements are added at each higher assurance level. At Levels C and above, for example, robustness testing must show that the software displays no untoward behaviour in the event of abnormal inputs or conditions. Table 2 summarizes the test coverage guidance for each Design Assurance Level.

DAL     | Coverage requirements
Level E | No coverage
Level D | 100% requirements coverage
Level C | Level D plus robustness, data/control coupling and 100% statement coverage (SC)
Level B | Level C plus 100% decision coverage (DC)
Level A | Level B plus 100% modified condition/decision coverage (MC/DC) and verification of source-to-binary correlation

Table 2: DO-178C software verification coverage requirements by Design Assurance Level (DAL)
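The gap between statement and decision coverage is easy to see on an if-statement with no else branch: one test can execute every statement while exercising only one outcome of the decision. The snippet below (a hand-instrumented illustration, not a real coverage tool; the function and test values are invented) shows why a second test is needed for decision coverage:

```python
# Hand-instrumented illustration: record which outcomes each decision takes
# while the tests run.
outcomes = set()

def clamp(x, limit):
    taken = x > limit            # the only decision in the function
    outcomes.add(("D1", taken))
    if taken:
        x = limit                # no else branch: one passing test can
    return x                     # execute every statement

clamp(10, 5)                     # single test: 100% statement coverage...
print(outcomes)                  # ...but only the True outcome of D1 seen

clamp(3, 5)                      # second test drives the False outcome,
print(outcomes)                  # completing decision coverage
```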

The standard defines each type of coverage. Statement coverage (SC), for example, is defined as verification that every statement in the program has been invoked at least once. Table 3 summarizes the criteria for each type of coverage defined by DO-178C.

Coverage criterion                                                                           | SC | DC | MC/DC
Every statement in the program has been invoked at least once                                | ✓  |    |
Every point of entry and exit in the program has been invoked at least once                  |    | ✓  | ✓
Every decision in the program has reached all possible outcomes at least once                |    | ✓  | ✓
Every condition in a decision in the program has reached all possible outcomes at least once |    |    | ✓
Every condition in a decision has been shown to independently affect that decision's outcome |    |    | ✓

Table 3: Criteria for coverage as defined in DO-178C
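The last MC/DC criterion, showing that each condition independently affects the decision, is demonstrated with pairs of tests that differ in exactly one condition yet flip the decision's outcome. The sketch below (the decision `(a and b) or c` is invented for illustration) searches for such an independence pair for each condition by brute force:

```python
from itertools import product

def decision(a, b, c):
    # Example decision, invented for illustration.
    return (a and b) or c

conds = ("a", "b", "c")

def independence_pairs(fn):
    """For each condition, find two test vectors that differ only in that
    condition and produce different decision outcomes (the MC/DC
    independence demonstration)."""
    pairs = {}
    for i, name in enumerate(conds):
        for v in product([False, True], repeat=len(conds)):
            w = list(v)
            w[i] = not w[i]          # flip only this condition
            w = tuple(w)
            if fn(*v) != fn(*w):     # outcome flipped -> independence shown
                pairs[name] = (v, w)
                break
    return pairs

pairs = independence_pairs(decision)
print(pairs["c"])   # flipping only c flips the outcome of (a and b) or c
```

Because each pair reuses vectors, a minimal MC/DC test set for a decision with N conditions typically needs only about N+1 tests, which is why MC/DC is considered tractable even for Level A software.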

Verification documentation

The verification documentation criteria in DO-178C include the following:

  • Software verification cases and procedures (SVCP)
  • Software verification results (SVR)
  • Reports on reviews of all requirements, design and code
  • Executable object code testing results
  • Code coverage analysis report

Final thoughts

DO-178C was developed by industry professionals with little government oversight. As such, it was intended to be both practical and cost-effective. It was designed to be flexible – in that it can be applied to virtually any development model – and to make airborne software as reliable as can be reasonably expected.

Because its verification process and objectives become increasingly rigorous at higher levels of safety criticality, however, it behoves organizations to plan their development and verification carefully, if they are required to follow DO-178C.

[1] The standard does not further define independence, as DO-178C limits itself to what must be done in developing airborne software and offers little information on how to do it. Fortunately, the international Certification Authorities Software Team (CAST) does clarify these boundaries in its position paper on verification independence, CAST-26.