
Generating low level tests from system tests

Jonny Woodrow
2020-07-07

In software development, system testing can identify issues with the software, but is often an expensive process. We've been working on a tool that lets you capture data from system tests to automatically generate unit tests that test software functions in an equivalent way. This could be used to minimize the engineering effort needed to test software functionality across the software development life cycle.

Software system tests are often used to identify issues with the tested software. These tests, which often involve manual processes by test engineers, are effort-intensive to run. Lower level tests such as unit tests typically require less effort to run, but often must be written by hand.

We've been working on a tool that automatically generates unit tests that test software in an equivalent way to system tests. Because the generated unit tests are easier to run, you can save a huge amount of time by not needing to rerun your system tests every time you change your code.

How does it work?

To capture values from system tests, RVS statically analyzes the source code and instruments it so that it can observe the inputs and outputs of functions, whether they are parameters or global variables. This allows it to follow pointers and unroll any arrays and structures needed to understand the program's behavior. It can even detect 'volatile' results (such as the creation of a file handle or a malloc pointer).

Then, when the system test is run, RVS captures significant events or "highlights" that you can play back as stand-alone unit tests (Figure 1). A system test may observe dozens of significant events for a specific function over the course of minutes or even hours, while the generated unit tests replay those events back-to-back, significantly reducing the time needed to test them.

Figure 1. Capturing highlights from system tests to generate unit tests

After capturing data from the system test, you can select functions for which to create unit tests. RVS will then look at the captured system test data including observed input and output parameters for the function, global state changes and the call-stack, and will generate RapiTest unit tests that mimic the behavior observed during the system test.

When generating tests for a function, you can select which of the observed values to use as criteria for generating new unique tests. For example, you can configure tests to be automatically generated based on differences in coverage, the data being fed to the function, the global state, or the underlying call-stack of the function.

The tests generated by RapiTest can then be run in an automated environment. Figure 2 shows part of a test spreadsheet generated from system tests, where the criterion for each "unique" test was the data passed to the function under test.

Figure 2. Values of arrays, pointers and structure elements captured by the system to unit converter

Selecting acceptance criteria

For a test to be meaningful, it must include acceptance criteria. These depend on the test, the application, and the environment in which the test is run. Some tests pass merely if the function under test exits without error, while others have much more complex criteria.

When capturing the data for a function, RVS observes the inputs, outputs, global state and call-stack of functions, and their timing and coverage behavior. When generating unit tests for a function, you can select which of the observed values to use as acceptance criteria for your tests. In testing terms, this could be considered "oracle" testing: if the system test passed when a given input produced the observed output, the generated unit test checks that the same inputs keep producing the same outputs on every run.

Figure 3 gives an example of automatically generated test acceptance criteria. This test automatically checks changes in values when the test function is executed with a specific input pattern.

Figure 3. Generating unit tests from captured values

Running the tests

While the unit tests generated by RapiTest are not traceable to requirements, they could let you take that three-hour system test that you have to run every release, extract the "best bits" of it to generate a few milliseconds' worth of unit tests, and run them as regression tests on a continuous integration server. This would let you, for example, set up regression tests for software functionality you've already completed. If any changes to your system cause your software's functionality to change, your regression tests would automatically flag these for your attention, letting you avoid the risk of nasty surprises towards the end of your testing cycle.

This technology offers a flexible approach to testing throughout development and could be used in many ways. It can be combined with other test-generation facilities such as bounds checking, fuzz testing and auto-generation for coverage (up to MC/DC) to provide a robust verification layer before formal testing even begins, greatly reducing the cost of testing.

We plan to release this technology officially in a future version of RVS. In the meantime, if you want more information about it, or have any ideas on how you would use the technology, contact us.
