Explaining the difference between Execution Times and Response Times

2010-09-16

Execution time and response time are two concepts that are sometimes conflated. In this post, I will make the distinction between the two clear and explain why both concepts are important in real-time embedded systems design.

First, some terminology. A task is a piece of code that runs within a single thread of execution. A task issues a sequence of jobs, which are queued and executed by the processor.

The time a job spends actively using processor resources is its execution time. The execution times of different jobs from the same task are likely to differ.

Common sources of variation are data-dependent paths (the path taken through the code depends on input data) and hard-to-predict hardware features such as branch prediction, instruction pipelining and caches.
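To make this concrete, here is a small, hypothetical C routine (invented for this post, not taken from any real code base) whose path, and therefore execution time, depends on its input data; hardware effects such as caching then add further variation on top of whichever path is taken:

```c
#include <stdio.h>

/* Hypothetical routine: the path taken, and hence the execution time,
 * depends on the input data. */
static int process_sample(int sample, const int *table, int n)
{
    if (sample < 0)
        return 0;                  /* short path: almost no work */

    int sum = 0;
    for (int i = 0; i < n; i++)    /* long path: work grows with n */
        sum += table[i] * sample;
    return sum;
}

int main(void)
{
    int table[4] = { 1, 2, 3, 4 };
    printf("%d\n", process_sample(-5, table, 4));  /* short path */
    printf("%d\n", process_sample( 3, table, 4));  /* long path  */
    return 0;
}
```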

The response time of a job is the time between the moment it becomes active (e.g. an external event or timer triggers an interrupt) and the moment it completes. Several factors can cause a job's response time to be longer than its execution time; Figure 1 shows some of these:

Figure 1: Scheduling behavior of an RTOS (execution times vs response times)

In Figure 1, jobs in the queue are scheduled using fixed priority pre-emptive scheduling; execution and response times are shown for the medium priority job. The real-time operating system (RTOS) scheduler always selects the highest priority job that is ready to run. A job is suspended if a higher priority one becomes active, and resumes after all higher priority jobs have completed.

A lower priority job can also prevent a job from running if it locks a shared resource before the higher priority job does; this is called priority inversion. RTOS overheads for context switches and pre-emptions will also delay a job, although these may be very small with appropriate hardware support. Release jitter caused by insufficient clock granularity is another source of delay (not shown in Figure 1) [1].
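To make the distinction concrete, the sketch below computes both quantities for a single job from an invented timeline of activation, execution and completion times (hypothetical values, not output from any tool). The response time spans the whole interval from activation to completion, while the execution time only accumulates the intervals in which the job actually held the processor:

```c
#include <stdio.h>

/* One interval during which the job was actually executing on the CPU. */
struct run_interval { unsigned start; unsigned end; };

int main(void)
{
    /* Hypothetical trace for a medium priority job (times in microseconds). */
    unsigned activation = 100;   /* job becomes active (e.g. interrupt fires) */
    unsigned completion = 460;   /* job finishes                              */
    struct run_interval runs[] = {
        { 180, 250 },            /* first slice, after being blocked by a     */
        { 340, 460 },            /* lower priority job; pre-empted in between */
    };

    unsigned exec_time = 0;
    for (unsigned i = 0; i < sizeof runs / sizeof runs[0]; i++)
        exec_time += runs[i].end - runs[i].start;

    unsigned response_time = completion - activation;

    printf("execution time: %u us\n", exec_time);      /* 190 us */
    printf("response time:  %u us\n", response_time);  /* 360 us */
    return 0;
}
```

In this invented trace the job executes for 190 µs in total but takes 360 µs to respond, because it is first blocked by a lower priority job holding a lock and then pre-empted by a higher priority one.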

Both execution times and response times are of interest to real-time systems designers, usually in the context of worst-case execution times (WCETs) and worst-case response times (WCRTs). High level system requirements will specify a maximum response time for a task, known as its deadline. WCRTs are calculated using response time analysis, which takes WCETs and a scheduling policy as inputs. This may lead to execution time budgets and a scheduling policy being derived as lower level requirements.
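As an illustration of what response time analysis involves, the sketch below iterates the classic recurrence for fixed priority pre-emptive scheduling, R = C + B + sum over higher priority tasks j of ceil(R/T_j)*C_j, until it reaches a fixed point. The task set and blocking values are invented for the example; a real analysis would also account for factors such as release jitter and RTOS overheads:

```c
#include <stdio.h>

/* One task: worst-case execution time C, period T and blocking time B,
 * with tasks listed in descending priority order (index 0 = highest). */
struct task { unsigned C, T, B; };

/* Classic response time recurrence for fixed priority pre-emptive
 * scheduling: iterate R = C + B + sum_{j < i} ceil(R / T_j) * C_j
 * until it converges or exceeds the deadline (taken here as the period). */
static unsigned wcrt(const struct task *ts, unsigned i)
{
    unsigned R = ts[i].C + ts[i].B;
    for (;;) {
        unsigned next = ts[i].C + ts[i].B;
        for (unsigned j = 0; j < i; j++)
            next += ((R + ts[j].T - 1) / ts[j].T) * ts[j].C;  /* ceil(R/T_j)*C_j */
        if (next == R || next > ts[i].T)
            return next;  /* converged, or deadline missed */
        R = next;
    }
}

int main(void)
{
    /* Invented example task set (times in milliseconds). */
    struct task ts[] = { { 2, 10, 0 }, { 4, 20, 1 }, { 6, 50, 0 } };
    for (unsigned i = 0; i < 3; i++)
        printf("task %u: WCET %u ms, WCRT %u ms\n", i, ts[i].C, wcrt(ts, i));
    return 0;
}
```

For this invented task set the computed WCRTs are 2, 7 and 14 ms, each below the corresponding period, so all three tasks meet their deadlines according to the analysis.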

Being able to measure response times and execution times individually is important. If response times are measured but execution times are not, it is not possible to perform worst-case response time analysis, which runs the risk of the system missing a deadline because a particular combination of job sequence and job execution times was not encountered during testing. Response time measurements are still useful, however, for knowing how close jobs come to missing their deadlines.

And finally… in certain circumstances an increase in execution time may even lead to a decrease in response time! Using Figure 1 once again, imagine that a previous job of the high priority task ran until just after the activation of the medium priority task. The low priority task that blocked the medium priority one would not be allowed to execute, allowing the medium priority task to both start and complete earlier. We suggest Baruah and Burns' paper on sustainable scheduling analysis [2] as further reading.


[1] Neil Audsley, Iain Bate and Alan Burns, "Putting fixed priority scheduling theory into engineering practice for safety critical applications", In Proceedings of the 2nd IEEE Real-Time Technology and Applications Symposium (RTAS '96), pages 2-10, 1996.

[2] Sanjoy Baruah and Alan Burns, "Sustainable Scheduling Analysis", In Proceedings of the 27th IEEE International Real-Time Systems Symposium (RTSS 2006), pages 159-168, 2006.
