Measuring Performance for Real-Time Systems

Real-time computer systems are all around us, in applications large and small. The control and monitoring system of a nuclear power plant is an example of a real-time system. So is a pacemaker, or a real-time first-person video game. What makes these systems unique is that timing determines whether they operate correctly or not. In the examples above, the timings are critical. If the nuclear control and monitoring system does not respond to meltdown conditions quickly enough, the consequences are catastrophic. If a pacemaker does not respond to changing conditions fast enough, the consequences are catastrophic for the patient. For the gamer, a system that does not respond in a timely manner normally means a lost life in the virtual world.

Okay, so the gamer losing a virtual life is not so critical. Or is it? If the game does not meet the performance expectations of its customer base, then customers will choose not to use it. Therefore, the real-time characteristics of the game, which are essential for customer satisfaction, become business-critical.

So time is an essential part of the definition of a real-time computer system, and software execution performance becomes all-important. Performance refers to the response time or throughput as seen by the users. But how do you build performance into an application? For that matter, how do you test the performance of an application? What about when your application works correctly, but not quickly enough?
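To make that question concrete, consider how a developer might take a first, crude measurement of response time. The sketch below simply brackets a section of code with high-resolution timestamps; it assumes a POSIX environment, and process_sensor_data() is a hypothetical stand-in for the work being measured, not a function from this paper.

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical stand-in for the code path whose response time is of interest. */
    static void process_sensor_data(void)
    {
        /* ... application work ... */
    }

    int main(void)
    {
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);   /* timestamp before the work */
        process_sensor_data();
        clock_gettime(CLOCK_MONOTONIC, &end);     /* timestamp after the work  */

        long long elapsed_ns = (long long)(end.tv_sec - start.tv_sec) * 1000000000LL
                             + (end.tv_nsec - start.tv_nsec);
        printf("Response time: %lld ns\n", elapsed_ns);
        return 0;
    }

Even this trivial example exposes the core difficulty: the measurement code itself consumes time on the target, and the result varies from run to run, so repeated measurements and a worst-case view are needed.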

Real-time systems present unique challenges to the definition, development and testing of software. This is the focus of this paper.

There are many issues surrounding the development of real-time systems, and this paper aims to identify them and to discuss possible solutions. The solutions discussed are based primarily on software performance measurement techniques. For multi-tasking applications, however, no measured metric alone can guarantee performance. In this realm, the paper discusses the use of algebraic prediction methods such as Rate Monotonic Analysis and investigates how the accuracy of these techniques is affected by the quality of the metrics obtained with the performance measurement techniques discussed.
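As a brief preview of the kind of algebraic test involved, the simplified sketch below applies the classic Liu and Layland utilization bound used in Rate Monotonic Analysis: n independent periodic tasks, scheduled with rate monotonic priorities, are guaranteed to meet their deadlines if the sum of Ci/Ti (worst-case execution time divided by period) does not exceed n(2^(1/n) - 1). The execution times and periods here are invented example figures; in practice, the Ci values are exactly what performance measurement supplies.

    #include <math.h>
    #include <stdio.h>

    /* Example task set (hypothetical figures): worst-case execution
     * times C and periods T, both in microseconds.                  */
    static const double exec_us[]   = {  1000.0,  2500.0,  5000.0 };
    static const double period_us[] = { 10000.0, 25000.0, 50000.0 };

    int main(void)
    {
        const int n = sizeof(exec_us) / sizeof(exec_us[0]);
        double utilization = 0.0;

        for (int i = 0; i < n; i++)
            utilization += exec_us[i] / period_us[i];

        /* Liu & Layland bound: U <= n * (2^(1/n) - 1) guarantees that all
         * deadlines are met under rate monotonic priority assignment.    */
        double bound = n * (pow(2.0, 1.0 / n) - 1.0);

        printf("Utilization = %.3f, bound = %.3f: %s\n",
               utilization, bound,
               utilization <= bound ? "guaranteed schedulable"
                                    : "not guaranteed by this test");
        return 0;
    }

Note that this bound is sufficient but not necessary: a task set that fails the test may still be schedulable. More detailed analysis, and the accuracy of the measured execution times fed into it, is where the techniques discussed in this paper come in.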

Starting from the perspective of the developer, the paper describes the principles of Software Performance Engineering and how they may be employed to build performance into an application. For applications that have already passed this stage, it then describes some of the aspects of software design and implementation that affect performance, such as multi-tasking and dynamic memory usage, and how the resulting problems may be resolved.

Throughout this paper, the hardware-assisted analysis methods utilized by the Freescale Semiconductor CodeTEST® product are used as an example of how measurements pertaining to software performance may be made.
