Updated README file.

LoRd_MuldeR 2018-12-10 21:49:09 +01:00
parent c1e820fb04
commit 240b117c62
3 changed files with 33 additions and 25 deletions


@@ -1,17 +1,21 @@
% ![](img/timedexec/banner.jpg)
TimedExec – README
% by LoRd_MuldeR &lt;<mulder2@gmx>&gt; | <http://muldersoft.com/>
Introduction
============
**TimedExec** is a small utility for *benchmarking* command-line programs. It will *execute* the specified program with the specified command-line arguments and then *measure* the time that it takes for the execution to complete. In order to obtain *accurate* results, all measurements are implemented via *high-resolution* performance timers. And, since program execution times are unavoidably subject to certain variations (e.g. due to environmental noise), each test will be repeated *multiple* times. The number of metering passes can be configured as desired. Optionally, a number of "warm-up" passes can be performed *prior to* the first metering pass. The warm-up passes prevent caching effects from interfering with the execution times.
TimedExec will then compute the ***mean*** execution time as well as the ***median*** execution time of all metering passes. It will also record the *fastest* and the *slowest* execution time that was measured. Furthermore, TimedExec computes the *standard error* in order to determine ***confidence intervals*** from the benchmarking results^[[Confidence intervals explained as simply as possible (German)](http://www.uni-siegen.de/phil/sozialwissenschaften/soziologie/mitarbeiter/ludwig-mayerhofer/statistik/statistik_downloads/konfidenzintervalle.pdf)]. These are the *ranges* that contain the program's “real” average execution time (expected value), *with very high probability*. All results will be saved to a log file.
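As a sketch of the underlying statistics, assuming approximately normally distributed measurements and the commonly used 95% confidence level: with $n$ metering passes, sample mean $\bar{x}$, and sample standard deviation $s$, the standard error and the confidence interval are

$$\mathrm{SE} = \frac{s}{\sqrt{n}} \qquad\qquad \mathrm{CI}_{95\%} = \bar{x} \pm 1.96 \cdot \mathrm{SE}$$

The interval shrinks with $\sqrt{n}$, so performing more metering passes narrows the range that contains the “real” execution time.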
Usage Instructions
==================
### Command-line Syntax ###
*TimedExec* uses a very simple command-line syntax. Just type **`TimedExec`**, followed by the program that you want to benchmark. Optionally, any number of arguments can be appended; these parameters will be passed to the program.
***Note:*** Some options that influence the behavior of TimedExec can be controlled via environment variables. These variables must be set *before* the application is started.
```
===============================================================================
@@ -33,15 +37,20 @@ Influential environment variables:
TIMED_EXEC_NO_CHECKS - Set this to *disable* exit code checks
```
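For example, assuming a Windows `cmd` session, the exit-code checks could be disabled like this before starting the benchmark (the benchmarked program below is just a placeholder):

```
set TIMED_EXEC_NO_CHECKS=1
TimedExec.exe C:\path\to\program.exe --argument
```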
### Usage Example ###
In the following example we use *TimedExec* to benchmark the program **`ping.exe`** with the arguments **`-n 12 www.google.com`**. By default, the command will be executed *five* times, preceded by a single "warm-up" pass:
```
TimedExec.exe C:\Windows\System32\ping.exe -n 12 www.google.com
```
Results
=======
The resulting output, after all metering passes have been completed, may look like this:
```
===============================================================================
TEST COMPLETED SUCCESSFULLY AFTER 5 METERING PASSES
@@ -57,8 +66,18 @@ Fastest / Slowest Pass : 11.253 / 11.270 seconds
===============================================================================
```
Interpretation
--------------
When comparing measurement results, the ***mean*** (average) execution time may seem like the most obvious choice. However, it has to be noted that the *mean* of a data sample is highly sensitive to “outliers” and therefore can be misleading! This is especially true when there is a lot of variation in the data sample. Consequently, comparing the ***median*** execution times is usually the preferable choice, because the *median* of a data sample is much more robust against outliers.
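As a hypothetical illustration, consider five metering passes where the last pass was delayed by some background activity:

$$\bar{x} = \frac{10.1 + 10.2 + 10.2 + 10.3 + 25.0}{5} = 13.16 \qquad\qquad \tilde{x} = 10.2$$

The single outlier inflates the *mean* by roughly 30%, while the *median* remains unaffected.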
Furthermore, it is important to keep in mind that the *mean* (or *median*) execution time computed from a limited number of metering passes only yields an ***estimate*** of the program's “real” average execution time (expected value). The “real” value can only be determined accurately from an *infinite* number of metering passes &ndash; which is **not** possible in practice. In this situation, we can have a look at the ***confidence intervals***. These intervals contain the “real” value, *with very high probability*. The most commonly used *confidence interval* is the “95%” one (higher confidence means a broader interval, and vice versa).
Simply put, as long as the confidence intervals of program A and program B *overlap* (at least partially), we **must not** conclude that either of these programs runs faster (or slower) in the average case. ***No*** conclusion can be drawn in that case!
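A hypothetical numeric example of this rule: suppose program A was measured with a mean of 11.26 seconds and program B with a mean of 11.31 seconds, with the following 95% confidence intervals:

$$\mathrm{CI}^{A}_{95\%} = 11.26 \pm 0.08 = [11.18,\, 11.34] \qquad\qquad \mathrm{CI}^{B}_{95\%} = 11.31 \pm 0.07 = [11.24,\, 11.38]$$

The two intervals overlap in the range $[11.24,\, 11.34]$, so these measurements alone do not justify the claim that either program is faster in the average case.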
Sources
=======
The *TimedExec* source code is managed by [**Git**](http://git-scm.com/doc) and is available from one of the official mirrors:
@@ -66,8 +85,9 @@ The *TimedExec* source codes are managed by [**Git**](http://git-scm.com/doc) an
* <tt>https://bitbucket.org/muldersoft/timedexec.git</tt> ([Browse](https://bitbucket.org/muldersoft/timedexec))
* <tt>https://gitlab.com/timedexec/timedexec.git</tt> ([Browse](https://gitlab.com/timedexec/timedexec))
License
=======
TimedExec is released under the terms of the [GNU General Public License](http://www.gnu.org/licenses/gpl-2.0.html), version 2.


@@ -1,12 +0,0 @@
<style type="text/css">
<!--
body { font-family: "Times New Roman", Times, serif; color: #000000; background-color: #FFFFFF; }
tt, pre, code { font-family: Courier New, Courier, mono; background-color: #EDF3F7; padding: 1px; }
h2 { margin-top: 2.0em; }
h3, h4 { margin-top: 1.75em; }
a { color: #0000BB; text-decoration: none; }
a:visited { color: #0000BB; text-decoration: none; }
a:active { color: #0000FF; text-decoration: none; }
a:hover { color: #0000FF; text-decoration: underline; }
-->
</style>

img/timedexec/banner.jpg (new binary file, 42 KiB; not shown)