diff --git a/README.md b/README.md index 8a2fe5b..aa187b0 100644 --- a/README.md +++ b/README.md @@ -1,17 +1,21 @@ -TimedExec -========= +% ![](img/timedexec/banner.jpg) +TimedExec – README +% by LoRd_MuldeR <> | -**TimedExec** is a small utility for *benchmarking* command-line programs. It will *execute* the specified program with the specified command-line arguments and then *measure* the time that it takes for the program execution to complete. In order to obtain *accurate* results, all measurements are implemented via *high-resolution* performance timers. And, since program execution times unavoidably are subject to variations (due to environmental noise), each test will be repeated *multiple* times. The number of metering passes can be configured as desired. TimedExec will then compute the *mean* execution time of all passes. It will also record the *fastest* and *slowest* execution time. Furthermore, TimedExec computes the *standard error* as well as the *confidence interval* from the benchmarking results. This is the *range* which contains the program's actual (mean) execution time, *with very high probability*. Last but not least, an optional number of "warm-up" passes can be performed *prior to* the first metering pass. The warm-up passes prevent caching effects from interfering with the execution times. Note that all benchmarking results will be saved to a log file. +Introduction +============ + +**TimedExec** is a small utility for *benchmarking* command-line programs. It will *execute* the specified program with the specified command-line arguments and then *measure* the time that it takes for the execution to complete. In order to obtain *accurate* results, all measurements are implemented via *high-resolution* performance timers. And, since program execution times are unavoidably subject to certain variations (e.g. due to environmental noise), each test will be repeated *multiple* times. The number of metering passes can be configured as desired. 
Optionally, a number of "warm-up" passes can be performed *prior to* the first metering pass. The warm-up passes prevent caching effects from interfering with the execution times. + +TimedExec will then compute the ***mean*** execution time as well as the ***median*** execution time of all metering passes. It will also record the *fastest* and *slowest* execution times that have been measured. Furthermore, TimedExec computes the *standard error* in order to determine ***confidence intervals*** from the benchmarking results^[[Konfidenzintervalle so einfach wie möglich erklärt](http://www.uni-siegen.de/phil/sozialwissenschaften/soziologie/mitarbeiter/ludwig-mayerhofer/statistik/statistik_downloads/konfidenzintervalle.pdf)]. These are the *ranges* which contain the program's “real” average execution time (expected value), *with very high probability*. All results will be saved to a log file. Usage Instructions ------------------- +================== -### Command-line Syntax ### +*TimedExec* uses a very simple command-line syntax. Just type **`TimedExec`**, followed by the program that you want to benchmark. Optionally, any number of arguments can be appended; these parameters will be passed to the program. -TimedExec uses a very simple command-line syntax. Just type "TimedExec", followed by the program that you want to benchmark, followed by the desired arguments. - -*Note:* Some parameters that influence the behaviour of TimedExec can be set via environment variables. Must be set *before* running the application. +***Note:*** Some options that influence the behavior of TimedExec can be controlled via environment variables. 
``` =============================================================================== @@ -33,15 +37,20 @@ Influential environment variables: TIMED_EXEC_NO_CHECKS - Set this to *disable* exit code checks ``` -### Usage Example ### +Usage Example +------------- -In the following example we use *TimedExec* to benchmark the program **ping.exe** with the arguments **-n 12 www.google.com**. The command will be executed ten times, by default: +In the following example we use *TimedExec* to benchmark the program **`ping.exe`** with the arguments **`-n 12 www.google.com`**. By default, the command will be executed *five* times, preceded by a single "warm-up" pass: ``` TimedExec.exe C:\Windows\System32\ping.exe -n 12 www.google.com ``` -The resulting output, after all ten passes have been completed, may look like this: + +Results +======= + +The resulting output, after all metering passes have been completed, looks like this: ``` =============================================================================== TEST COMPLETED SUCCESSFULLY AFTER 5 METERING PASSES @@ -57,8 +66,18 @@ Fastest / Slowest Pass : 11.253 / 11.270 seconds =============================================================================== ``` +Interpretation +-------------- + +When comparing measurement results, the ***mean*** (average) execution time may seem like the most obvious choice. However, it has to be noted that the *mean* of a data sample is highly sensitive to “outliers” and therefore can be misleading! This is especially true when there is a lot of variation in the data sample. Consequently, comparing the ***median*** execution times is usually the preferable choice. That is because the *median* of a data sample is much more robust against outliers. + +Furthermore, it is important to keep in mind that the *mean* (or *median*) execution time computed from a limited number of metering passes only yields an ***estimate*** of the program's “real” average execution time (expected value). 
The “real” value can only be determined accurately from an *infinite* number of metering passes – which is **not** possible in practice. In this situation, we can have a look at the ***confidence intervals***. These intervals contain the “real” value, *with very high probability*. The most commonly used *confidence interval* is the “95%” one (higher confidence means broader interval, and vice versa). + +Simply put, as long as the confidence intervals of program A and program B *overlap* (at least partially), we **must not** conclude that either of these programs runs faster (or slower) in the average case. ***No*** conclusion can be drawn in that case! + + Sources -------- +======= The *TimedExec* source codes are managed by [**Git**](http://git-scm.com/doc) and are available from one of the official mirrors: @@ -66,8 +85,9 @@ The *TimedExec* source codes are managed by [**Git**](http://git-scm.com/doc) an * https://bitbucket.org/muldersoft/timedexec.git ([Browse](https://bitbucket.org/muldersoft/timedexec)) * https://gitlab.com/timedexec/timedexec.git ([Browse](https://gitlab.com/timedexec/timedexec)) + License -------- +======= TimedExec is released under the terms of the [GNU General Public License](http://www.gnu.org/licenses/gpl-2.0.html), version 2. diff --git a/img/Style.inc b/img/Style.inc deleted file mode 100644 index 3fc74f5..0000000 --- a/img/Style.inc +++ /dev/null @@ -1,12 +0,0 @@ - diff --git a/img/timedexec/banner.jpg b/img/timedexec/banner.jpg new file mode 100644 index 0000000..d662653 Binary files /dev/null and b/img/timedexec/banner.jpg differ
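As a note on the diff above: the statistics described in the new "Interpretation" section (mean, median, fastest/slowest pass, standard error, and confidence intervals) can be sketched in a few lines of Python. This is an illustrative sketch only, not TimedExec's actual implementation; the normal-approximation z-value of 1.96 for a ~95% interval and the `summarize`/`intervals_overlap` helper names are assumptions for this example, not taken from TimedExec's sources.

```python
import statistics
from math import sqrt

def summarize(times, z=1.96):
    """Summarize a list of per-pass timings (seconds), roughly as the
    README describes. z=1.96 approximates a 95% confidence interval
    under a normal approximation (an assumption of this sketch)."""
    mean = statistics.mean(times)
    # Standard error of the mean: sample stdev divided by sqrt(n).
    stderr = statistics.stdev(times) / sqrt(len(times))
    return {
        "mean": mean,
        "median": statistics.median(times),
        "fastest": min(times),
        "slowest": max(times),
        "stderr": stderr,
        "ci95": (mean - z * stderr, mean + z * stderr),
    }

def intervals_overlap(ci_a, ci_b):
    """Per the README's rule of thumb: if two programs' confidence
    intervals overlap (even partially), no conclusion about which
    one is faster on average may be drawn."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]
```

For example, feeding in five passes clustered around the 11.25-second mark from the sample output yields a mean and median near 11.26 s with a narrow interval; only when two programs' `ci95` ranges are disjoint does a faster/slower comparison become meaningful.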