From: Greg Chicares
Subject: [lmi-commits] [lmi] master 1a629bf: Report unit-test timings as minimum, not mean
Date: Thu, 11 May 2017 10:57:58 -0400 (EDT)
branch: master
commit 1a629bf470d3a880480edb94a955a3ae32c157eb
Author: Gregory W. Chicares <address@hidden>
Commit: Gregory W. Chicares <address@hidden>
Report unit-test timings as minimum, not mean
When the run time of a function is measured by calling it repeatedly,
the minimum is a better estimate than the mean or mode, as discussed
in the mailing-list message cited in the comments and in:
http://lists.nongnu.org/archive/html/lmi/2017-05/msg00012.html
Both minimum and mean are displayed, but only the minimum is returned.
With a microsecond timer, it didn't make sense to report nanoseconds.
Removed an assertion that a particular unit test takes more than zero
seconds per iteration, because it may take less than one microsecond.
---
timer.hpp | 30 ++++++++++++++++++++----------
vector_test.cpp | 1 -
2 files changed, 20 insertions(+), 11 deletions(-)
diff --git a/timer.hpp b/timer.hpp
index 4f3a6fa..3f36505 100644
--- a/timer.hpp
+++ b/timer.hpp
@@ -107,7 +107,9 @@ class LMI_SO Timer
/// Otherwise, discard the first observation (which is often anomalous
/// due, e.g., to cache effects); execute the operation repeatedly,
/// until the stopping criterion is satisfied; and then record the
-/// mean measured time. The stopping criterion is that both:
+/// minimum measured time, which is more useful than the mean--see:
+/// http://lists.nongnu.org/archive/html/lmi/2017-05/msg00005.html
+/// The stopping criterion is that both:
/// - at least one percent of the allotted time has been spent; and
/// - either all allotted time has been used, or the repetition count
/// has reached one hundred.
@@ -202,31 +204,39 @@ AliquotTimer<F>& AliquotTimer<F>::operator()()
}
Timer timer;
- double const limit = max_seconds_ * static_cast<double>(timer.frequency_);
+ double const dbl_freq = static_cast<double>(timer.frequency_);
+ double const limit = max_seconds_ * dbl_freq;
double const start_time = static_cast<double>(timer.time_when_started_);
double const expiry_min = start_time + 0.01 * limit;
double const expiry_max = start_time + limit;
+ elapsed_t minimum = limit;
int j = 0;
for
- (elapsed_t now = 0
+ (elapsed_t now = start_time, previous = start_time
;now < expiry_min || j < 100 && now < expiry_max
- ;++j, now = timer.inspect()
+ ;++j
)
{
f_();
+ previous = now;
+ now = timer.inspect();
+ if(now - previous < minimum)
+ {
+ minimum = now - previous;
+ }
}
timer.stop();
- unit_time_ = timer.elapsed_seconds() / j;
+ unit_time_ = minimum / dbl_freq;
std::ostringstream oss;
oss
<< std::scientific << std::setprecision(3)
- << unit_time_
- << " s = "
+ << timer.elapsed_seconds() / j
+ << " s mean; "
<< std::fixed << std::setprecision(0)
- << std::setw(10) << 1.0e9 * unit_time_
- << " ns, mean of "
+ << std::setw(10) << 1.0e6 * unit_time_
+ << " us least of "
<< std::setw( 3) << j
- << " iterations"
+ << " runs"
;
str_ = oss.str();
diff --git a/vector_test.cpp b/vector_test.cpp
index cc0ddcd..b2a7a17 100644
--- a/vector_test.cpp
+++ b/vector_test.cpp
@@ -329,7 +329,6 @@ void time_one_array_length(int length)
BOOST_TEST_EQUAL(g_w [n], 2.0 * n);
double const va = TimeAnAliquot(mete_va, max_seconds).unit_time();
BOOST_TEST_EQUAL(g_va_w[n], 2.0 * n);
- LMI_ASSERT(0.0 != c);
std::cout
<< std::setw( 7) << g_array_length
<< std::setw(15) << std::setprecision(3) << std::scientific << c