I've got a problem (yes, Samik, but it's ok, we love you anyway) slash question.
I've recently joined a gym and taken my running indoors. One thing I've noticed about my gym's treadmills is that, despite the nearly 20 seconds it takes for the treadmill to ramp up to the full speed I'm asking of it at the beginning of my runs (presumably costing me approximately 10 seconds on my times), the times it gives me at the end of my runs are always exactly what they should be if there were no run-up at all.
For example, today I did:
(1.5mi @8.8mph) + (0.25mi @9.3mph) + (0.25mi @10.1mph)
Assuming zero time for acceleration (and that I don't suck at math) this is:
(90/8.8) + (15/9.3) + (15/10.1) = ~13.325 minutes, or ~13m19.5s
Lo and behold, as my run ticked over to 2 miles and I pressed stop, what did the time read? 13:20 on the dot.
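For anyone who wants to double-check that arithmetic, here's a quick Python sketch of the split-time sum (the distances and speeds are the ones from my run above):

```python
# Each segment is (distance in miles, speed in mph);
# time in minutes is distance / speed * 60.
segments = [(1.5, 8.8), (0.25, 9.3), (0.25, 10.1)]

total_min = sum(dist / mph * 60 for dist, mph in segments)
minutes = int(total_min)
seconds = (total_min - minutes) * 60
print(f"{minutes}m{seconds:.1f}s")  # 13m19.5s
```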
Originally, I was assuming that the treadmill was just running at a tiny fraction of a mile an hour faster than indicated, to make up that lost time at the beginning. But then I realized that isn't realistic.
For example, today I looked down at 1.03 miles to see a time of 7:03 on the clock, i.e. approximately a pace of 6:51 per mile (give or take a couple of seconds, depending on where I was in that hundredth of a mile). Well, since I ran that first 1.03 miles at 8.8mph, or a pace of 6:49/mi, the treadmill would have had to make up the entire difference within the first mile (at the very most).
To do that, it would have had to be running at at least 9mph instead of 8.8. I find it pretty unlikely that the machine would operate at a speed that far off from what was requested over any significant distance. (Not to mention that, if that were really what was going on, I'd certainly feel a speed change of that magnitude once it made up the time and settled back down to the pace I told it to run.)
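Putting a number on that (just a back-of-the-envelope sketch, assuming the full ~10 seconds gets recovered within the first mile):

```python
# Time to cover 1 mile at the displayed 8.8 mph, in seconds.
true_time_s = 3600 / 8.8        # ~409.1 s

# To finish that mile 10 seconds "early", the belt would need:
needed_mph = 3600 / (true_time_s - 10)
print(f"{needed_mph:.2f} mph")  # 9.02 mph
```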
The other possibility is that it is just fudging the numbers to give you the benefit of the doubt. To gain those ten seconds back within a mile, all the machine would have to do is tell you you had run 1.000 miles when you'd really only run 0.975 miles. Chopping off two and a half hundredths of a mile would hardly be noticeable, I assume.
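The same kind of quick math for the number-fudging theory (again just a sketch, using my 8.8mph first segment and the ~10 second spin-up):

```python
# Distance "lost" during ~10 seconds of spin-up at 8.8 mph:
fudge_mi = 8.8 * 10 / 3600      # ~0.0244 mi

# So when the display says 1.000 miles, you'd actually have run:
actual_mi = 1.000 - fudge_mi
print(f"{actual_mi:.3f} mi")  # 0.976 mi
```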
At the end of the day it hardly matters: if having all my times be 10 seconds slower than I thought really bothered me, I could just run 10 seconds longer at the end of my run, when the belt is already up to speed, and consider my (for example) 2 mile run to go from 0.025 to 2.025 miles instead of 0 to 2.
I'm just personally curious if anyone knows how treadmills are usually programmed to deal with this sort of thing. Are there other possibilities I'm not considering? Either way, there is, beyond doubt, some sort of chicanery going on within that thing's innards.
Plus, I'm pretty sure that most other treadmills I've ever run on just make you eat the lost time from the beginning (to hit a given target, I've always had to set the mill a tenth or two of a mile per hour faster than should be necessary).