Oh dear, I see that I ninja'd @Link on the AUTOMOME developments. Great minds code alike, and suffice it to say, we've both been encouraged by each other's progress.
I have finished implementing a REDUNDANT version of AUTOMOME. Like the original by @Link, my version is a nice friendly webpage with a Cuegan at the top that you can click, and links at the bottom for text-only output. Unlike @Link's version, mine is written in Perl, and the vocabulary and snowclone templates are all in a single datafile. To remind you that you're on a different AUTOMOME, the Cuegan are different.
Other AUTOMOME changes:
- Added AMTOO, BEANETTE, and ROSETTA, and doubled the number of CUEGAN names to make them come up more often
- Changed SH*T to M*STARD in one place
- "THIS IS PHOTOSHOPPED" snowclone now substitutes other perfect-tense verbs for PHOTOSHOPPED (previously only adjectives were used, and the original AUTOMEME is that way too, but I see no reason why)
karhell wrote: So out of curiosity and maybe a teeny bit of procrastination >.>, I investigated the previously reported error ...
Great detective work, @karhell! If you look in the source code of the script, you'll find a large block comment right above the bananascanner function, showing examples of the HTML. The avatar-less example isn't there yet, but I'll add one.
There are similar block comments in no-tinytext.user.js and spoiler-opener.user.js, which I added while figuring out the format of [size] and [spoiler] tags, respectively.
macraw83 wrote: How is that "posts/hr" column calculated?
It is explained in the introductory paragraphs of that page. In particular I call your attention to the parts that say "All statistics are 'decaying averages' with a 24-hour half-life" and "the stats are only recomputed each hour, so they repeat for two or more lines when there is more than one Newpage per hour."
By "decaying average" I actually mean exponential smoothing
with a smoothing factor chosen such that the half-life is 24 hours: Each datum's weight decreases by a factor of 2 every 24 newpix.
The average is recalculated once per hour:

s_t = α*R + (1-α)*s_{t-1}

where R is the post rate for the last Newpage. To achieve a 24-hour half-life, the value α = 1/34 ≈ ln(2)/24 is used.
Since the average is updated once per hour, and Newpages sometimes take more than one hour to complete, the average is often recomputed more than once per Newpage. Nevertheless, the average is only printed to ott-rate-stats.txt once per Newpage.
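In code the whole thing is one line per hour. Here's a minimal Perl sketch of the update (not the actual script; the variable and function names are invented for illustration):

[code]
my $alpha = 1/34;   # smoothing factor, chosen for a ~24-hour half-life
my $avg   = 0;      # the decaying average, in posts/hr

# Called once per hour with the post rate observed that hour.
sub update_rate {
    my ($rate) = @_;
    $avg = $alpha * $rate + (1 - $alpha) * $avg;
    return $avg;
}
[/code]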
macraw83 wrote: NP1254-NP1258 (5 distinct NP) happened on 20130726.16:xx:xx, meaning that at least 200 posts happened within a "1 hour" period, but the posts/hr column never even hit 40 until NP1274 (8 oldpix later)
For those infamous Newpages, the post rate R was, as you point out, about 200. However, during that hour the formula above was computed only once, because only one hour elapsed. That calculation occurred at Newpix 3090, which is 20130726.17:00 in the US Eastern time zone, or 20130726.21:00 UTC. If the rate were actually 200, then the calculation would have been:
s_t = 200/34 + (33/34)*26.19 ≈ 31.3
But actually the number we see is 32.29, a bit higher but close enough. This calculation occurred on Newpage 1259, but because of the complex way I parse the OTT and output the lines to this and the other reports, it doesn't appear in ott-rate-stats.txt until the line for Newpage 1260.
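For anyone who wants to check that arithmetic at a shell prompt:

[code]
$ perl -e 'printf "%.2f\n", 200/34 + (33/34)*26.19'
31.30
[/code]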
If you want "instantaneous" rate stats, or a different half-life, I suppose I could do something. I'm a bit reluctant to add lots of columns, but one or two would be okay. I could recompute the decaying average once per minute (the highest resolution possible) to make shorter half-life stats more useful.
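The only thing that would change is α: using the same first-order approximation as the 1/34 above, it scales with the update interval and the half-life. Again just a sketch, with an invented helper name:

[code]
# alpha ~= ln(2) * (update interval) / (half-life), both in hours
sub alpha_for {
    my ($dt, $half_life) = @_;
    return log(2) * $dt / $half_life;   # Perl's log() is the natural log
}

printf "%.5f\n", alpha_for(1, 24);      # hourly updates, 24h half-life: ~0.029 (roughly the 1/34 above)
printf "%.6f\n", alpha_for(1/60, 24);   # per-minute updates, same half-life: ~0.000481
printf "%.5f\n", alpha_for(1/60, 1);    # per-minute updates, 1-hour half-life: ~0.01155
[/code]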
Edits: Acknowledge @Link's developments, and the typical typo fixes.