[Userscript] The GanbarOmeter

That’ll teach me to post a script just before going to bed. [Edit: not to mention replying before I’ve had my first coffee.]

I forgot that the first several people to try out a new script would be level 60s. I think this script is most useful for people between roughly levels 8 and 60, but I’m hopeful that it will eventually be useful for everyone.

Yup, that was the intended behavior, which I intended to document in the first post (but cleverly forgot). I will edit the intro to mention this.

I’ll look into @rwesterhof’s suggestion of using a callback to refresh without going into an infinite loop. Currently, I load and update the settings on every refresh.
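One generic way to do that (a sketch of the idea, not @rwesterhof’s exact suggestion; `redrawGauges` is a hypothetical name for whatever does the re-render) is to guard the callback with a flag so a settings-save can trigger a redraw without re-entering itself:

```javascript
// Generic guard-flag pattern: let a settings-save callback trigger a
// redraw without the redraw re-triggering the callback.
let refreshing = false;

function onSettingsSaved() {
  if (refreshing) return; // already mid-refresh; don't recurse
  refreshing = true;
  try {
    redrawGauges(); // hypothetical redraw function
  } finally {
    refreshing = false;
  }
}
```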

As described elsewhere, I think this is likely due to the Session interval value being too high (defaults to 10 minutes). Please set it to something like 2.5 minutes and let me know the result.

Also, could you enable debug then refresh and show me what’s printed in your JavaScript console? [Right-click, “inspect”, then click “console” in the developer window that appears.]

I first go through all the recent reviews looking for “sessions” (reviews done one after another without too much time in between each review). Each session tracks how many minutes elapsed during the session.
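For the curious, the session-detection logic is roughly like this (a simplified sketch, not the actual source; `findSessions`, the `startTime` field, and `maxInterval` are illustrative names):

```javascript
// Simplified sketch of session detection. `reviews` is assumed to be an
// array of { startTime: Date } sorted by time; `maxInterval` is the
// "Session interval" setting, in milliseconds.
function findSessions(reviews, maxInterval) {
  const sessions = [];
  let current = null;
  for (const review of reviews) {
    if (current && review.startTime - current.endTime <= maxInterval) {
      // Close enough to the previous review: same session.
      current.endTime = review.startTime;
      current.reviewCount += 1;
    } else {
      // Too long a gap (or first review): start a new session.
      current = {
        startTime: review.startTime,
        endTime: review.startTime,
        reviewCount: 1,
      };
      sessions.push(current);
    }
  }
  return sessions;
}
```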

The reported speed is the sum of all minutes from all sessions, multiplied by 60 to convert to seconds, divided by the total number of reviews performed across all sessions (i.e. seconds per review).
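In code, that calculation might look something like this (again a sketch; `minutes(session)` is assumed to return the elapsed minutes for one session, as discussed below):

```javascript
// Sketch of the speed calculation: seconds per review across all sessions.
function reportedSpeed(sessions) {
  const totalMinutes = sessions.reduce((sum, s) => sum + minutes(s), 0);
  const totalReviews = sessions.reduce((sum, s) => sum + s.reviewCount, 0);
  return (totalMinutes * 60) / totalReviews; // seconds per review
}
```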

The default maximum time between reviews is 10 minutes. A lower value might be wiser (say somewhere between 2.5 and 5 minutes). I will almost certainly lower the default in a future version. (Since the default maximum speed is 30 seconds, I’m currently thinking a value of 5X to 10X the maxSpeed might be a reasonable default.)
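Concretely, deriving the default that way would look something like this (hypothetical; the setting names are illustrative):

```javascript
// Hypothetical defaults, with the session interval derived from maxSpeed.
const defaults = {
  maxSpeed: 30, // seconds: maximum expected time per review
  // 5x-10x maxSpeed, i.e. 2.5 to 5 minutes; shown here at the low end.
  sessionIntervalSeconds: 5 * 30, // 150 s = 2.5 minutes
};
```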

I think I just corrected a bug in the logic, though: in v0.1, if a session contained only a single review, the minutes() for that session would report as zero. This is because the API only reports the start time of a review, not how long it took to submit an answer, so the start and end times of a session with only one review are identical.

In v0.2 and later, I use maxSpeed/2 as an estimate for very short sessions (which I think can only happen with a single review).
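The fix is roughly this (a sketch of the idea; I’ve written the session’s minutes() logic as a standalone function operating on session objects like the ones above):

```javascript
// Sketch of the v0.2 fix: a session whose start and end times are equal
// (i.e. a single-review session) gets an estimated duration instead of zero.
function sessionMinutes(session, maxSpeed) {
  const elapsedMinutes = (session.endTime - session.startTime) / 60000;
  if (elapsedMinutes === 0) {
    // The API only gives start times, so a one-review session spans
    // zero time; estimate it as maxSpeed / 2 (in seconds) instead.
    return maxSpeed / 2 / 60;
  }
  return elapsedMinutes;
}
```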

It appears both scripts agree on the number of reviews, so the difference must be in the number of seconds it took to perform those reviews. Heatmap is using a value of 14,940 seconds as you point out. GanbarOmeter must be using a value of around 84,630 seconds (23.5 hours of review time over 3 days, which makes no sense).
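Running the numbers: 417 reviews/day × 3 days is roughly 1,251 reviews, so Heatmap’s 14,940 seconds works out to about 12 seconds per review, while 84,630 seconds would be about 68 seconds per review.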

I’d really like to see your console output with debug enabled: The minutes() value returned for one of the sessions must be way off for some reason.

Also: 417 reviews/day over three days!! Yow. I’d be whimpering for sure. :grin:

The defaults are definitely aimed at slow-and-steady users like myself. Speed-runners will likely want to bump up some of the values. In your case, you may want to set:

  • Desired apprentice quantity to 300 or higher so the gauge doesn’t peg. You’ve currently got 296 apprentice items. The goal is to have the gauge show roughly 50% (needle straight up) for the desired difficulty level. With the defaults, someone with exactly 100 apprentice items, answering fewer than 20% of reviews incorrectly, and with no kanji in Apprentice1 or Apprentice2 would cause the gauge to display exactly 50%. With 296 apprentice items, you’re already pegging the meter without even accounting for misses and new kanji (see the sketch after this list).

  • Maximum reviews per day to 800 or so. With the defaults, the gauge will show 50% at 150 reviews/day.
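For reference, the difficulty gauge works roughly like this (a simplified sketch of the idea, not the actual source; the `missWeight` and `newKanjiWeight` settings and their weighting are illustrative guesses, while the 100-item / 20%-miss baseline comes from the defaults described above):

```javascript
// Simplified sketch of the difficulty gauge. With the defaults
// (desiredApprentice = 100), 100 apprentice items, a miss rate under 20%,
// and no Apprentice1/Apprentice2 kanji yield exactly 0.5 (needle straight up).
function difficulty(apprenticeCount, missRate, newKanjiCount, settings) {
  let value = 0.5 * (apprenticeCount / settings.desiredApprentice);
  if (missRate > 0.2) {
    value += settings.missWeight * (missRate - 0.2); // hypothetical weighting
  }
  value += settings.newKanjiWeight * newKanjiCount; // hypothetical weighting
  return Math.min(value, 1); // gauge pegs at 100%
}
```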