[Userscript] Review Summary on the dashboard

Love the add-on. Personally, I wouldn’t mind if the little boxes (that open if you click one of the items in the review summary) also showed pitch information, as in [Userscript] WaniKani Pitch Info.


If you already have Pitch Info available, you can click on the item itself in the boxes to go to that item’s page (opens in a new window). But I guess that doesn’t diminish the usefulness of added info in the popup. I can put it on a pending items list for now - this script was not supposed to live even as long as it has.

Am I the only one who’s getting a blank blue box under where it says Review Summary?


@rwesterhof Is this broken atm?


Sorry for the late response. It’s still working on my dashboard - can you post a screenshot and/or check if there’s an error in the console? (Press F12 to reveal the console.)


Thank you so much! I introduced a situation-specific null pointer error in 0.9.1 that I don’t run into myself. Fixed in 0.9.2.


I’d love to include a way to show the time/duration of the last review session, a reviews-per-minute rate, and a correct-reviews-per-minute rate. This is my first time looking at the API, but unfortunately I don’t see any data there that records the time. And console-logging the cached data you used for this doesn’t seem to have it either.

My only idea so far would be to use cached.itemBreakdown.filteredReviews[0][0], which I am assuming is the ID of the first review you did last session, and use that in the “GET https://api.wanikani.com/v2/review_statistics/” endpoint, which has a data_updated_at value. Then get the last item in the filteredReviews array, do the same, and find the difference between the times. Thoughts?
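In case it helps, here’s a rough sketch of what I mean (the `apiToken` and the two IDs are placeholders, and I’m assuming the response has a top-level `data_updated_at` field - not tested against the real API):

```javascript
// Fetch the data_updated_at timestamp for one review_statistics record.
// Assumption: the resource returns a top-level data_updated_at ISO string.
async function fetchUpdatedAt(id, apiToken) {
  const res = await fetch(`https://api.wanikani.com/v2/review_statistics/${id}`, {
    headers: { Authorization: `Bearer ${apiToken}` },
  });
  const body = await res.json();
  return body.data_updated_at; // ISO 8601 timestamp string
}

// Pure helper so the time arithmetic is easy to check on its own.
function minutesBetween(startIso, endIso) {
  return (new Date(endIso) - new Date(startIso)) / 60000; // ms → minutes
}

// Rough session length: difference between first and last reviewed item.
async function roughSessionMinutes(firstId, lastId, apiToken) {
  const [start, end] = await Promise.all([
    fetchUpdatedAt(firstId, apiToken),
    fetchUpdatedAt(lastId, apiToken),
  ]);
  return minutesBetween(start, end);
}
```

One caveat: data_updated_at changes whenever the statistic record changes at all, so this would only approximate the session window.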

Thank you!

To me, this is by far my most important and needed userscript! :smiley:


Since the update earlier this year, WK no longer knows the concept of a ‘review session’, which means there is no logical grouping of reviews into a session anymore. The time of each review still gives an indication (I don’t cache it, but the review cache script I rely on does!), but it’s not perfect. Two things to consider:

  1. A completed review is stored once both answers (for kanji and non-kana vocab) are correctly given. That means the review ‘session’ will have started before the first completed review. Say a user does 10 reviews as a ‘session’. The completion of the first review could be the user’s 2nd answer (when using a script like Back to Back), or it could be the user’s 11th answer (10 readings first, then the first of 10 meanings). So the last review timestamp minus the first review timestamp could be the time for 18/20 answers, or it could be the time for 9/20 answers - anywhere from 45% to 90% of the review time (assuming a constant rate of answering and only correct answers!).
  2. Some users will do a quick session with a few reviews, close the browser to do other things, and come back 10 minutes later to do ‘another session’. Other users leave the browser open during a session to do other things and return to ‘their session’ 10 minutes later. So is a gap of 10 minutes between completed reviews a session break, or isn’t it?

You can take a look at other scripts to see how they handle these issues:

  - WaniKani Review Clock displays similar stats to what you’re looking for, but only during the reviews.
  - Ganbaro Meter displayed similar stats on the dashboard (which I think is what you’re looking for), but it uses the Get Reviews API, which no longer returns any data, so the script is broken. It’s probably still interesting to see how it handled the ‘what is a session’ question.
  - The Heatmap also estimates how much time a user spent doing reviews and counts the ‘number of review sessions’ a user has done (also an estimate - and I think configurable, so that a user can decide for themselves how long an inactivity counts as a separate session).

To get more reliable numbers, I would probably look at a script that runs both during the reviews themselves (as a timer) and on the dashboard (to display results), but you’d need to account for typical ‘the user can do whatever’ situations like spontaneous browser closes.
This Review Summary script mostly makes just one assumption: if you load the dashboard, your ‘session’ is over, and it displays everything since your last load of the dashboard. It does not work well when you have multiple browser windows open (reviews and dashboard!), and the only thing I added for that was a disclaimer (that it doesn’t work, lol).


I used the statistical technique called Median Absolute Deviation to determine sessions in my Ganbarometer script (still broken, sadly, dunno if I’ll ever be able to get back to it).

It’s pretty straightforward and worked quite well in practice. It does require one magic number (I can’t remember if I stuck with 2.0, but the source is on GitHub – I’m about to board a plane and don’t have time to check).

Great article about the technique here: Data science: Use median absolute deviation instead of z-score to detect outliers

If it’s not clear: The way I used it is to first create a list of interval times (for each day or whatever). Each interval is the time from the start of one review record to the next review’s start.

Then I used MAD to find the outliers that were too long and marked them as the start of a new session.

Worked very well for me.
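For the curious, the interval-plus-MAD approach can be sketched in plain JavaScript like this (a minimal sketch, not my actual source - the 2.0 threshold and the timestamps are illustrative; check GitHub for what I really shipped):

```javascript
// Median of a numeric array.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Return the indices (into `timestamps`) that start a new session.
// Intervals whose modified z-score exceeds `threshold` are outliers,
// i.e. gaps long enough to count as a session break.
function sessionStarts(timestamps, threshold = 2.0) {
  const intervals = timestamps.slice(1).map((t, i) => t - timestamps[i]);
  const med = median(intervals);
  const mad = median(intervals.map((x) => Math.abs(x - med)));
  const starts = [0]; // the first review always starts a session
  intervals.forEach((gap, i) => {
    // 0.6745 scales the MAD so the score is comparable to a z-score.
    const score = mad === 0 ? 0 : (0.6745 * (gap - med)) / mad;
    if (score > threshold) starts.push(i + 1);
  });
  return starts;
}

// Example: review timestamps in seconds; the 600 s gap splits the sessions.
const ts = [0, 10, 22, 31, 45, 645, 655, 668, 680];
console.log(sessionStarts(ts)); // → [ 0, 5 ]
```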


Thanks for the info! That WaniKani Review Clock is exactly what I had in mind, only on the dashboard so you can see it after you finish your reviews. I’ll have to look at the code for that one and see if there is a way to either cache or store that data so it can be displayed later.
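The caching part could be as simple as localStorage, since both the review pages and the dashboard live on the same origin. A hypothetical sketch (the key name and data shape are my own assumptions, not anything from the Review Clock script):

```javascript
// Shared key where the review-page half of the script stores its stats.
const KEY = 'reviewSessionStats';

// Call from the review pages: record the session start once, count answers.
// `storage` is injectable so the logic can be tested without a browser.
function recordAnswer(storage = localStorage) {
  const stats = JSON.parse(storage.getItem(KEY) || '{"start":null,"answers":0}');
  if (stats.start === null) stats.start = Date.now();
  stats.answers += 1;
  storage.setItem(KEY, JSON.stringify(stats));
}

// Call from the dashboard: read the stored stats and clear them,
// so the next session starts fresh. Returns null if nothing was stored.
function readSessionStats(storage = localStorage) {
  const stats = JSON.parse(storage.getItem(KEY) || 'null');
  storage.removeItem(KEY);
  return stats;
}
```

From `stats.start`, the read time, and `stats.answers` you could then derive the duration and answers-per-minute on the dashboard.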


Layout is broken again. If you don’t get to it or can’t repro, I’ll take a closer look later today.


Drat! When I fixed the cockpit this morning, I thought I checked all my other scripts for the panel renaming as well. Missed it. 0.9.3 is now available. Thanks for reporting!



Layout’s broken for me too.

However, I can still click the little arrow and get the percentage.


Can you check if it updated to 0.9.3? Sometimes this takes a while (but you can manually trigger it as well). The behaviour you’re seeing should only appear if the script can’t find the review forecast panel (which it needs so it can put the blue tile above it).

Looking good here! Thanks as always for the quick turnaround!



Yes, it’s fixed now. I have version 0.9.3.

Thank you.
