[Userscript] Wanikani Heatmap

Indeed, all working now. Thank you for the super quick fix!

1 Like

Updated to 3.1.10, and for me the light/dark theme toggle doesn’t work anymore - it’s stuck on a permanently light background, but all other settings work :face_with_raised_eyebrow:
Any ideas what might interfere here?

2 Likes

Yep, I must have messed up the order of the variables when I made it so that Heatmap uses Wanikani’s CSS variables. I’ll get that fixed asap.

1 Like

Dark mode seems to be working fine for me - but if I try to switch to light mode, that doesn’t work! So it’s something to do with the switch.

Also - any chance you could quickly change the margin-top of the heatmap to --spacing-normal rather than --spacing-xloose while you’re fixing that, please? The dashboard seems to have switched to that everywhere now. I’ve also just updated BD2 because it was setting the space below the SRS section to xloose as well!

EDIT: Ah - my mistake, the narrower spacing is just BD2 lol :sweat_smile: I’m so used to it I thought it was --spacing-normal by default!

1 Like

I suppose there is going to be a minor change in theming functionality: the “Breeze Dark” option will no longer apply a copy of some of the original Breeze Dark theme’s colors. Instead it should be thought of as a compatibility toggle that uses the CSS variables Breeze Dark 2 sets, so if you’re using BD2, choose the Breeze Dark option. I mistakenly made the “Dark” option behave like this too, essentially assuming that anyone using it also had a dark theme userstyle that overrides the variables.

So, the solution will be that the “Light” and “Dark” options return to how they were before, setting the colors explicitly (this means “Dark” will not respect BD2 colors set with variables). The Light theme, however, already works with light theme userstyles that override the variables, so I shouldn’t need to change anything there, unless for some reason someone who uses a dark theme userstyle wants Heatmap in light theme. If you’re out there, mysterious person, please let me know and I will fix the Light theme too.

Actually, it’s much simpler than that: I just set up the variables in a way that had an implicit assumption:

  • if you choose the Dark option, you’re using a dark theme userstyle that overrides the variables

But that’s not the case: someone could simply like how it looks within the light theme, or, just as likely, use the Dark Reader extension to make the site dark. Since WK sets the variables to light theme values, the heatmap would stay light unless a userstyle overrides them. The reverse would also hold for someone using the Light option who does have a dark theme installed that overrides the variables.
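
To make that implicit assumption concrete, here’s a rough sketch of the pattern (the variable and selector names are illustrative, not Heatmap’s actual ones). WK ships light values for its variables, so if nothing overrides them, a variable-driven “Dark” option stays light:

```javascript
// Hypothetical sketch of a variable-driven theme, as a userscript would
// inject it. WK defines these variables with light values, so unless a
// dark userstyle (or Dark Reader) overrides them, this renders light:
const css = `
    #heatmap {
        background-color: var(--color-dashboard-panel-background);
        color: var(--color-text);
    }`;
document.head.insertAdjacentHTML('beforeend', `<style>${css}</style>`);
```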

1 Like

Should be fixed in 3.1.11

To all others reading this, here’s the rundown of how the theme option works (a rough sketch follows the lists below):

If you are NOT using a userstyle that overrides the WaniKani variables:

  • Light: will be the same as WaniKani’s default theming
  • Dark: will force dark background and light text, but keep WaniKani’s colors for items
  • Breeze Dark: will look the same as Light except for one thing, because it assumes you have Breeze Dark 2 installed: it changes the item text color to a dark color. It does not support the original, now defunct, Breeze Dark.

If you ARE using a userstyle that overrides the WaniKani variables:

  • Light: will use whatever the userstyle sets; this means if you use a Dark theme userstyle, it will be dark
  • Dark: will force its own dark background and light text; will use the userstyle’s item colors if it overrides them, but sets its own light item text color
  • Breeze Dark: will behave the same as the “Light” option here, except for the item text color. In other words, the “Light” option is more of a “Default” than a “Light”, and Breeze Dark is that default with a dark item text color.
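
As a rough illustration of the rundown above (selectors and color values are made up for the sketch, not taken from Heatmap’s source):

```javascript
// Hypothetical sketch of the fixed behavior. "Dark" forces its own
// background and text explicitly, so no userstyle can turn them light,
// while item colors stay variable-driven; "Breeze Dark" stays fully
// variable-driven like "Light" but forces a dark item text color.
const theme_css = {
    dark: `
        #heatmap { background-color: #232629; color: #eee; } /* explicit */
        #heatmap .item { background-color: var(--color-radical); } /* variables */`,
    breeze_dark: `
        #heatmap .item { color: #232629; } /* dark item text for BD2's palette */`,
};
```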
2 Likes

Oh - that makes sense then. One thought: since there’s a specific BD2 setting, maybe the heatmap could reduce the margin-top to --spacing-normal when that’s selected, to match what BD2 does? Just to avoid unnecessary script-specific bloat in the style. Not urgent though!
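
Something like this, perhaps, assuming there’s already a branch on the theme setting (the names below are guesses, not Heatmap’s internals):

```javascript
// Hypothetical sketch: tighten the top margin only when the
// Breeze Dark (BD2 compatibility) option is selected.
const margin_top = settings.theme === 'breeze_dark'
    ? 'var(--spacing-normal)'  // BD2's tighter dashboard spacing
    : 'var(--spacing-xloose)'; // WK's default dashboard spacing
heatmap_element.style.marginTop = margin_top;
```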

just used this and it helped me out big time - thank you so much!

1 Like

I just started doing Wanikani again after a few weeks away and my heatmap progress is lost (again) - did something happen or is it just me?

The Open Framework clears out its file cache if it hasn’t been used in a while. Always make a backup after you’ve done your reviews.
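
If you’d rather script the backup than use the settings dialog, something along these lines should work with the Open Framework’s file cache. The file name below is a guess; inspect wkof.file_cache.dir in the console to find the actual key:

```javascript
// Hypothetical sketch: download a JSON backup of the cached reviews.
// 'review_cache' is an assumed file name, not confirmed.
wkof.file_cache.load('review_cache').then(reviews => {
    const blob = new Blob([JSON.stringify(reviews)], {type: 'application/json'});
    const link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = 'review_cache_backup.json';
    link.click(); // prompts the browser to save the file
});
```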

1 Like

I just logged on and noticed all of my progress was gone. Yesterday I had just hit my 100 day streak too. Does anyone know if there’s a way to recover my progress?

If you don’t have a backup of some kind, there isn’t, sorry.

Anyone else who uses this extension experiencing input lag during reviews? Toggling it off removes the second or two of input lag I’m seeing. I do have a lot of reviews (250,392) over quite a lot of days (1,282), so I suspect that any inefficiency in an extension like this could be exacerbated by that.

1 Like

Hmm, it doesn’t quite make sense for it to be this script, because the script first checks the URL of the page you’re on, and if it’s not the dashboard it doesn’t run anything that processes items.

The one part of Heatmap that does run on Reviews, though, is Review Cache. However, nothing has changed with that script (there was an update to it in the repo, but Heatmap has not had its version string updated for the new Review Cache version), so it shouldn’t be that library script causing issues either.

However, it’s possible that WK has changed something so that Review Cache now causes lag. That will need to be investigated, but I’m not sure of the best way to approach it.
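
For reference, the kind of gate described above looks roughly like this (the actual check in Heatmap may differ):

```javascript
// Hypothetical sketch of the dashboard-only gate. The dashboard lives
// at "/" or "/dashboard"; everywhere else, skip item processing.
(function () {
    const on_dashboard = /^\/(dashboard\/?)?$/.test(location.pathname);
    if (!on_dashboard) return; // e.g. on the reviews page, do nothing heavy
    // ... item processing and heatmap rendering happen only here ...
})();
```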

I am having the same problem. My numbers are also on the large side: 1,800 days, 170,000 reviews. Turning the heatmap script off fixes the problem, but I’d prefer it to work like it did before the page got that big update a couple of weeks/months ago.

I looked into this a bit more for you with the Chrome profiler open. When a review is fully completed (i.e. both reading and meaning have been answered correctly), the function calculate_stats is called. This seems to be because the review_cache subscription fires, which triggers do_stuff, which calls reload(), which in turn calls calculate_stats a couple of times to set up a stats object.

Unfortunately, calculate_stats loops over every item in the data (every review and lesson taken); in my case that’s now 250,000 iterations per call. Most of the time in this loop is spent on the lines that call toDateString (around 3/4 of a second in my case). That’s called ~5 times per iteration, so I’m looking at at least 1.25 million toDateString calls per completed review :smiley:

My suggestions would be to see whether you need a full do_stuff every time a review completes, or at least to avoid a full reload. If both are necessary, is there a nicer way of running calculate_stats so it doesn’t loop over every single review each time it’s called? And if not, is there a way to minimize the toDateString calls? (I imagine a lot of the calculated stats values could be cached, or some of the internal logic memoized, so you don’t need to run the whole thing every time.)
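
For instance, since every review from the same calendar day formats to the same string, the toDateString results could be memoized per day; 250,000 reviews collapse into only ~1,300 distinct dates. A minimal sketch (the function name is mine, not from the script’s source):

```javascript
// Hypothetical sketch: cache toDateString results per calendar day.
const date_string_cache = new Map();
function cached_to_date_string(timestamp) {
    // Key on local midnight so all reviews from one day share an entry
    const day_key = new Date(timestamp).setHours(0, 0, 0, 0);
    let str = date_string_cache.get(day_key);
    if (str === undefined) {
        str = new Date(day_key).toDateString(); // formatted once per day
        date_string_cache.set(day_key, str);
    }
    return str;
}
```

With a warm cache, the ~1.25 million formatting calls per completed review become plain Map lookups.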

Hope that helps debug if you choose to do so, thanks!

2 Likes

Now that I’ve said all that, I guess you could also just not run any heatmap logic on the reviews page at all? I don’t think it needs to listen to review cache subscriptions on those pages, right? I may have missed how some of the internal logic is set up, but it should only need to listen to the review cache on the homepage, if I’m not mistaken.
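
In other words, something like guarding the subscription itself (the subscribe call below is inferred from the behavior described above, not checked against Review Cache’s actual API):

```javascript
// Hypothetical sketch: only wire up the expensive stats pipeline on
// the dashboard; Review Cache can keep recording reviews elsewhere
// without the heatmap recalculating stats after every answer.
if (on_dashboard) {
    review_cache.subscribe(do_stuff); // reload() + calculate_stats() live behind this
}
```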

About how long have you been experiencing this issue? Just recently (within the past week or so) or has this been the case since the major updates a month ago? Has this been a problem since before the updates?

Thanks for the profiled investigation. I’m not sure I’m strong enough in programming concepts to fix an issue like this, but the info should still help. I’ll open an issue in Kumi’s repo for tracking and discussion, linking to this comment.

Because we cannot query the GetAllReviews API endpoint, the only way to make sure your reviews are counted by Heatmap is to track reviews as you do them. Once the review session is finished, the script has no way of knowing about them.