Indeed, all working now. Thank you for the super quick fix!
Updated to 3.1.10, and for me the light/dark theme toggle doesn't work anymore - it's stuck on a permanently light background, but all other settings work.
Any ideas what might interfere here?
Yep, I must have messed up the order of the variables when I made it so that Heatmap uses WaniKani's CSS variables. I'll get that fixed ASAP.
Dark mode seems to be working fine for me - but if I try to switch to light mode, that doesn't work! So something to do with the switch.
Also - any chance you could quickly change the margin-top of the heatmap to --spacing-normal rather than --spacing-xloose while you're fixing that, please? The dashboard seems to have all switched to that now. I've also just updated BD2 because it was setting the space below SRS to xloose as well!
EDIT: Ah - my mistake, the narrower spacing is just BD2 lol. I'm so used to it I thought it was --spacing-normal by default!
I suppose there is going to be some minor change in theming functionality: the 'Breeze Dark' option no longer applies a copy of some of the original Breeze Dark theme's colors, and should instead be thought of as a 'compatibility toggle' that uses the CSS variables Breeze Dark 2 uses. Thus, if you're using BD2, you should use the Breeze Dark option. I mistakenly made the 'Dark' option behave like this too, essentially assuming that anyone using it was also using a Dark theme userstyle that overrides the variables.
So, the solution will be that the 'Light' and 'Dark' options return to how they were before, setting the colors explicitly (this means that 'Dark' will not respect BD2 colors set with variables). However, the Light theme does work with light theme userstyles that override the variables, so I shouldn't need to change anything there, unless for some reason someone who uses a dark theme userstyle wants Heatmap in the light theme. If you're out there, mysterious person, please let me know and I will fix the Light theme too.
Much simpler than that: I just did the variables in a way that had an implicit assumption:
- if you choose the Dark option, you're using a Dark theme userstyle that overrides the variables
But that's not the case: someone could just like how it looks within a light theme or, just as likely, use the Dark Reader extension to make the site dark. So basically, because the variables are set by WK to light theme values, if you weren't using a userstyle that overrides them, it would stay light. The same would be true in reverse if someone were using the Light option but did have a dark theme installed that overrides the variables.
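To illustrate the assumption (the variable names and colors below are placeholders, not WaniKani's or the script's actual ones):

```typescript
// Styling the heatmap through the page's CSS variables: WK initializes
// these to its light palette, so unless a dark userstyle overrides them,
// a "Dark" option implemented this way stays light.
const darkViaVariables = `
  .heatmap { background: var(--color-menu, #fff); color: var(--color-text, #333); }
`;

// Setting explicit colors instead keeps "Dark" dark regardless of what
// the variables currently hold.
const darkExplicit = `
  .heatmap { background: #232629; color: #eeeeee; }
`;

document.head.insertAdjacentHTML('beforeend', `<style>${darkExplicit}</style>`);
```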
Should be fixed in 3.1.11
To all others reading this, here's the rundown of the functionality of the theme option:
If you are NOT using a userstyle that overrides the WaniKani variables:
- Light: will be the same as WaniKani's default theming
- Dark: will force a dark background and light text, but keep WaniKani's colors for items
- Breeze Dark: will do one thing but otherwise look the same as Light, because it assumes you have Breeze Dark 2 installed. The 'one thing' is changing the item text color to a dark color. It does not support the original, now defunct, Breeze Dark.
If you ARE using a userstyle that overrides the WaniKani variables:
- Light: will use whatever the userstyle sets; this means if you use a Dark theme userstyle, it will be dark
- Dark: will force its own dark background and light text color; it will use the userstyle's item colors if it overrides those, but sets its own item text color to a light color
- Breeze Dark: will behave the same as the 'Light' option here, with the exception of the item text color. That is, the 'Light' option is more a 'Default' than a 'Light', and this is that default with a dark item text color.
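In rough code terms, the three options differ like this (selectors and colors are made up for illustration, not taken from the script):

```typescript
type ThemeOption = 'Light' | 'Dark' | 'Breeze Dark';

function themeCss(option: ThemeOption): string {
  switch (option) {
    case 'Light':
      // Pass-through: no overrides, so whatever the page or a userstyle
      // sets via the variables wins; effectively a "Default" option.
      return '';
    case 'Dark':
      // Forces its own dark background and light text, including a light
      // item text color; item colors still come from the variables.
      return '.heatmap { background: #1f1f1f; color: #eee; } .heatmap .item { color: #eee; }';
    case 'Breeze Dark':
      // Same pass-through as Light, except item text is forced dark to
      // match Breeze Dark 2's palette.
      return '.heatmap .item { color: #222; }';
  }
}
```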
Oh - that makes sense then. I did just think - as there's a specific BD2 setting, maybe the heatmap could reduce the margin-top to --spacing-normal when that's selected, to match what BD2 does? Just to try and avoid too much unnecessary bloat for scripts in the style. Not urgent though!
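Something like this is what I mean (the settings shape and selector are guesses, not the script's actual ones):

```typescript
// Hypothetical settings object, just for illustration.
const settings = { theme: 'Breeze Dark' as 'Light' | 'Dark' | 'Breeze Dark' };

// Use BD2's tighter spacing only when the Breeze Dark option is selected.
const marginTop =
  settings.theme === 'Breeze Dark'
    ? 'var(--spacing-normal)'
    : 'var(--spacing-xloose)';

const heatmap = document.querySelector<HTMLElement>('.heatmap');
if (heatmap) heatmap.style.marginTop = marginTop;
```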
just used this and it helped me out big time - thank you so much!
I just started doing WaniKani again after a few weeks away and my heatmap progress is lost (again) - did something happen, or is it just me?
Open Framework clears out the file cache if it hasn't been used in a while. Always make a backup after you've done your reviews.
I just logged on and noticed all of my progress was gone. Yesterday I had just hit my 100 day streak too. Does anyone know if there's a way to recover my progress?
If you don't have a backup of some kind, there isn't, sorry.
Anyone else who uses this extension experiencing input lag during reviews? Toggling it on and off confirms it is the source of the second or two of input lag I'm seeing. I do have a lot of reviews (250,392) over quite a lot of days (1,282), so I suspect that any inefficiency in an extension like this could be exacerbated by that.
Hmm, it doesn't quite make sense for it to be this script, because the script first checks the URL of the page you're on, and if it's not the dashboard it doesn't run anything that processes items.
The second part of Heatmap that does run on Reviews, though, is Review Cache. However, nothing has changed with that script (there was an update to it in the repo, but Heatmap has not had its version string updated for the new Review Cache version), so it also shouldn't be that library script causing issues.
However, it's possible that WK has changed something so that Review Cache now causes lag. That will need to be investigated, but I'm not sure of the best way to approach it.
I am having the same problem. My numbers are also on the large side: 1,800 days, 170,000 reviews. Turning the Heatmap script off fixes the problem, but I'd prefer it worked like it did before the page got that big update a couple of weeks/months ago.
I looked into this a bit more for you with the Chrome profiler open. When a review is fully completed (e.g. both reading and meaning have been entered correctly), the function `calculate_stats` is called. This seems to be because the `review_cache` subscription is firing; this triggers `do_stuff`, which calls `reload()`, which calls `calculate_stats` a couple of times to set up a stats object.
Unfortunately, `calculate_stats` loops over every item in the data (every review and lesson taken). In my case this is now 250,000 loops every call. Most of the time spent in this loop is on the lines that call `toDateString` (around 3/4 of a second in my case). That itself is called ~5 times per loop, so I'm looking at at least 1.25 million `toDateString` calls per completed review.
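To make the pattern concrete, the hot path presumably looks something like this (a hypothetical reconstruction, not the actual script code):

```typescript
interface Review { timestamp: number } // ms since epoch, assumed shape

function calculateStatsNaive(reviews: Review[]): Map<string, number> {
  const perDay = new Map<string, number>();
  for (const r of reviews) {
    // toDateString() allocates a new Date and formats it on every call;
    // with ~250k reviews and ~5 such calls per item, this dominates the profile.
    const day = new Date(r.timestamp).toDateString();
    perDay.set(day, (perDay.get(day) ?? 0) + 1);
  }
  return perDay;
}
```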
My suggestions would be to see if you need to do a full `do_stuff` every time a review completes, or at least not do a full `reload`. If both are necessary, is there a nicer way of running `calculate_stats` so it doesn't loop over every single review each time it's called? If not, is there a way to minimize the `toDateString` calls? (I imagine that a lot of the calculated stats values could be cached, or some of the internal logic memoized, so you don't need to run the whole thing every time; a rough sketch of what I mean is below.)
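Assuming each review carries a millisecond timestamp (the names are mine, not the script's):

```typescript
interface Review { timestamp: number }

const reviewsPerDay = new Map<string, number>();

// Do the expensive full pass exactly once, at load time.
function buildStats(allReviews: Review[]): void {
  reviewsPerDay.clear();
  for (const r of allReviews) addReview(r);
}

// On each newly completed review, touch one bucket instead of re-running
// the full 250,000-item loop.
function addReview(r: Review): void {
  const day = new Date(r.timestamp).toDateString();
  reviewsPerDay.set(day, (reviewsPerDay.get(day) ?? 0) + 1);
}
```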
Hope that helps debug if you choose to do so, thanks!
Now that I said all that, I also guess you could just not run any logic for the heatmaps on the reviews page at all? I don't think it needs to be listening to review cache subscriptions on those pages, right? I may have missed how some of the internal logic is set up, but it should only need to listen to the review cache on the homepage, if I'm not mistaken?
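If so, a minimal sketch of that kind of guard (the dashboard path check is my assumption about WaniKani's routes):

```typescript
// Skip the heavy stats work entirely unless we're on the dashboard.
const onDashboard =
  location.pathname === '/' || location.pathname.startsWith('/dashboard');

if (onDashboard) {
  // Only here: subscribe to review-cache updates and build the heatmap.
  // On review/lesson pages, register nothing, so completed reviews
  // don't trigger any recalculation.
}
```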
About how long have you been experiencing this issue? Just recently (within the past week or so), or has this been the case since the major updates a month ago? Has this been a problem since before the updates?
Thanks for the profiled investigation. I'm not sure I'm strong enough in programming concepts to fix an issue like this, but the info should still help. I'll open an issue for it in Kumi's repo for tracking and discussion, linking to this comment.
Because we cannot query the GetAllReviews API endpoint, the only way to make sure your reviews are counted by Heatmap is to track reviews as you do them. Once the review session is finished, the script has no way of knowing about your reviews.
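As a purely hypothetical sketch of what 'track reviews as you do them' means, here's a per-day counter persisted on each completed review; the event name and storage key are made up for illustration, not how the script actually hooks in:

```typescript
const STORAGE_KEY = 'heatmap-review-log'; // illustrative key

function recordReview(): void {
  // Load the existing per-day log, bump today's count, and persist it,
  // so the data survives after the review session ends.
  const log: Record<string, number> = JSON.parse(
    localStorage.getItem(STORAGE_KEY) ?? '{}',
  );
  const today = new Date().toDateString();
  log[today] = (log[today] ?? 0) + 1;
  localStorage.setItem(STORAGE_KEY, JSON.stringify(log));
}

// 'review-completed' is a placeholder event, not a real WaniKani event.
window.addEventListener('review-completed', recordReview);
```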