Perfect, thanks for your help! I successfully made a quick Python script that takes in the CSV file of my full review history (from a GDPR data request to WK), converts it to the right format, and combines it with any more recent reviews exported from the heatmap, removing duplicates. Finally got my full history back to 2021 restored!
Oh, I did not know that they had disabled all access to past reviews data when I was backing up stuff for an OS reinstall ;-;
(hope they fix it, why not just rate limit, even once a month would be enough for me -o-)
The last screenshot I have is from July, but I have two screenshots that together cover everything from the start in 2017 up to then, so at least I can harvest that data from them. I would still like to get the entire data set back some day, but the approximate number of reviews completed each day is mainly what I'm sad about losing.
Do you mind sharing the script?
Hello! So I'm trying to move computers today, and the instructions for exporting my heatmap progress aren't working? I copied the string correctly, and followed the steps I did last time (which did work then) but… I get an error that review_cache is not defined?
Am I missing something obvious? Admittedly I am using quite an old version of Firefox (102.14.0esr), but it was probably the same version when I did this earlier in the year.
Here it is: WK Review Data Merger - Pastebin.com
It's a bit of work, but I'm really happy to have got all my reviews back and filled in gaps from reviews done on other devices!
Put the code in a Python file (e.g. code.py), and make sure it's in the same folder as the other files you'll be using.
To use it, you need your CSV of exported reviews (reviews.csv) as well as a past_data.json file containing the reviews currently exported from the heatmap (if there are more recent ones than in the WK data export; otherwise create the JSON file and just put [] in it).
You can export the current heatmap items by going to the dashboard, opening the console, and pasting in JSON.stringify(await review_cache.get_reviews()). You can then copy the output (make sure you use the copy button at the bottom right of it in the console to ensure you copy all of it) into the JSON file, making sure to remove the quotes ' at the start and end.
You can then run the Python file and it should create a file called output.txt.
Go back to the WK dashboard tab where you have the console open and first delete the existing reviews with wkof.file_cache.delete('review_cache'), then insert the new ones with review_cache.insert(XXX), where XXX is the contents of the output.txt file.
If you get an error about unix time conversion when running the Python code, open the reviews CSV and make sure the created_at column is in the format dd/mm/yyyy hh:mm:ss (spreadsheet programs should let you change this if not).
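The steps above could be sketched roughly like this. This is a hypothetical minimal version, not the actual Pastebin script: it assumes the WK CSV has created_at and subject_id columns, and that heatmap reviews are lists whose first element is a unix timestamp in milliseconds - neither detail is confirmed in the post.

```python
# Rough sketch of the merging approach described above - NOT the actual script.
# Assumptions (not confirmed by the post): the WK CSV has "created_at" and
# "subject_id" columns, and heatmap reviews are lists whose first element is
# a unix timestamp in milliseconds.
import csv
import json
from datetime import datetime, timezone

def load_csv_reviews(path):
    """Read the WK data-export CSV, converting rows to [timestamp_ms, subject_id]."""
    reviews = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # created_at must be in dd/mm/yyyy hh:mm:ss format, as noted above
            dt = datetime.strptime(row["created_at"], "%d/%m/%Y %H:%M:%S")
            ts = int(dt.replace(tzinfo=timezone.utc).timestamp() * 1000)
            reviews.append([ts, int(row["subject_id"])])
    return reviews

def merge(csv_reviews, heatmap_reviews):
    """Combine both sources, dropping duplicates on (timestamp, subject)."""
    seen = {}
    for r in heatmap_reviews + csv_reviews:
        seen.setdefault((r[0], r[1]), r)
    return sorted(seen.values(), key=lambda r: r[0])

# Usage (uncomment to run against your own files):
# heatmap = json.load(open("past_data.json", encoding="utf-8"))
# merged = merge(load_csv_reviews("reviews.csv"), heatmap)
# open("output.txt", "w", encoding="utf-8").write(json.dumps(merged))
```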
Hope this helps!
Okay, so I figured out what I was doing wrong - I was on the wrong page. Now I have a different issue - the console seems to… crash? When I try to select the data, it freezes for a moment, and then the info poofs into the abyss before I can grab it.
Hi, I'm using two computers with Wanikani and I want to automatically sync them together. Is that possible, or do I just have to manually add it each day?
Not quite an automatic sync but have you tried resetting the browser on the out of sync Wanikani page?
Reset? Do you mean just ending the chrome task and opening it again?
It is a typo - I meant refresh, as in clicking the refresh button in the browser. Restarting the browser is overkill. Now that I think of it, though, refreshing the browser will not work for syncing the heatmap.
Yeah, that doesn't work. Is there some way to do it faster manually, like exporting the data or something?
I think someone posted a procedure somewhere up in this thread involving exporting and importing JSON data. You may check that out.
Edit: found it at this link
Will it merge the data? Because I want the few days on this computer to not be overwritten by the other one.
I think it will copy and overwrite the whole data set rather than just merge in the update. But be warned that I don't know much, so perhaps you should try to get a second opinion…
Wow, I just scrolled up a bit and saw that so many other people had the same question. @Hubbit200 made a data merging script, but he uses an exported CSV file, and I don't know where he gets that. Perhaps he could help?
Perhaps he could help. You should tag him or respond to his post.
The bit of code I wrote is intended to merge data exported from the heatmap with a full copy of review history exported by WK as a CSV - unfortunately it won't merge two different heatmap exports. I won't have time until February, but I might take a look then to see if I can edit it to support that as well!
As for the CSV - I very occasionally ask Wanikani for it (once a year or so). As I'm in the EU, by law they have to provide any data they hold about me, which includes a history of past reviews. If you're not in the EU, you could still try and see if they're in a good mood and will give it to you anyway.
But it seems like past history is probably less of an issue for you since you're only on level 4 - like you say, just exporting the heatmaps and merging them would be easiest!
That's true, it's best if I just merge these. I don't mind waiting until February, so please take your time. I just hoped to kill this issue before it becomes worse.
I had a bit of a look and it suggests
JSON.stringify(await review_cache.get_reviews())
to get the reviews from one browser and
review_cache.insert(JSON.parse(<string>))
to insert it into another browser.
The Review Cache userscript source defines the insert function as
async function insert(reviews) {
const cached = await load_data()
const newestDate = reviews.reduce((max, cur) => Math.max(cur[0], max), 0)
const updated = {
cache_version,
date: new Date(newestDate).toISOString(),
reviews: cached.reviews.concat(reviews).sort((a, b) => a[0] - b[0]),
}
for (let subscriber of window.review_cache._subscribers) subscriber?.(updated.reviews)
await save(updated)
}
As far as I can see, with cached = await load_data() and cached.reviews.concat(reviews), the stored data should be preserved and the new data merged into it.
However, I don't know how review_cache works in detail, so duplicate entries might cause the second one to be ignored.