There’s still the question of whether people would even want to include items that were wrong yesterday but correct today (if N included both days). I don’t think it’s clear what the more common use case would be.
For the reviews endpoint, does it literally return an entry per review done by the user? So can you get more than one result from the API call per item?
Yes. That’s why I didn’t include that in the wkof ItemData. It can be a pretty big number.
I suppose I could add it into ItemData, but require an N parameter so it only caches the last N days. Each item would then have an array of its reviews under item.reviews.
Still would be a lot of data if they use a ridiculously large N, but it might be useful to add anyway.
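Roughly what I have in mind, as a sketch (the filters option shape and the resolved collection's .data array are assumptions here, and none of this is implemented yet):

// Hypothetical sketch: fetch only the last N days of reviews and group
// them by subject_id so each item could carry an item.reviews array.
// The 'filters' option shape and the .data array on the resolved value
// are assumptions, not confirmed wkof behavior.
function fetch_recent_reviews(num_days) {
    var cutoff = new Date(Date.now() - num_days*24*60*60*1000);
    return wkof.Apiv2.fetch_endpoint('reviews', {
        filters: {updated_after: cutoff.toISOString()} // WK API 'updated_after' filter
    }).then(function(collection) {
        var by_subject = {};
        collection.data.forEach(function(review) {
            var sid = review.data.subject_id;
            (by_subject[sid] = by_subject[sid] || []).push(review);
        });
        return by_subject;
    });
}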
Since I’ve been level 60 since before APIv2, I barely have any /reviews data. Would you mind fetching the whole endpoint on your account to see how big it is?
// Fetch the entire /reviews endpoint, then report its serialized size.
var items;
wkof.Apiv2.fetch_endpoint('reviews').then(function(data){
    items = data;
    var str = JSON.stringify(items);
    console.log('Data is '+str.length+' bytes');
});
It would also help to know what your oldest datapoint is, but I have a rough idea already.
I’ll take a look tonight.
16:21:26.818 var items;
wkof.Apiv2.fetch_endpoint('reviews').then(function(data){
items = data;
var str = JSON.stringify(items);
console.log('Data is '+str.length+' bytes');
});
16:21:26.828 Promise {<pending>}
16:21:30.096 XHR finished loading: OPTIONS "<URL>". (×36 identical lines)
16:21:30.744 XHR finished loading: GET "<URL>". (×36 identical lines)
16:21:52.689 Data is 14434898 bytes
That's about 14.4 megabytes.
Data is 12,381,397 bytes.
I had 30,217 entries, for an average of 409.75 bytes per entry.
Here is the first result from the API request:
{"id":8130,"object":"review","url":"https://api.wanikani.com/v2/reviews/8130","data_updated_at":"2017-08-04T14:57:00.943988Z","data":{"created_at":"2017-08-04T14:57:00.943988Z","assignment_id":76837566,"subject_id":4889,"starting_srs_stage":4,"starting_srs_stage_name":"Apprentice IV","ending_srs_stage":5,"ending_srs_stage_name":"Guru I","incorrect_meaning_answers":0,"incorrect_reading_answers":0}}
The first entry is from 2017-08-04. My average level-up time has been 14-15 days since shortly before that date, so I think you can assume that a near-max-speed person (with a similar error rate) would have roughly twice as much data as I do right now. Though obviously that will only continue to go up as users who started after that date reach high levels.
What is the maximum storage size of the database you’re using?
The db is 50MB on most browsers.
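For reference, modern browsers can report the actual quota for the current origin via a standard API (plain browser functionality, not part of wkof):

// Standard browser API: report this origin's current storage
// usage and quota in bytes.
navigator.storage.estimate().then(function(est) {
    console.log('Using ' + est.usage + ' of ' + est.quota + ' bytes');
});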
I don’t want to push the boundaries, so I think there are two options:
- Put a cap on the number of days it can cache
- Compress the data.
I can compress the data about 85%, but I don’t know how long the compression/decompression will take on a large dataset without some tests.
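Something like this, assuming a library such as lz-string (named here just for illustration; any string compressor with matching compress/decompress functions would work the same way):

// Sketch assuming the lz-string library. Compress the serialized
// dataset before storing, and time the operation to gauge the cost.
var json = JSON.stringify(items);

var t0 = performance.now();
var compressed = LZString.compressToUTF16(json);
var t1 = performance.now();
console.log('Compressed ' + json.length + ' -> ' + compressed.length +
            ' chars in ' + (t1 - t0).toFixed(1) + ' ms');

// And on load:
var restored = JSON.parse(LZString.decompressFromUTF16(compressed));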
You could try a combination as well if you’re concerned about the performance of the compression/decompression. That way if people just want recent data they don’t have to pay a penalty, but people could get a lot more data if they want.
Do you think a "recently quizzed" filter would be useful?
If it's active, you wouldn't get an item in the quiz again for X hours.
I personally wouldn’t use that. Either way, I don’t want my script to maintain any state (other than settings), so I don’t think I’ll add that.
How about a max-limit filter?
Let's say I want to review 100 random kanji.
I don’t think I’d be able to randomize the items returned and guarantee you’d get the number of items you wanted. If you’re getting too many items back you should probably add additional filter criteria.
I thought they were already randomized.
It was just a suggestion. If I really wanted something for myself, I would build it.
The items are presented in the quiz in a randomized order, but that doesn't mean they are randomized before they are filtered. In fact, they are likely randomized after they are filtered, for performance reasons.
Keep in mind that I only have control over the filters. All I can specify is the condition which decides whether or not to include an item. Everything else is decided by the WaniKani Open Framework and Self Study scripts.
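To make that concrete, here's roughly what a wkof filter registration looks like (the filter name and condition below are made up for illustration):

// Rough sketch of a wkof filter registration. filter_func is called
// once per item and returns true to keep it; it only sees the item in
// front of it, so it can't reliably cap the total count or randomize
// the selection.
wkof.ItemData.registry.sources.wk_items.filters.example_filter = {
    type: 'number',
    label: 'Example filter',
    default: 1,
    filter_func: function(filter_value, item) {
        // Per-item condition only; no view of the rest of the result set.
        return item.data.level >= filter_value;
    }
};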
As seanblue said, filters don’t have access to the full set of items, so limiting quiz size via filter isn’t possible. But it turns out to be really easy to add to the quiz itself.
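Something along these lines, as a generic sketch (not the actual Self Study Quiz source): shuffle the already-filtered items, then take the first max_items.

// Generic sketch: cap the quiz size by shuffling the filtered list
// (Fisher-Yates, in place) and slicing off the first max_items.
function pick_quiz_items(items, max_items) {
    for (var i = items.length - 1; i > 0; i--) {
        var j = Math.floor(Math.random() * (i + 1));
        var tmp = items[i]; items[i] = items[j]; items[j] = tmp;
    }
    return items.slice(0, max_items);
}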
Just released version 1.2.0, which includes a filter for Failed Last Review.
Just released version 1.3.0, which includes a filter for Related Items. Details are in the main post.
That’s it for all the ideas I had. If anyone else has any ideas, let me know.
Those filters are exactly what I was wishing for! Thanks a lot. I hope I manage to get them running…
If you run into any trouble, let me know!