Spaced_repetition_system_id backfill


I have written a script that goes through my WaniKani reviews and then updates Beeminder with them. In the past few days, however, it blew up to over a thousand reviews per day (the API page limit is 1000). I normally filter with the updated_after parameter set to the last 24 hours, and these reviews were indeed updated in that time frame; they were just created much earlier. Example:

      "id": 858658477,
      "object": "review",
      "url": "",
      "data_updated_at": "2020-06-15T12:28:35.253247Z",
      "data": {
        "created_at": "2019-11-15T16:22:02.287085Z",
        "assignment_id": 142746075,
        "subject_id": 3783,
        "spaced_repetition_system_id": 1,
        "starting_srs_stage": 4,
        "starting_srs_stage_name": "Apprentice IV",
        "ending_srs_stage": 5,
        "ending_srs_stage_name": "Guru I",
        "incorrect_meaning_answers": 0,
        "incorrect_reading_answers": 0

The field spaced_repetition_system_id isn’t present in the API docs, so I’m guessing that’s what is being backfilled.

Does anyone know more about this? I’m going to have to change my script to filter out backfills client-side, since there’s no created_after endpoint parameter to filter them out with.

I fell victim to this too with the Heatmap script. Viet said that they will make an announcement in a few days. You should be able to rely on the created_at timestamp to filter them out locally.
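The client-side filter could look something like this: a minimal Python sketch, assuming you already have the decoded review objects from the API (the field names match the example response above; the second review and its id are hypothetical, just to show one record passing the filter):

```python
from datetime import datetime, timezone

def parse_ts(ts):
    # WaniKani timestamps are ISO 8601 with a trailing "Z";
    # fromisoformat (Python 3.7+) accepts "+00:00" instead
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def drop_backfills(reviews, cutoff):
    """Keep only reviews actually *created* after the cutoff.

    Backfilled rows have a fresh data_updated_at but an old
    created_at, so comparing created_at weeds them out.
    """
    return [r for r in reviews
            if parse_ts(r["data"]["created_at"]) >= cutoff]

# The backfilled review from the post is dropped; a (hypothetical)
# genuinely new review survives.
cutoff = datetime(2020, 6, 14, tzinfo=timezone.utc)
reviews = [
    {"id": 858658477,  # backfill: created 2019, updated 2020
     "data": {"created_at": "2019-11-15T16:22:02.287085Z"}},
    {"id": 900000000,  # hypothetical fresh review
     "data": {"created_at": "2020-06-15T09:00:00.000000Z"}},
]
kept = drop_backfills(reviews, cutoff)
```

You'd still request pages with updated_after as before (that part of the API is unchanged); this just discards the backfilled rows after fetching.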

There is some more info in the pipeline 295787090669469698


“no blowfish allowed”…! Poor Fugu :tofugu:


Is there any perceived advantage to backfilling the passed_at dates? Neither the Heatmap script nor the Timemachine showed any changes.

Filling them in would change the Timemachine for some old items, since not all items have passed_at timestamps. Other than that, I can’t think of any application that would be affected. I’m sure it’s mostly for internal purposes.
