A better way to review patterns of use and context sentences?

I want to go back and review patterns of use and context sentences for all my vocab words. Is there a faster way to properly review these?

Just going through each separate vocab page is kind of annoying and clunky. Opening the usual “https://www.wanikani.com/vocabulary/WORD” page is slow when you’re trying to review many words.

Does anyone have any recommendations or recommend any user scripts?

Is there a way to grab the context/patterns en masse?

Any help greatly appreciated!

It’s quite simple to do with some basic development experience, using the WK API.

Maybe you can even ask ChatGPT to write you a simple script if you feed the API docs to it.

Is there a basic guide on how to do stuff like that, outside of GPT?

I do C++ but have never done anything web-based and don’t know where to begin.

If you’re familiar with programming already, then it shouldn’t be difficult. Search for C++ libraries that handle web requests and JSON, and here are the WK API docs, which have examples.

As far as I remember, you need the subjects endpoint: WaniKani API Reference