Big drop-off in correct percentage on reviews

I’m sure I’ve mentioned this before or it’s been mentioned before…

I just experienced a HUGE drop-off in correct percentage on reviews.

When I finished my review session, before the “tabulation”, it said I had 85 percent correct.

Why, then, was this the result?

Surely it’s not THIS inaccurate?

Just to note… I don’t have Tampermonkey set up on IE yet; maybe that’s the reason? Should I really need a script just to make it tabulate correctly?


I think the percentage correct shown when you’re doing the reviews counts both the meaning and the reading. As a simple example, consider a review session with a single kanji. Let’s say you get the meaning right but miss the reading. That’d show 50% in the review. The summary page, however, shows it as an incorrect answer if you got either the meaning or the reading wrong. In the simple example, you missed the reading, so the whole thing is put under the “Answered Incorrectly” header: the accuracy would show up as 0%.


Assuming you get the reading right the second time, it would say 67% correct before you proceed to the summary page: 2 of 3 cards correct.
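Here’s a tiny sketch of that counting in Python (the variable names are my own, just to illustrate the two ways of scoring the same session):

```python
# One kanji reviewed: meaning right, reading wrong, then reading right on the retry.
# Each answer is one "card"; the item only counts as correct if no card was missed.
answers = [("meaning", True), ("reading", False), ("reading", True)]

# In-session percentage: correct cards / total cards answered.
session_pct = 100 * sum(ok for _, ok in answers) / len(answers)

# Summary-page view: the whole item is "Answered Incorrectly" if any card was missed.
item_correct = all(ok for _, ok in answers)

print(f"session: {session_pct:.0f}%")             # 2 of 3 cards correct
print(f"summary: {100 if item_correct else 0}%")  # the item was missed once
```

Same answers, two different denominators, hence the two different percentages.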


That’s helpful. The session accuracy considers all of the cards you’ve seen; the summary page considers only the associated radicals, kanji, and vocabulary.

In that case, if you miss a question, the highest per-item accuracy you can achieve for that entity (a radical, kanji, or vocabulary word) is 67%: miss one part of a kanji or vocabulary word, get it right the second time, and two of its three cards are correct. Since the session percentage counts cards rather than items, if the review summary page shows an accuracy a=0.65, the maximum possible session accuracy is \frac{2a + 2(1-a)}{2a + 3(1-a)} = \frac{2}{3-a} \approx 0.85, which lines up almost exactly with the 85% in the original post.


Just a note: the one with the checkpoint got it before you. But by virtue of the math alone, you deserve one too XD

Thanks ALL for your help.