How does this site tokenize text? Split on ASCII whitespace?
Entering a Japanese sentence of any length flags the whole thing as "Dramatic Fragment: A standalone paragraph with ≤4 words". Since Japanese doesn't delimit words with spaces, a whitespace-based tokenizer would count the entire sentence as one word, which would explain the false positive.
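A minimal reproduction of the suspected behavior, assuming the site uses something equivalent to a naive whitespace split (this is a guess about its internals, not confirmed):

```python
# Hypothetical sketch: if the word count comes from splitting on
# whitespace, an unspaced Japanese sentence collapses to one token.
english = "The quick brown fox jumps over the lazy dog."
japanese = "吾輩は猫である。名前はまだ無い。"  # two sentences, no spaces

print(len(english.split()))   # 9 words
print(len(japanese.split()))  # 1 "word" -> would trip a ≤4-word fragment rule
```

A language-aware approach would segment on grapheme or word boundaries instead, e.g. ICU-style word segmentation, rather than ASCII whitespace.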