They should call them fuzzing models. They're just running through various iterations of the context until they hit a token that trips them out.
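For what it's worth, here's a minimal sketch of what that "fuzzing" picture would look like as literal code: mutate a prompt over and over and watch for an output that goes off the rails. Every name here (`model`, `mutate`, `fuzz`) is hypothetical; the `model` stub is a stand-in, not any real API, and this isn't how the models actually work internally.

```python
import random
import string

def model(prompt: str) -> str:
    """Hypothetical stand-in for a language model call."""
    # Pretend the model "trips out" when a rare character sequence shows up.
    return "???" if "~~" in prompt else "ok"

def mutate(prompt: str) -> str:
    """Randomly insert one character somewhere in the prompt."""
    i = random.randrange(len(prompt) + 1)
    return prompt[:i] + random.choice(string.printable) + prompt[i:]

def fuzz(seed: str, max_iters: int = 10_000) -> str | None:
    """Iterate context variations until one trips the model."""
    prompt = seed
    for _ in range(max_iters):
        prompt = mutate(prompt)
        if model(prompt) == "???":
            return prompt  # found a context that trips the model out
    return None

if __name__ == "__main__":
    print(fuzz("Tell me a story about a cat."))
```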