naasking 13 hours ago:
2GB in a single JSON file is definitely an outlier. A simple caveat when using this header could suffice: ensure inputs are less than 2GB.
layer8 13 hours ago:
Less than INT_MAX, more accurately. But since the library already contains a check when decreasing the counter, it might as well have a check when increasing the counter (and the line/column counters).
jeroenhd 12 hours ago:
I've seen much bigger, though technically that wasn't valid JSON but structured logging with one JSON object per line. On the other hand, I've seen exported JSON files that could grow to such sizes without anything weird going on; mine never exceeded a couple hundred megabytes only because I didn't use the software for long enough.

Restricting the input to a reasonable size is an easy workaround for sure, but this limitation isn't documented everywhere, so anyone pulling this random project into their important code wouldn't know to defend against such inputs. In a web server scenario, the 2GiB of { needed to trigger two of these overflows fits in a compressed request of a couple hundred kilobytes to two megabytes, depending on how old your server software is.
EasyMark 13 hours ago:
Or fork it and make a few modifications to handle it? I have to admit I haven't looked at the code to see whether it would allow for that.
maleldil 12 hours ago:
Not really. I deal with this every day. If the library has a limit on the input size, it should document it.