IshKebab 14 hours ago

It also feels like only half the job to me. Reminds me of SAX "parsers" that were barely more than lexers.

flohofwoe 14 hours ago | parent [-]

I mean, what else is there to do when iterating over a JSON file? Delegating number parsing and Unicode handling to the user can be considered a feature (since I can decide on my own how expensive/robust I want this to be).
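To make the trade-off concrete, here's a minimal Python sketch of what "delegating number parsing to the user" buys you. `raw_number` stands in for the raw token text such a lexer-style parser would hand back (a hypothetical value, not any real library's API):

```python
from decimal import Decimal

# A lexer-style JSON parser would yield the number's raw source text
# instead of converting it. The caller then picks the conversion that
# matches their robustness/cost needs.
raw_number = "0.30000000000000004"  # hypothetical token from such a parser

fast = float(raw_number)     # cheap, but lossy for long decimals
exact = Decimal(raw_number)  # exact, but slower and heavier
```

The same argument applies to strings: handing back the raw bytes lets the caller decide whether to pay for full Unicode validation and escape decoding.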

skydhash 14 hours ago | parent | next [-]

That is what I like about Common Lisp libraries. They are mostly about the algorithms, leaving data structures up to the user. So you make sure you get those right before calling the function.

IshKebab 12 hours ago | parent | prev [-]

Extracting the data into objects. Libraries like Serde and Pydantic do this for you. Hell, the original eval() JSON-loading trick did that too.
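For readers unfamiliar with that style: Serde and Pydantic map JSON directly into typed values. A stdlib-only Python sketch of the same idea, using a dataclass as the target type (note this sketch skips the field-type validation those libraries perform):

```python
import json
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

def load_user(raw: str) -> User:
    # json.loads builds the generic DOM (dicts/lists); the dataclass
    # then gives the data a typed shape the rest of the program uses.
    data = json.loads(raw)
    return User(name=data["name"], age=data["age"])

user = load_user('{"name": "Ada", "age": 36}')
```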

meindnoch 9 hours ago | parent [-]

Then you lose the ability to do streaming.

IshKebab 2 hours ago | parent [-]

True, but usually you only need that if your data is so large it can't fit in memory and in that case you shouldn't be using JSON anyway. (I was in this situation once where our JSON files grew to gigabytes and we switched to SQLite which worked extremely well.)

meindnoch 19 minutes ago | parent [-]

Actually, you'll hit the limits of DOM-style JSON parsers as soon as your data is larger than about half the available memory: you'd most likely want to build your own model objects from the JSON, so at some point both representations must be present in memory (unless you can incrementally destroy the parts of the DOM you're done with).
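One way around that "both in memory at once" problem, sketched below with only the Python stdlib: decode one record at a time from a stream of concatenated JSON values via `json.JSONDecoder.raw_decode`, so only a single record's DOM is alive while you convert it to your model object. (A sketch with assumptions: records are concatenated JSON values, and a decode error is treated as "need more input," so a genuinely malformed tail would be silently dropped.)

```python
import json

def iter_records(chunks):
    """Incrementally decode a stream of concatenated JSON values.

    Only one record's DOM exists at a time; the caller converts each
    yielded value to a model object and lets the DOM be collected.
    """
    decoder = json.JSONDecoder()
    buf = ""
    for chunk in chunks:
        buf += chunk
        while True:
            buf = buf.lstrip()
            if not buf:
                break
            try:
                value, end = decoder.raw_decode(buf)
            except json.JSONDecodeError:
                break  # record is split across chunks; wait for more
            yield value
            buf = buf[end:]

# Record boundaries need not align with chunk boundaries:
records = list(iter_records(['{"id": 1} {"id"', ': 2} {"id": 3}']))
```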

Anyhow, IMO a proper JSON library should offer both, in a layered approach. That is, a lower level SAX-style parser, on top of which a DOM-style API is provided as a convenience.
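The layered design can be sketched in a few dozen lines of Python: a SAX-ish tokenizer that only yields raw tokens, with a DOM builder layered on top as the convenience API. This is a toy for a subset of JSON (no string escapes, no scientific notation), meant only to show the layering, not a spec-complete parser:

```python
import json
import re

# Layer 1: SAX-style tokenizer. Yields raw token text; callers that
# want streaming consume this directly and never build a tree.
TOKEN = re.compile(r'\s*([{}\[\],:]|"[^"]*"|-?\d+(?:\.\d+)?|true|false|null)')

def tokens(text):
    text = text.strip()
    pos = 0
    while pos < len(text):
        m = TOKEN.match(text, pos)
        if not m:
            raise ValueError(f"bad JSON at offset {pos}")
        pos = m.end()
        yield m.group(1)

# Layer 2: DOM builder, implemented purely on top of the token stream.
def parse_value(tok, stream):
    if tok == '{':
        obj = {}
        tok = next(stream)
        while tok != '}':
            key = json.loads(tok)   # reuse stdlib to unquote the key
            next(stream)            # consume ':'
            obj[key] = parse_value(next(stream), stream)
            tok = next(stream)      # ',' or '}'
            if tok == ',':
                tok = next(stream)
        return obj
    if tok == '[':
        arr = []
        tok = next(stream)
        while tok != ']':
            arr.append(parse_value(tok, stream))
            tok = next(stream)
            if tok == ',':
                tok = next(stream)
        return arr
    return json.loads(tok)          # scalar: string, number, bool, null

def parse(text):
    stream = tokens(text)
    return parse_value(next(stream), stream)
```

The point of the layering is that `parse` is a small convenience on top of `tokens`; anyone who needs streaming (or wants to defer number/string conversion) uses the token layer directly.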