| ▲ | jeffbee 2 hours ago | |
JSON parsing is probably one of the most thoroughly optimized subsystems in the whole industry at this point. Obviously there are ways to encode the same data that are easier to parse (e.g. instead of absolute floating point coordinates, use integer deltas along a path in some reasonable CRS) but because this inefficient representation is so common and so long-standing the parsing is faster than people think.
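The delta-along-a-path idea can be sketched quickly. This is how tile formats like Mapbox Vector Tiles encode geometry: quantize each coordinate to an integer grid, then store the difference from the previous point, so consecutive vertices yield small integers that are cheap to parse and compress well. The `SCALE` value and function names below are illustrative, not from any particular format.

```python
SCALE = 1_000_000  # assumed grid: 1e-6 degree units (~0.1 m at the equator)

def delta_encode(path):
    """path: list of (lon, lat) floats -> list of (dx, dy) integer deltas."""
    out = []
    px = py = 0
    for lon, lat in path:
        x, y = round(lon * SCALE), round(lat * SCALE)
        out.append((x - px, y - py))  # store the step, not the position
        px, py = x, y
    return out

def delta_decode(deltas):
    """Invert delta_encode: accumulate deltas back into float coordinates."""
    out = []
    x = y = 0
    for dx, dy in deltas:
        x += dx
        y += dy
        out.append((x / SCALE, y / SCALE))
    return out
```

After the first vertex, every entry is a small integer (the step to the next vertex), which is where the parsing and size win comes from.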
| ▲ | trgn 2 hours ago | parent [-] | |
The default parsers all load the entire document into memory, which is not good. So you need a stream-based parser, which nobody bothers to write or use for JSON — especially since GeoJSON is a web format, and people just default to JSON.parse, which is blocking. And even if you did use a custom one, it likely won't be tailored to GeoJSON: because key order isn't guaranteed, any GeoJSON parser has to do some acrobatics to find the reference system, deal with arbitrarily nested geometries, etc. It's a good format for what it is, but it's not a great geo format. A geo format needs to be easily scannable and, better still, carry a geometry index so you can seek quickly.
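The difference between load-everything and streaming can be sketched with the stdlib. Instead of `json.loads` materializing the whole parsed document, `json.JSONDecoder.raw_decode` can pull one feature at a time out of a FeatureCollection. This is a simplified illustration, not a robust parser: it assumes the text is already in memory (a real streaming parser would refill a buffer from a file or socket), and the naive string search for `"features"` could mis-fire if that string appears inside a property value.

```python
import json

def iter_features(text):
    """Yield Feature dicts from a GeoJSON FeatureCollection one at a time,
    rather than building the full feature list up front."""
    dec = json.JSONDecoder()
    i = text.index('"features"')        # naive: assumes this is the real key
    i = text.index('[', i) + 1          # start of the features array
    while True:
        # skip whitespace and commas between array elements
        while i < len(text) and text[i] in ' \t\r\n,':
            i += 1
        if i >= len(text) or text[i] == ']':
            return
        feature, i = dec.raw_decode(text, i)
        yield feature
```

With a generator like this, each feature can be processed and discarded, so peak memory tracks the largest single feature rather than the whole file.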