dgan 4 days ago

I have to confess, I use Protobufs for everything. They convert to pure Python (a la dataclass), to JSON strings, and to binary strings, so I literally shove them everywhere: network, logic, disk.

BUT when doing heavy computation (C++, not Python!), don't forget to convert to plain vectors first; Protobufs are horribly inefficient.

the__alchemist 4 days ago | parent [-]

Protobuf is fine if:

A: You control both ends of the serialized line, or B: The other end of the line expects protobufs.

There are many [de]serialization scenarios where you are interfacing with a third-party API (an HTTP/JSON web API, a given IC's comm protocol as defined in its datasheet, etc.).

dontlaugh 4 days ago | parent | next [-]

You can still use a protobuf schema to parse/generate JSON, in most cases.
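
For illustration, this is roughly what that looks like in Python with google.protobuf.json_format (a sketch; ApiUser and myapi_pb2 are hypothetical stand-ins for whatever generated message mirrors the third-party API's schema):

    from google.protobuf import json_format
    from myapi_pb2 import ApiUser  # hypothetical generated class

    incoming = '{"name": "Ada", "age": 36}'

    # JSON -> proto: parse the third party's JSON straight into a typed message
    user = json_format.Parse(incoming, ApiUser())

    # proto -> JSON: generate JSON from the same schema when sending back
    outgoing = json_format.MessageToJson(user)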

dgan 4 days ago | parent | prev [-]

I think even if a 3rd-party API expects JSON, you could still map their models to proto; I haven't encountered this case, though.

Might still be challenging to convince proto to output exactly what you want.
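
One knob that helps with that (still a sketch, reusing the hypothetical `user` message from above): by default MessageToJson emits lowerCamelCase keys, but you can keep the original .proto field names if that is what the API expects:

    from google.protobuf import json_format

    # Default output uses lowerCamelCase keys, e.g. "userName".
    # preserving_proto_field_name keeps the snake_case names from the .proto:
    json_format.MessageToJson(user, preserving_proto_field_name=True)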

the__alchemist 4 days ago | parent [-]

I don't understand then. Here is my mental model; as described, you can see why I'm confused:

JSON: UTF-8 text serialization format, with brackets, commas, fields represented by strings, etc.

Protobuf: Binary serialization format that makes liberal use of varints, including to define field numbers, lengths, etc. Kind of verbose, but not heinous.

So, you could start and end your journey with the same structs and serialize with either. If you try to send a protobuf to an HTTP API that expects JSON, it won't work! If you try to send JSON to an ESP32 running ESP-Hosted, likewise.
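
To make the contrast concrete, here is one hypothetical message (call it Point, with int32 fields x and y) in both encodings; the bytes shown are just the standard tag/varint layout:

    from google.protobuf import json_format
    from point_pb2 import Point  # hypothetical generated class

    p = Point(x=3, y=4)

    p.SerializeToString()
    # -> b'\x08\x03\x10\x04'
    #    0x08 = field 1 (x), varint wire type; 0x03 = 3
    #    0x10 = field 2 (y), varint wire type; 0x04 = 4

    json_format.MessageToJson(p)
    # -> '{\n  "x": 3,\n  "y": 4\n}'  (plain UTF-8 text)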

dgan 3 days ago | parent [-]

Ah, I think I understand your confusion. The proto package allows conversion between the binary messages and their JSON equivalents. So you can still use the proto objects in your code, and only send out JSON when required.
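
Sketch of the receiving side, reusing the hypothetical Point message from above: either wire format parses back into the same type, so the application code never has to care which one was on the wire:

    from google.protobuf import json_format
    from point_pb2 import Point  # hypothetical generated class

    q = Point()
    q.ParseFromString(b'\x08\x03\x10\x04')              # from a binary protobuf endpoint
    r = json_format.Parse('{"x": 3, "y": 4}', Point())  # from a JSON endpoint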