jcattle 4 days ago

If you don't care much about the accuracy of your data (say, you only need a few decimal places in your floats), you don't generate huge amounts of data, and you don't need to pass it back and forth across different tools, then yes, CSV CAN be nice.

I wouldn't write it a love letter though. There's a reason that parquet exists.

christophilus 4 days ago | parent [-]

CSV is just a string serialization, so you can represent floats with any accuracy you choose. It’s streamable and compressible, so large files are fine, though maybe not “huge” depending on how you define “huge”. It works fine passing back and forth between various tools, so…
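A minimal Python sketch of that point, assuming the writer uses shortest round-trip formatting such as Python's repr() (the 0.4288 value is just an illustration):

```python
# A float survives a CSV round trip when the writer emits enough
# digits. repr() produces the shortest decimal string that parses
# back to the bit-identical float.
x = 0.4288
s = repr(x)           # what a careful CSV writer would emit
assert float(s) == x  # exact round trip
```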

Without more specifics, I disagree with your take.

jcattle 4 days ago | parent [-]

It's only fine to pass between various tools if you tell each one exactly how to serialize your values. Each tool interprets column values in some way, and if you want to work with those values in any meaningful way, it converts them to its own representation of whatever data type it guesses the column contains.

Going from tool to tool will leave you with widely different representations of the original data you've put in. Because, as you said yourself, none of this data carries any meaning. It's just strings. The CSV and the tools don't care whether one column was a ms-epoch timestamp and another was a float in mathematical notation. It all goes through each tool's deserialization-serialization mangle, and you'll have completely different data on the other end.
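A small stdlib-only sketch of that mangle, with made-up column values: a tool that types a zero-padded ID as int and a fixed-precision ratio as float rewrites both on the way out.

```python
import csv
import io

# Hypothetical input row: zero-padded ID, fixed-precision ratio, ms-epoch.
row = ["00042", "1.2500", "1700000000000"]

# A typical tool infers types on read...
typed = [int(row[0]), float(row[1]), int(row[2])]

# ...and serializes its own representation on write.
buf = io.StringIO()
csv.writer(buf).writerow(str(v) for v in typed)
print(buf.getvalue().strip())  # leading and trailing zeros are gone
```

The bytes round-trip as *equal values*, but "00042" and "1.2500" as written are not what went in.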

bsghirt 4 days ago | parent [-]

How would you deserialise the entity "0.4288"?
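In Python, at least, there are several defensible answers, and they disagree; the CSV itself gives no hint which one the producer meant:

```python
from decimal import Decimal

s = "0.4288"
as_str = s                 # keep it as text: lossless, but unusable as a number
as_float = float(s)        # nearest binary double; 0.4288 is not representable
as_decimal = Decimal(s)    # exact decimal value

# The float and the exact decimal are different numbers.
print(Decimal(as_float))   # the float's true value, a long expansion
assert Decimal(as_float) != as_decimal
```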