- cross-posted to:
- lobsters
There exists a peculiar amnesia in software engineering regarding XML. Mention it in most circles and you will receive knowing smiles, dismissive waves, the sort of patronizing acknowledgment reserved for technologies deemed passé. “Oh, XML,” they say, as if the very syllables carry the weight of obsolescence. “We use JSON now. Much cleaner.”


CSV >>> JSON when dealing with large tabular data:
1 (parsing row by row) can be solved with JSONL, but 2 (the size cost of repeating every key) is unavoidable.
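As a rough illustration (the file name and fields here are invented), JSONL keeps each record as one JSON object per line, so rows can be parsed one at a time without loading the whole document, while every key is still spelled out per row:

```python
import json

# rows.jsonl is a hypothetical file with one JSON object per line, e.g.
#   {"id": 1, "name": "ada", "score": 9.5}
#   {"id": 2, "name": "bob", "score": 7.1}
with open("rows.jsonl") as f:
    for line in f:
        row = json.loads(line)          # parse one record at a time
        print(row["id"], row["score"])  # keys still repeat in every line
```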
No:
Just use Zarr or something similar for array data. A table with more than 200 rows isn’t “human readable” anyway.
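For reference, a minimal sketch of the Zarr suggestion, assuming the zarr-python package (v2-style API) and made-up array shapes: the table is stored as a chunked, compressed binary array rather than text.

```python
import numpy as np
import zarr

# Hypothetical numeric table: a million rows, three float columns.
data = np.random.rand(1_000_000, 3)

# Store it as a chunked, compressed binary array instead of a text table.
z = zarr.open("table.zarr", mode="w", shape=data.shape,
              chunks=(10_000, 3), dtype="f8")
z[:] = data

# Read back a single row later without loading the whole array.
z = zarr.open("table.zarr", mode="r")
print(z[42, :])
```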
Yes…but compression
And with CSV you just gotta pray that your parser parses the same as their writer…and that their writer was correctly implemented…and that they set the settings correctly
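A small example of that dialect roulette, with invented values: the same bytes parse very differently depending on what the reader assumes about delimiters and quoting.

```python
import csv
import io

# Output of a writer that used ';' as the delimiter and quoted one field.
raw = 'id;comment\n1;"he said ""hi; bye"""\n'

# A reader assuming the default comma dialect sees one mangled column per row.
print(list(csv.reader(io.StringIO(raw))))

# A reader configured to match the writer recovers the intended fields.
print(list(csv.reader(io.StringIO(raw), delimiter=";", quotechar='"')))
```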
Compression adds another layer of complexity for parsing.
JSON can also have configuration mismatch problems. The main one that comes to mind is case (in)sensitivity for keys.
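A tiny example of that kind of mismatch (field names are made up): JSON keys are case-sensitive, so a consumer looking up a differently-cased key silently gets nothing.

```python
import json

payload = '{"UserId": 7, "Name": "Ada"}'  # the producer's casing
record = json.loads(payload)

print(record.get("userId"))  # None: JSON keys are case-sensitive, lookup misses
print(record["UserId"])      # 7: only the exact casing works
```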
Nahh, you’re nitpicking there; large CSVs are gonna be compressed anyways
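For what it’s worth, a minimal sketch of that extra layer (the file name is hypothetical): Python can decompress a gzipped CSV as a text stream and hand it straight to the parser.

```python
import csv
import gzip

# rows.csv.gz is hypothetical; mode "rt" decompresses to a text stream on the fly.
with gzip.open("rows.csv.gz", "rt", newline="") as f:
    for row in csv.reader(f):
        print(row)
```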
In practice I’ve never met a JSON I can’t parse; every second CSV is unparseable