
> JSON, Protobuf, etc. can be very efficiently streamed

Protobuf yes, JSON no: you can't properly deserialize a JSON collection until it has been fully consumed. The same issue you're highlighting for serializing MessagePack occurs when deserializing JSON. I think MessagePack is very much designed with streaming in mind: it makes sense to trade write efficiency for read efficiency, especially since, in msgpack's case, the party primarily affected by the tradeoff is the one making the cut. It all depends on your workload, but I've run benchmarks for past work where msgpack came out on top. It can often be a good fit when you need to do stuff in Redis.

(If anyone thinks to counter with JSONL, well, there's no reason you can't do the same with msgpack).




The advantage of JSON for streaming is on serialization. A server can begin streaming the response to the client before the length of the data is known.
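That write-side property can be sketched in a few lines of Python (the generator name is hypothetical, not a particular framework's API). The first chunk is emitted before the total element count or byte length is known:

```python
import json

def stream_json_array(items):
    # Yield a JSON array piece by piece: the opening bracket and the
    # first elements go out before the iterable is exhausted, so the
    # server never needs the full length in advance.
    yield "["
    for i, item in enumerate(items):
        if i:
            yield ","
        yield json.dumps(item)
    yield "]"
```

A server can feed these chunks straight into a chunked HTTP response while the underlying iterable is still producing items.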

JSON Lines is particularly helpful for JavaScript clients where streaming JSON parsers tend to be much slower than JSON.parse.
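The reason is that each line of JSON Lines is a complete document, so the consumer can hand every record to the stock parser as it arrives. A minimal sketch (helper name is mine):

```python
import json

def iter_jsonl(lines):
    # Each non-blank line is a self-contained JSON document, so every
    # record can go through the ordinary parser (json.loads here,
    # JSON.parse in a browser) the moment its line arrives -- no
    # incremental JSON parser required.
    for line in lines:
        if line.strip():
            yield json.loads(line)
```

In a browser the same pattern works by splitting the incoming byte stream on newlines and calling `JSON.parse` per line.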


Sorry, I was mostly thinking of writes. With JSON the main problem is, as you say, read efficiency.




