I was totally down with this - a tight, efficient vector graphics format with After Effects integration, super - up until the point they said these animations were JSON files.
Huh?
Didn't I just read a whole blog post saying that one of their requirements was fast loading from disk, small bandwidth usage, etc? And that's what justified creating an entirely new image format from scratch, not something the internet is really suffering a shortage of already?
I know JSON is fashionable but that ending just seems kind of ridiculous. What's wrong with a binary format? Use Cap'n'Proto and mmap it! Saying numbers represented as text compresses well was the icing on the cake. No shit!
While a binary format would be better over the wire, once it's in memory the format really doesn't matter. In fact I would argue that the decision to use JSON versus XML versus a binary format is essentially immaterial, as it'll get cached once and likely never downloaded again. At least as far as Facebook's usage is concerned (they load these six animations once and never again).
Also I think you have to pick something human-readable here. One of the great things about SVG is that it's XML, so you can read it, tweak it, etc. That's super handy when you want to make small changes that only affect the image. So I see using JSON as a sensible choice.
Don't forget that many formats come in two flavors: a human-readable one, typically used during debugging, and a binary one used for optimal over-the-wire transport. I don't see why that couldn't happen here at a later time.
Edit: I don't understand the downvotes. Over-optimizing is a thing. For a one-time download I don't see the handful of milliseconds you're saving mattering. Whether you receive binary or JSON, you will store it in an optimal format on the device...
As someone who has to deal with mobile perf issues, JSON serialization/deserialization can take a lot more of a perf budget than you realize. A flatbuffer version would definitely be faster hands down.
With gzip the sizes come out about the same, but processing speed is another matter.
Also if you gzip the binary version and the JSON version, the gzipped binary is still going to be much smaller.
Same with CSV data: for download size it really pays off to preprocess it and pack it into arrays of binary structs before compression.
This is part of a bigger class of wins you get from preprocessing data before compression, and it is often surprisingly effective. For example, delta coding or a columnar layout (SoA vs. AoS) can yield big improvements depending on the data.
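To make that concrete, here's a rough Python sketch with made-up rows (nothing to do with the actual animation format): serialize the same data as JSON text, as an array of binary structs, and as delta-coded columns, then compress each and compare.

    import json, random, struct, zlib

    random.seed(0)
    # Fake "CSV-like" rows: (timestamp, x, y) with slowly changing values.
    rows, t, x, y = [], 0, 100, 200
    for _ in range(10000):
        t += random.randint(1, 3)
        x += random.randint(-2, 2)
        y += random.randint(-2, 2)
        rows.append((t, x, y))

    # 1. Plain JSON text.
    as_json = json.dumps(rows).encode()

    # 2. Array of binary structs: each row packed as three 32-bit ints.
    aos = b"".join(struct.pack("<iii", *r) for r in rows)

    # 3. Columnar (SoA) layout with delta coding per column.
    def deltas(col):
        return [col[0]] + [b - a for a, b in zip(col, col[1:])]

    soa_delta = b"".join(
        struct.pack("<%di" % len(col), *deltas(list(col))) for col in zip(*rows)
    )

    for name, blob in [("JSON", as_json), ("binary AoS", aos), ("delta SoA", soa_delta)]:
        print("%10s  raw=%7d  gzipped=%7d" % (name, len(blob), len(zlib.compress(blob, 9))))

The delta-coded columns tend to compress best because the residuals are small and repetitive, but how much any of this helps really does depend on the data.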
> As someone who has to deal with mobile perf issues, JSON serialization/deserialization can take a lot more of a perf budget than you realize. A flatbuffer version would definitely be faster hands down
With something that's grabbed once and cached likely forever, I think the tiny bit of overhead for JSON won't make any difference. I've done this too, and yes, of course a flatbuffer would be faster, but the speed only matters for the initial download and caching. Beyond that you'd keep it in whatever structure is most efficient for the app to re-use.
Unless they store the cached version as a flatbuffer or similar format, you will suffer a penalty on every app startup. And startup is where a lot of mobile perf work is focused. I haven't heard of SVG renderers storing SVG XML files in a more efficient format. It's like using JPEG 2000 compressed images vs. PNG. And if you use a lot of them, it will add up.
> Unless they store the cached version as a flatbuffer or similar format, you will suffer a penalty on every app startup. And startup is where a lot of mobile perf work is focused.
Why would you pull data in one format and keep it in that format forever if it's not optimal for the device AND it doesn't need to be modified and sent back? Whenever I work on mobile apps and have to cache data coming over as JSON or XML, I store it in a form I can quickly re-access and NOT in its original format.
This thread seems like a conversation about how you can shoot yourself in the foot when it's very easy to avoid.
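For what it's worth, the pattern I mean looks roughly like this in Python (a minimal sketch; the file name, the fake animation JSON, and pickle as the on-device cache format are all stand-ins for whatever a real app would actually use):

    import json, os, pickle

    CACHE_PATH = "animation.cache"  # hypothetical on-device cache file

    def fetch_animation_json():
        # Stand-in for the one-time download of the animation JSON.
        return '{"v": "4.6.0", "fr": 60, "layers": [{"ty": 4, "nm": "shape"}]}'

    def load_animation():
        if os.path.exists(CACHE_PATH):
            # Every later startup: read the already-parsed structure back,
            # no JSON parsing involved.
            with open(CACHE_PATH, "rb") as f:
                return pickle.load(f)
        # First run only: pay the JSON parse cost once, then persist the result.
        animation = json.loads(fetch_animation_json())
        with open(CACHE_PATH, "wb") as f:
            pickle.dump(animation, f, protocol=pickle.HIGHEST_PROTOCOL)
        return animation

    print(load_animation()["fr"])

The first launch pays the JSON parse cost once; every later launch just reads back the already-parsed structure.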
> I haven't heard of SVG renderers storing SVG XML files in a more efficient format.
Me neither. Sounds like it could be doable. Not sure how this relates to our conversation though.
Hey, zigzigzag! The reason we went with JSON initially was that it's easy to work with, debug, and tweak, which made the development process for this project much easier. That's definitely not to say there aren't some notable improvements we can make to this library!
For those who use JSON regularly and aren't familiar with Protobufs (or one of the variants like Cap'n'Proto), they're an amazing binary serialization format that is tightly packed, efficient to encode and decode, and designed around supporting protocol changes.
Fields are serialized to tag numbers instead of field names in the binary format, so it's easy to rename fields. This also saves a ton of space versus JSON.
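To make the tag-number point concrete, here's a rough Python sketch of the varint wire encoding; the message and its field numbers are invented for illustration, not taken from any real .proto:

    import json

    def varint(n):
        # Base-128 varint encoding, as used by the protobuf wire format.
        out = bytearray()
        while True:
            byte = n & 0x7F
            n >>= 7
            out.append((byte | 0x80) if n else byte)
            if not n:
                return bytes(out)

    def encode_int_field(field_number, value):
        # Key byte(s) = (field number << 3) | wire type; wire type 0 is varint.
        return varint((field_number << 3) | 0) + varint(value)

    # Hypothetical message with field numbers 1, 2, 3.
    wire = encode_int_field(1, 60) + encode_int_field(2, 1080) + encode_int_field(3, 1920)
    text = json.dumps({"frame_rate": 60, "width": 1080, "height": 1920})

    print(len(wire), "bytes on the wire vs", len(text), "bytes of JSON")

Three integer fields come out to a handful of bytes, while the JSON version repeats every field name as text.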
Proto "messages", analagous to structs in other languages, are typesafe, negating the need for a separate schema definition language. They support optional and sum types too.
So many languages have great protobuf libraries available. (I'm hoping one becomes available for Rust soon, as well as gRPC.) It would be amazing if the browser vendors came together and standardized it as a first-class browser serialization format. It'd save so much bandwidth to emit protos instead of JSON.