Comment by davedx a day ago

"Ultra-efficient"

Searched the article: no mention of gzip, nor of the fact that most of the time all that text data (HTML, JS, and CSS too!) you're sending over the wire will be automatically compressed to... an efficient binary format!

So really, the author should compare Protobuf to gzipped JSON.
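
A minimal sketch of that comparison, using Node's built-in zlib (the payload shape here is invented for illustration):

```ts
// Compare raw JSON vs. gzipped JSON on the wire (Node.js).
import { gzipSync } from "node:zlib";

// Hypothetical payload, just to have something repetitive and realistic-ish.
const payload = JSON.stringify({
  users: Array.from({ length: 1000 }, (_, i) => ({
    id: i,
    name: `user-${i}`,
    active: i % 2 === 0,
  })),
});

const raw = Buffer.byteLength(payload);
const gzipped = gzipSync(payload).length;

console.log(`raw JSON:     ${raw} bytes`);
console.log(`gzipped JSON: ${gzipped} bytes (${((gzipped / raw) * 100).toFixed(1)}% of raw)`);
```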

chillfox a day ago

Last time I was evaluating different binary serialization formats for an API I was really hoping to get to use one of the cool ones, but gzipped JSON just beat everything and it wasn't even close.

  • theshrike79 a day ago

    There are some compression formats that perform better than gzip, but it's very dependent on the data you're compressing and your timing requirements (i.e., whether bandwidth or CPU is more important to conserve).

    But in the end, compressed JSON is pretty good. Not perfect, but good enough for many, many things.
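
A hedged sketch of that trade-off, again with Node's built-in zlib: gzip against brotli at a fast and a maximum quality level. The test data is made up, and as the comment says, results are highly data-dependent:

```ts
// Size/CPU trade-off: gzip vs. brotli at two quality levels (Node.js).
import { gzipSync, brotliCompressSync, constants } from "node:zlib";

const data = JSON.stringify(
  Array.from({ length: 5000 }, (_, i) => ({ id: i, value: `row-${i}` }))
);

const candidates: Array<[string, () => Buffer]> = [
  ["gzip (default)", () => gzipSync(data)],
  ["brotli q4", () => brotliCompressSync(data, {
    params: { [constants.BROTLI_PARAM_QUALITY]: 4 },
  })],
  ["brotli q11", () => brotliCompressSync(data, {
    params: { [constants.BROTLI_PARAM_QUALITY]: 11 },
  })],
];

for (const [label, fn] of candidates) {
  const start = process.hrtime.bigint();
  const out = fn();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${out.length} bytes in ${ms.toFixed(1)} ms`);
}
```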

  • tmikaeld a day ago

    So a potential 4-9% difference...

    NOT worth it, especially if the whole infra is already using JSON.

    • bloppe 16 hours ago

      Per the post:

      > This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON - JavaScript code uses JSON as its object literal format - it is amazing that Protobuf managed to be faster than its counterpart.

      Presumably the difference would be much larger for languages that can actually represent a statically-typed structure efficiently.

      Also, the tradeoffs have changed since Protobuf was invented. Network bandwidth has gotten cheaper faster than CPU throughput has, so encoding/decoding speed is more important than packet size in many situations. And if you don't use gzip, Protobuf is much faster (especially in non-JS languages, and especially if you use fixed-size integer types instead of varints).
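
To illustrate the varint point: each varint byte carries 7 payload bits plus a continuation bit, so encoders and decoders loop and branch per byte, while fixed32/fixed64 fields are a straight 4- or 8-byte copy. A hand-rolled sketch, not a real protobuf library:

```ts
// Why varints cost CPU: each output byte is computed with masks, shifts,
// and a branch, while a fixed32 is a plain 4-byte copy.
function encodeVarint(value: number): Uint8Array {
  const out: number[] = [];
  while (value > 0x7f) {
    out.push((value & 0x7f) | 0x80); // low 7 bits, continuation bit set
    value >>>= 7;
  }
  out.push(value); // final byte, continuation bit clear
  return Uint8Array.from(out);
}

console.log(encodeVarint(1));       // Uint8Array [ 1 ]      -> 1 byte
console.log(encodeVarint(300));     // Uint8Array [ 172, 2 ] -> 2 bytes
console.log(encodeVarint(2 ** 28)); // 5 bytes; as fixed32 it would be 4
```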

JodieBenitez a day ago

This is so obvious to me... JSON vs. JSON + mod_deflate is just night and day.

andersmurphy a day ago

Or stream brotli/zstd-compressed JSON/HTML, where the compression window can be reused for the duration of the connection.
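
Roughly what that looks like with Node's built-in brotli stream: one compressor per connection, flushed after each message so earlier messages seed the window for later ones (transport wiring omitted):

```ts
// One brotli context for the whole connection: messages share a single
// compression window, so repeated JSON structure gets cheaper over time.
import { createBrotliCompress, constants } from "node:zlib";

const compressor = createBrotliCompress();
compressor.on("data", (chunk: Buffer) => {
  // In a real server this chunk would be written to the socket/response.
  console.log(`would send ${chunk.length} compressed bytes`);
});

function sendMessage(obj: unknown): void {
  compressor.write(JSON.stringify(obj) + "\n");
  // Flush so the message goes out now, without resetting the window.
  compressor.flush(constants.BROTLI_OPERATION_FLUSH);
}

// Repeated, similar payloads compress better and better over the stream.
sendMessage({ type: "update", user: "alice", status: "online" });
sendMessage({ type: "update", user: "bob", status: "online" });
```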

  • WorldMaker a day ago

    Brotli also benefits out of the box, even with a fresh compression window, because some common JSON patterns are always present in Brotli's static dictionary.

  • akoboldfrying a day ago

    But in that case the server/CDN won't be able to cache the gzipped forms of the individual files, so it's probably a win for highly dynamic/user-specific content, but a loss for static or infrequently regenerated content.

MangoToupe a day ago

I would think that serialization/deserialization time would be the largest drawback of JSON (at least for serving APIs). Pretty much all the other pain points can slowly be ironed out over time, albeit with deeply ugly solutions.
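
A quick, illustrative (not rigorous) way to measure that cost in isolation:

```ts
// Time JSON.stringify/JSON.parse on a moderately large document.
// Illustrative only; a real benchmark needs warmup and many iterations.
const doc = Array.from({ length: 100_000 }, (_, i) => ({
  id: i,
  name: `item-${i}`,
  tags: ["a", "b"],
}));

let t = performance.now();
const text = JSON.stringify(doc);
console.log(`stringify: ${(performance.now() - t).toFixed(1)} ms, ${text.length} chars`);

t = performance.now();
JSON.parse(text);
console.log(`parse:     ${(performance.now() - t).toFixed(1)} ms`);
```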

  • thayne a day ago

    It depends on what your data looks like. If your content is mostly UTF-8 text, with dynamic keys, then I wouldn't expect protobuf to have much of an advantage over JSON for parsing to an equivalent structure. On the other hand, if you have binary data that needs to be base64 encoded in JSON, then protobuf has a significant advantage.
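
The base64 penalty is easy to quantify: JSON has no raw byte type, so embedded binary grows by roughly 4/3, while a protobuf `bytes` field carries it verbatim. A tiny sketch:

```ts
// JSON can't carry raw bytes, so binary must be base64-encoded (~4/3 size).
// A protobuf `bytes` field would ship the 3000 bytes as-is.
const binary = Buffer.alloc(3000, 0xab); // stand-in for real binary data

const asBase64 = binary.toString("base64");
console.log(`raw bytes:      ${binary.length}`);   // 3000
console.log(`base64 in JSON: ${asBase64.length}`); // 4000, plus quotes
```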

coolThingsFirst 2 hours ago

How on earth would gzipping a larger amount of data be more efficient than gzipping a smaller amount of data?