Comment by morshu9001 2 days ago
Yes, but I don't want to worry about what parts of the spec are implemented on each end. If you removed all the unnecessary stuff and formed a new standard, it'd basically be protobuf.
Comment by morshu9001 2 days ago
In what ASN1 application is the protobuf spec too limited? I've used protobuf for tons of different things, and it's always felt right. Though I understand certain encodings of ASN1 can have better performance for specific things.
Open types, constrained types, parameterized types, not needing tags, etc.
Numbers bigger than 64 bits, character sets other than Unicode (and ASCII), OIDs, etc.
These are just scalars that you'd encode into bytes yourself. I guess it's slightly annoying that both ends have to agree on how to serialize them rather than protobuf doing it, but it's not a big enough problem to matter.
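Rough Python sketch of what I mean for the bigger-than-64-bit case (my own illustration, not anyone's real code; encode_bigint/decode_bigint are made-up names, and it assumes both ends agree on an unsigned big-endian convention with the result carried in a protobuf bytes field):

    # Hypothetical convention: unsigned, big-endian, minimal length.
    # Protobuf itself won't pick this for you; both ends just have to agree.
    def encode_bigint(n: int) -> bytes:
        length = max(1, (n.bit_length() + 7) // 8)
        return n.to_bytes(length, "big")

    def decode_bigint(b: bytes) -> int:
        return int.from_bytes(b, "big")

    # A 101-bit value survives the round trip through a bytes field.
    n = 2**100 + 12345
    assert decode_bigint(encode_bigint(n)) == n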
Also I don't see special ASN1 support for non-Unicode string encodings, only subsets of Unicode like ASCII or printable ASCII. It's a big can of worms once you bring in things like Latin-1.
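Small Python sketch of the can of worms (again mine, assuming a proto3 string field, which has to hold UTF-8): Latin-1 text can't go into a string field as-is, so you either transcode it or ship raw bytes plus a charset agreed on out of band.

    text = "café"
    latin1 = text.encode("latin-1")   # b'caf\xe9'    -- one byte for é
    utf8 = text.encode("utf-8")       # b'caf\xc3\xa9' -- two bytes for é

    # Safe for a protobuf string field: transcode to UTF-8 first.
    assert latin1.decode("latin-1").encode("utf-8") == utf8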
I do not agree. Which parts are necessary depends on the application; there is not one good way to do it for everyone (and Protobuf is too limited). You will need to implement the parts specific to your schema/application on each end, and if the format does not have the data types that you want, then you must add them in a messier way (especially when using JSON).