Comment by cryptonector 20 hours ago
Here's the problem, though: people have used the absence of tooling to justify creating new, supposedly superior schemas and codecs that by definition have strictly less tooling available on day zero, and which invariably turn out to be worse than ASN.1/DER were in 1984 because the authors also refused to study the literature for good ideas they could pick up. That's how we end up with:
- PB being a TLV encoding, just like DER, with all the same problems
(Instead, PB should have been inspired by XDR or OER, not DER.)
- PB's IDL requiring an explicit tag on every field of every data structure(!), even though ASN.1 never required tagging every field, and even though ASN.1 eventually adopted automatic tagging.
- PB's very naive approach to extensibility that is just like 1984 ASN.1's.
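The TLV point above can be made concrete. A minimal Python sketch (hypothetical helper names, not any real protobuf library) of how PB puts a field on the wire: a varint key combining the field number and wire type, followed by the value bytes, which is structurally the same tag-then-value shape as a DER triple:

```python
def varint(n: int) -> bytes:
    """Encode an unsigned integer as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # more bytes follow
        else:
            out.append(b)
            return bytes(out)

def pb_field(field_number: int, wire_type: int, payload: bytes) -> bytes:
    """A wire-format field: key = (field_number << 3) | wire_type, then value."""
    return varint((field_number << 3) | wire_type) + payload

# Field number 1, wire type 0 (varint), value 150 -> bytes 08 96 01
encoded = pb_field(1, 0, varint(150))
```

This is the classic `08 96 01` example from the protobuf wire-format docs: tag first, value second, and length information folded into the wire type or a length prefix, exactly the pattern the parent comment is criticizing.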
It's a mistake. Some people, when faced with a dearth of tooling, will write that tooling. Other people will declare the technology in question a nightmare, and some of those will then go on to invent a worse wheel.
I'd be ecstatic to use something other than ASN.1, if only it weren't a poor reinvention of it.
Protobuf ended up with more tooling in the end, though, and it didn't take very long to get there. This is similar to how JSON replaced XML for many use cases.