Comment by alganet 5 hours ago

It seems convenient for some cases. A large object with many keys, for example.

I don't see it as particularly convenient if I want to stream a large array of small, independent objects and read each one once before discarding it. The incrementally parsed array would keep growing, eventually containing all the objects I wanted to discard, and I would also need to move my array pointer to the last element at each increment.
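
Roughly what I mean, with a stand-in mock parser (not any real library's API) that re-yields the same ever-growing array after each chunk:

```ts
// Stand-in mock (hypothetical, for illustration only): each chunk is assumed
// to be a JSON array of items, and the same growing result array is re-yielded.
async function* parseIncremental(
  chunks: AsyncIterable<string>,
): AsyncIterable<unknown[]> {
  const result: unknown[] = [];
  for await (const chunk of chunks) {
    result.push(...(JSON.parse(chunk) as unknown[]));
    yield result; // same reference, just longer each time
  }
}

// The consumer only wants each item once, but has to keep a cursor
// and still pays for the whole retained array.
async function consume(chunks: AsyncIterable<string>) {
  let cursor = 0;
  for await (const partial of parseIncremental(chunks)) {
    for (; cursor < partial.length; cursor++) {
      console.log("handling", partial[cursor]); // read once...
    }
    // ...but items already handled are still retained inside `partial`,
    // so memory grows with the whole stream.
  }
}
```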

jq and JSON.sh have similar incremental "mini-object-before-complete" approaches to parsing JSON. However, they do include some tools to shape those mini-objects (pruning, selecting, and so on). Also, they're tuned for pipes (a newline is the event), which caters to shell and text-processing tools. I wonder what the analogue for that would be in a higher-level language.
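
One possible analogue (just a sketch, assuming newline-delimited JSON input): an async iterator that treats each newline as the event and yields complete values one at a time, so each can be discarded after use.

```ts
// Yield one parsed value per newline-delimited JSON line.
// Nothing is retained between yields, so consumed values can be GC'd.
async function* ndjson(chunks: AsyncIterable<string>): AsyncIterable<unknown> {
  let buffer = "";
  let nl: number;
  for await (const chunk of chunks) {
    buffer += chunk;
    while ((nl = buffer.indexOf("\n")) !== -1) {
      const line = buffer.slice(0, nl);
      buffer = buffer.slice(nl + 1);
      if (line.trim()) yield JSON.parse(line);
    }
  }
  if (buffer.trim()) yield JSON.parse(buffer); // trailing value without newline
}
```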

benatkin 4 hours ago

This is more versatile than it seems at first glance. Under its invariants, it shows that arrays/objects are only mutated, never replaced, so you have stable references. You could use a WeakSet to observe new children of an item coming in. You also may not even need to manage this directly: you could debounce and just re-render a UI component by returning a modified virtual DOM. Or if you had a visualization in d3, its data join would automatically notice which elements are new.
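
A minimal sketch of that WeakSet idea, assuming stable references (i.e. the parser mutating containers in place rather than replacing them):

```ts
// Objects we've already seen across increments. WeakSet doesn't prevent
// garbage collection, so discarded children don't leak through it.
const seen = new WeakSet<object>();

// Return only the children of `parent` that appeared since the last call.
function newChildren(parent: Record<string, unknown>): object[] {
  const fresh: object[] = [];
  for (const child of Object.values(parent)) {
    if (typeof child === "object" && child !== null && !seen.has(child)) {
      seen.add(child);
      fresh.push(child);
    }
  }
  return fresh;
}
```

You'd call `newChildren(root)` on each parser increment (possibly debounced) and only process what it returns.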