Comment by mirzap 11 days ago

> 150,000 records — far past where JavaScript (and React) would crash with a stack overflow error

I think react-virtualized and stack tables can easily handle 1 million rows client-side without a problem (I saw the demo somewhere).

Web development is about convenience, and the speed of development is far more important than ultra optimizations. People simply don't care about super optimizations because servers are so fast these days, and bandwidth is cheap, almost free.

But it's an interesting project. Good luck.

tipiirai 11 days ago

> People simply don't care about super optimizations

Could be, but some optimizations in Nue really stand out. Check out bundle size, HMR speed, build speed, and especially repository size. An empty Next.js project weighs over 300MB, a Vite button is around 150MB, while a Nue SPA clocks in at just 2MB.

  • Timon3 11 days ago

    I would love to check out the bundle size! Will you ever create a fair comparison, where your Nue implementation has all the same features as the versions you compare it to? All the comparisons I've seen so far are deceitful since they implement much less. If Nue actually produces smaller bundles, why not create an actually fair comparison?

  • jeffhuys 11 days ago

    A Vite button? You mean Vue? Vite is just a bundler lol.

    Also, would like to see some comparisons to Preact, as that's an (almost) drop-in replacement for React with SUPER small bundles. I'd be impressed if you manage to beat that.

tipiirai 11 days ago

This exact demo will crash with vanilla JavaScript (in Chrome 134.0). The same demo in React would also crash — unless the computation relies on WASM.

  • mirzap 11 days ago

Make a demo with react-virtualized[0] and see if it crashes. Hint: it will not[1]. React can easily render 1 million rows with high performance without relying on WASM[2].

    Here is the demo of react-virtualized[3], in which I entered 10m as the row count and scrolled to the bottom without crashing.

    [0] https://github.com/bvaughn/react-virtualized

    [1] https://www.youtube.com/watch?v=1JoEuJQIJbs

    [2] https://medium.com/@priyankadaida/how-to-render-a-million-ro...

    [3] https://bvaughn.github.io/react-virtualized/#/components/Lis...

*Update: Here I made a table with 1 million rows with search, filtering, and pagination, in plain JavaScript:

    https://htmlpreview.github.io/?https://gist.githubuserconten...

  • Farseer_ 11 days ago

    Could you give a code example? Also, by crash, do you mean the mentioned stack overflow error?

    If so, why would the stack be involved when talking element count?

    • mirzap 11 days ago

Because he constructs a giant JSON payload by joining individual entries. Rendering that directly into the DOM will always cause performance issues (even at 10k entries). That's why you need to use a virtualized list; it can be done in plain JS or with libraries like react-virtualized.

This works in plain JS with 150k rows:

          <style>
              #viewport {
                  height: 600px;
                  overflow-y: scroll;
                  position: relative;
                  border: 1px solid #ccc;
                  width: 400px;
                  margin: auto;
              }
      
              .item {
                  position: absolute;
                  left: 0;
                  right: 0;
                  height: 30px;
                  padding: 5px;
                  box-sizing: border-box;
                  border-bottom: 1px solid #eee;
                  font-family: Arial, sans-serif;
              }
          </style>
      
          <div id="viewport">
              <div id="content"></div>
          </div>
      
      
          <script>
              const viewport = document.getElementById('viewport');
              const content = document.getElementById('content');
              const itemHeight = 30;
              const totalItems = 150000;
      
              const items = Array.from({length: totalItems}, (_, i) => ({
                  id: i + 1,
                  name: `User #${i + 1}`
              }));
      
              content.style.height = `${totalItems * itemHeight}px`;
      
              function render() {
                  const scrollTop = viewport.scrollTop;
                  const viewportHeight = viewport.clientHeight;
                  const start = Math.floor(scrollTop / itemHeight);
                  const end = Math.min(totalItems, start + Math.ceil(viewportHeight / itemHeight) + 10);
      
                  content.innerHTML = '';
      
                  for (let i = start; i < end; i++) {
                      const div = document.createElement('div');
                      div.className = 'item';
                      div.style.top = `${i * itemHeight}px`;
                      div.textContent = items[i].name;
                      content.appendChild(div);
                  }
              }
      
              viewport.addEventListener('scroll', render);
              render();
          </script>
    • tipiirai 11 days ago

The exact error is "Maximum call stack size exceeded" when the WASM engine is replaced with this JS engine:

      https://github.com/nuejs/nue/blob/master/packages/examples/s...

There is currently no demo of the crash, but you can set this up locally.

      • uasi 11 days ago

        `events.push(...arr)` puts all arguments on the call stack before calling the method, which causes the error. Don't push tens of thousands of items at once.
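A minimal sketch of the failure mode described above, assuming an array named `arr` (hypothetical, not from the linked source). Spreading a huge array into `push()` passes every element as a separate function argument, which can exceed the engine's argument/call-stack limit; the exact count at which it overflows varies by engine and stack size:

```javascript
const events = [];
const arr = new Array(200000).fill(0);

// Spreading may throw RangeError: Maximum call stack size exceeded,
// because all 200,000 elements become individual call arguments.
let crashed = false;
try {
  events.push(...arr);
} catch (e) {
  crashed = e instanceof RangeError;
}

// Safe alternative: append element by element (or use concat),
// which never puts the whole array on the call stack.
const safe = [];
for (const item of arr) safe.push(item);
console.log(safe.length); // 200000
```

Chunked spreading (e.g. 10k elements at a time) also stays under the limit, but a plain loop or `concat` is the simplest fix.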

      • jeffhuys 11 days ago

        You're solving a problem nobody has. If you encounter this problem, you shouldn't think "ah, let's yeet the JS engine because it clearly isn't good enough for my awesome SPA", you should think "hm, maybe I shouldn't render 10000000000 records in the DOM".

        What's next? "Oh I have a memory leak, let's get a subscription on RAM modules and just keep adding them!"

      • [removed] 11 days ago
        [deleted]
      • [removed] 11 days ago
        [deleted]
  • wordofx 11 days ago

No. Back when we supported IE 9 we had tables with a million rows and dozens of columns, and they ran fine.

ykonstant 11 days ago

>People simply don't care about super optimizations because servers are so fast these days, and bandwidth is cheap, almost free.

Sigh.