Comment by culi 4 days ago

40 replies

> Yes, you must copy and paste content

Many people who maintain their own sites in vanilla web technologies tend to create reusable functions to handle this for them. These can generate headers and the like dynamically so you don't have to change them on every single page. Though that does kill the "no JavaScript required" aspect a lot of people like
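For instance, a minimal sketch of such a reusable function (the function name and nav entries here are made up for illustration):

```javascript
// Build the shared header markup once, so every page gets the same
// nav without copy/paste. Function name and links are illustrative.
function siteHeader(title) {
  const links = ['index.html', 'posts.html', 'about.html']
    .map(href => `<a href="${href}">${href.replace('.html', '')}</a>`)
    .join(' | ');
  return `<header><h1>${title}</h1><nav>${links}</nav></header>`;
}

// In the browser, each page then only needs one line:
// document.body.insertAdjacentHTML('afterbegin', siteHeader('My Blog'));
```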

Of course you could simply add a build step to your pure HTML site instead!

mixmastamyk 4 days ago

I recently learned the object tag can do what I wished for in the 90s... work as an include tag:

    <object data="footer.html"></object>
Turn your back for twenty-five years, and be amazed at what they've come up with! ;-)

Should reduce a lot of boilerplate that would otherwise get out of sync on my next project, without the need for templating.

  • johnisgood 2 days ago

    Hey, I need to try this out. So it's like an iframe, minus the frame part and all its issues?

  • liontwist 4 days ago

    Unfortunately that will require the client to make additional web requests to load the page, effectively doubling latency at a minimum.

    • chubot 4 days ago

      A few extra <object> in a blog post is a worthwhile tradeoff, if you're literally using raw HTML.

      - HTTP/1.1 (1997) already reuses connections, so it will not double latency. The DNS lookup and the TCP connection are a high fixed cost for the first .html request.

      - HTTP/2 (2015) further reduces the cost of subsequent requests with techniques like multiplexing and HPACK header compression.

      - You will likely still be 10x faster than a typical "modern" page with JavaScript, which has to load the JS first, and then execute it. The tradeoff has flipped now, where execution latency for JS / DOM reflows can be higher than network latency. So using raw HTML means you are already far ahead of the pack.

      So say you have a 50 ms time for the initial .html request. Then adding some <object> might bring you to 55 ms, 60 ms, 80 ms, 100 ms.

      But you would have to do something pretty bad to get to 300 ms or 1500 ms, which you can easily see on the modern web.

      So yes, go ahead and add those <object> tags, if it means you can get by with no toolchain. Personally I use Markdown and some custom Python scripts to generate the header and footer.

      • mixmastamyk 4 days ago

        Yes, and I’d add that it’s not merely “raw html”: a static file on disk can be served directly by Linux without copying through userspace (the sendfile syscall, I believe), and transferred faster than it could be generated.

    • mixmastamyk 4 days ago

      Sounds like premature optimization for a simple page. If the objects are given explicit sizes, their regions can be filled in afterward without a reflow, and the included files can be cached for subsequent access.
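      For example, giving the region a fixed size up front (the values here are illustrative):

      ```html
      <object data="footer.html" style="width: 100%; height: 120px"></object>
      ```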

      • liontwist 4 days ago

        The other solutions are even easier and don’t double latency.

        > be cached for subsequent access.

        So now you need to setup cache control?

  • culi 4 days ago

    I didn't know you could use object tags in that way! Thanks. That seems like a great solution if you're cool with an extra request

  • mrweasel 4 days ago

    Couldn't you sort of do that using server-side includes back in the '90s? Assuming that your web server supported it.
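    For reference, an SSI directive looks like this; the server expands it before the response is sent, so no extra client request is needed:

    ```html
    <!--#include virtual="footer.html" -->
    ```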

    • mixmastamyk 3 days ago

      Yes, and a Makefile was an option as well. But an include tag was a no-brainer not long after HTML was invented, especially after img, link, applet, frame, etc. were implemented.

8organicbits 4 days ago

I've adopted the idea that a blog post is archived when it's published; I don't want to tinker with it again. Old pages may have an old style, but that's OK, it's an archive. Copy/paste works great for this.

The only reason I use a blog engine now (Hugo) is for RSS. I kept messing up or forgetting manual RSS edits.

arkh 4 days ago

Or, let me be cheeky: you could add some `<?php include('header.html') ?>` in your html.

lelanthran 4 days ago

> It can generate headers and the like dynamically so you don't have to change it on every single page

Yeah, I noped out of that and use a client-side include (webcomponent) so that my html can have `<include-remote remote-src='....'>` instead.

Sure, it requires JS to be enabled for the webcomponent to work, but I'm fine with that.
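A minimal sketch of how such a client-side include component might look (the attribute name matches the tag above; the real implementation likely differs):

```javascript
// Sketch of a client-side include as a custom element. The guards let
// the snippet load outside a browser; the actual <include-remote>
// component on the linked site may be implemented differently.
class IncludeRemote extends (typeof HTMLElement !== 'undefined' ? HTMLElement : class {}) {
  connectedCallback() {
    const src = this.getAttribute('remote-src');
    if (!src) return;
    // Fetch the remote fragment and swap it into the element's body.
    fetch(src)
      .then(resp => resp.text())
      .then(html => { this.innerHTML = html; });
  }
}

if (typeof customElements !== 'undefined') {
  customElements.define('include-remote', IncludeRemote);
}
```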

See https://www.lelanthran.com for an example.

[EDIT: Dammit, my blog doesn't use that webcomponent anymore! Here's an actual production usage of it: https://demo.skillful-training.com/project/webroot/ (use usernames (one..ten)@example.com and password '1' if you want to see more usage of it)]

  • culi 4 days ago

    Yeah, clearly there are a lot of ways to solve this issue if JavaScript is enabled. But there's a big overlap between the folks who wanna use vanilla web technologies and the folks who want their site to run without JavaScript

spoonfeeder006 4 days ago

Isn't using React with a static site generator framework basically the same thing but better?

  • culi 4 days ago

    Not remotely! Unless you meant Preact. React ships an entire rendering engine to the front end. Most sites that use React won't render anything if JavaScript isn't enabled.

  • mrweasel 4 days ago

    Then you'd have to learn React, and for many of us the point is that we really don't want to learn React, or other frontend frameworks.

  • datavirtue 4 days ago

    Yes, if you want to throw up in your mouth.

  • realusername 4 days ago

    In theory yes, in practice good luck maintaining that if you are just a solo blogger.

    I doubt your blog would last a single month without some breaking change of some sort in one of the packages.

    • spoonfeeder006 3 days ago

      you mean npm packages? why would you need to update those anyhow?

      • realusername 3 days ago

        Because at some point it will cease to work? It needs upgrades like any other project.

        Every upgrade in the JS world is very painful.

  • lmm 4 days ago

    Yes, it is. Unfortunately HN has a crazy bias against JavaScript (the least crazy part of the web stack) and in favour of HTML and CSS, even though the latter are worse in every meaningful way.

    • dickersnoodle 3 days ago

      It isn't crazy, judging by the number of times I've seen posts here and on other blogs talking about a 100k web page ballooning to 8Mb because of all the Javascript needed to "collect page analytics" or do user tracking when ads are included. Granted that may not be needed for personal websites, but for almost anything that has to be monetized you're going to get stuck with JS cancer because some sphincter in a suit needs for "number to go up".

      • lmm 2 days ago

        > I've seen posts here and on other blogs talking about a 100k web page ballooning to 8Mb because of all the Javascript needed to "collect page analytics" or do user tracking when ads are included

        Perfect example. HN will see a page with 6 MB of images/video, 1 MB of CSS and 200 KB of JavaScript and say "look at how much the JavaScript is bloating that page".

    • oneeyedpigeon 4 days ago

      I don't even know where to begin with the pretence that you can compare HTML with JS and somehow conclude that one is 'better' than the other. They are totally different things. JS is for functionality, and if you're using it to serve static content, you're not using it as designed.

      • lmm 4 days ago

        I don't particularly care about "designed for". If you've got to serve something to make the browser display the static content you want it to, the least unpleasant way to do so is with JS.