crazygringo 2 days ago

I dunno -- generally speaking, the Wayback Machine is a much better time travel experience than trying to recover a website from an old git commit.

Especially since it's not limited to only sites I've created...

And in this particular case, all the creator was looking for was old badge images, and they'd generally be in an images directory somewhere no matter whether the site was static or dynamic.

dang 2 days ago

Can I mention HN's custom time travel experience?

https://news.ycombinator.com/front?day=2025-08-31

(available via 'past' in the topbar)

  • coldfoundry 2 days ago

    Wow, to be honest I never knew that was there - just got back from the 2017s. Such a cool feature to support on-site without going through third parties; more sites should have official archival support like this!

    Thanks for the share.

codazoda 2 days ago

I have a bit of an internal struggle here. I use a site generator too, but I struggle with the question: should I? I recently wrote about why I’m writing pure HTML and CSS in 2025.

https://joeldare.com/why-im-writing-pure-html-and-css-in-202...

  • jszymborski 2 days ago

    I'm not sure how using a static site generator would run counter to any of those points. You can simply generate the same website that you've written by hand.

    EDIT: Well perhaps the "build steps" one, but building my Hugo site just involves me running "hugo" in my directory.

unsungNovelty 2 days ago

There is also an appeal in just starting to write raw HTML, which would make things a little bit harder, but it's crazy what you can do with it.

I almost ended up doing it twice. Old links and time are what stop me.

Inspiration - https://ankarstrom.se/~john/articles/html/

3036e4 2 days ago

Plain text files and version control win again.

  • DeepYogurt 2 days ago

    KISS

    • marcosdumay 2 days ago

      Version control isn't really "simple". That said, neither is plain text nowadays.

      It may make sense to change the "S" there to "standard".

      • jszymborski 2 days ago

        Well, it's "relatively simple", as the alternatives either demand a superset of the requirements of static sites or replacements that are more complex.

      • sneak 2 days ago

        Once you have the main mental model, you realize that git (the main/core features, not fancy stuff like submodules or worktrees etc) is basically the simplest thing that is fit for purpose.

      • ars 2 days ago

        Version control can be very simple. Not everything requires the full power of git.

        Use ci from RCS, and that's about it. It makes a single file in the same directory as the file, no need to check out files, or keep track of, well, anything. Just check in new versions as you make them.

        It's not the right answer in many cases (especially if you need multiple files), but for a single file? The simplicity can't be beat.
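
        A minimal sketch of what that workflow can look like, driven from Python here just to keep it self-contained (it assumes the RCS tools are installed and on PATH, and the file name is made up):

          import subprocess

          def check_in(path, message="update"):
              # "ci -l" records a new revision in "<path>,v" right next to the
              # file and keeps the working copy editable -- no repository to set
              # up. (The very first check-in may still prompt for a description.)
              subprocess.run(["ci", "-l", "-m" + message, path], check=True)

          def old_revision(path, rev):
              # "co -p -r<rev>" prints an old revision to stdout without
              # touching the working file.
              result = subprocess.run(["co", "-p", "-r" + rev, path],
                                      check=True, capture_output=True, text=True)
              return result.stdout

          check_in("index.html", "tweak badge images")
          print(old_revision("index.html", "1.1"))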

cosmicgadget 2 days ago

From the title I thought this was about taking trips down memory lane or seeing historical posts by others. But it seems to be more about seeing design (rather than content) from one's own site in years past. I hope I'm not the only one who would prefer not to see my embarrassing old designs and rather see my archive content rendered in the current (least cringe) template.

Liftyee 2 days ago

My initial thought was that the title was referring to web archive services like the Wayback Machine or archive.is, but the actual topic was equally relevant. I think time travel should work as long as all content is archived / checked in: no reliance on external services (is this the definition of "static site"?)

  • 01HNNWZ0MV43FF 2 days ago

    Static site also means no backend. Each request just serves a file unmodified from disk.
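
    As a rough illustration (not from the article), the whole "server" for a static site can be a few lines that map each request to a file on disk and return it unchanged; Python's bundled handler is one way to see that:

      # Minimal static file server: every request is answered by reading a file
      # from the directory the script is started in and sending it back as-is,
      # with no templating or database involved. The port is an arbitrary choice.
      from http.server import HTTPServer, SimpleHTTPRequestHandler

      if __name__ == "__main__":
          HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()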

jesprenj 2 days ago

If your website is one static file you can use vim undo history to go back years in the past.

luxuryballs 2 days ago

Interesting idea: a browser plugin that caches and uploads the final HTML/CSS of a page, with some logic to avoid duplicates and “extras”. It could be a client-side distributed archival system that captures the historical web as always-static content.

sedatk 2 days ago

Why do you need such a granular capability, especially when the Internet Archive exists? What purpose does it serve?

  • plorkyeran 2 days ago

    > I mentioned this to Varun who asked if I had any screenshots of what it looked like on my website. My initial answer was “no”, then I looked at Wayback Machine but there were not pictures of the badges.

  • zoul 2 days ago

    A safe rollback on a Friday afternoon is a nice thing for sure :)

sharps_xp 2 days ago

is there a decentralized org to ensure that all of the JS and CSS we use today remains backward compatible decades from now? or are we just at the whim of these browser vendors?

  • mananaysiempre 2 days ago

    In part, the W3C is supposed to serve this role, so to the extent that WHATWG controls the web platform, yes, yes we are. Part of the problem is, it’s not clear who exactly is supposed to participate in that hypothetical “decentralized” organization—browser vendors do consult website operators, but on occasion[1] it becomes clear that they only care about the largest ones, whose interests are naturally quite different from the long tail (to the extent that it still exists these days). And this situation is in part natural, economically speaking, because of course the largest operators are the ones that are going to have the most resources to participate in that kind of thing, so money will inevitably end up being the largest determinant of influence.

    [1] https://github.com/w3c/webauthn/issues/1255

    • rafram 2 days ago

      That's an unfair characterization. WHATWG doesn't version the spec like W3C did, but it's no less backwards compatible. See their FAQ [1], or just load the 1996 Space Jam site [2] in your modern browser.

      [1]: https://whatwg.org/faq#change-at-any-time

      [2]: https://www.spacejam.com/1996/

      • mananaysiempre 2 days ago

        Thus far, WHATWG has mostly behaved benevolently, true. But the fact that they have stayed benevolent so far doesn’t mean we’re going to be any less at their mercy the moment they decide not to. As the recent XSLT discussion aptly demonstrates, both browser vendors and unaffiliated people are quite willing to do the “pay up or shut up” thing for old features, which is of course completely antithetical to backwards compatibility.

  • AgentME 2 days ago

    The browsers and standards groups do prioritize backwards compatibility and have done a very good job at it. The only real web compatibility breakages I know of have to do with pre-standardized features or third-party plugins like Flash.

  • hypeatei 2 days ago

    The engines are open source, no? I don't think we should break websites on purpose but keeping everything backwards compatible does seem untenable for decades to come.

  • ars 2 days ago

    LibreOffice can open AppleWorks files from 1984.

    And if it couldn't, you could run these old programs in a VM, and I expect that to continue essentially forever, so I see no future problem viewing these files in a browser.

curtisblaine 2 days ago

If it builds. The author mentions he/she uses Eleventy, so there's always a possibility that current Node/npm versions won't work with some ancient dependencies or with old-style peer dependencies. Then it's a long bisection with nvm until you get the right version combo.

algo_lover 2 days ago

I don't get this? I can check out an old commit of my dynamic, server-rendered blog written in Go and do the same thing.

Sure I won't have the actual content, but I can see the pages and designs with dummy data. But then I can also load up one of several backups of the sqlite file and most likely everything will still work.

  • laurentlb 2 days ago

    Building old code and getting the same result is not always trivial to do.

    Potential issues:

    - If you have content in a database, are you able to restore the database to any point in time?

    - If your code has dependencies, were all the dependencies checked into the repository? If not, can you still find the same versions you were using?

    - What about your tools, compilers, etc.? Sure, some of them, like Go, are pretty good with backward compatibility, but not all of them. Maybe you used a beta version of a tool? You might need to find the same versions of the tools you were using. By the way, did you keep track of the versions of your tools, or do you need to guess? (One way to avoid the guessing is sketched at the end of this comment.)

    Even with static websites, you can get into trouble if you referenced e.g. a JS file stored somewhere else. But the point is: going back in time is often much easier with static websites.

    (Related topic: reproducible builds.)
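
    A rough sketch of the version-recording idea from the last bullet above (the tool names and output file are just examples, not anything from the thread):

      # Record the versions of the tools used for a build in a file that gets
      # committed alongside the content, so an old commit also says how it was
      # built. Tool names and file name here are arbitrary examples.
      import json
      import subprocess

      def version_of(*cmd):
          # Return the tool's version string, or None if it isn't installed.
          try:
              out = subprocess.run(cmd, capture_output=True, text=True, check=True)
              return out.stdout.strip() or out.stderr.strip()
          except (OSError, subprocess.CalledProcessError):
              return None

      versions = {
          "node": version_of("node", "--version"),
          "hugo": version_of("hugo", "version"),
      }

      with open("tool-versions.json", "w") as f:
          json.dump(versions, f, indent=2)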

  • inetknght 2 days ago

    > Sure I won't have the actual content, but I can see the pages and designs with dummy data. But then I can also load up one of several backups of the sqlite file and...

    ... so it's useless to anyone except you, then?

    • plouffy 2 days ago

      Does it really need an /s