compiskey a year ago

The indie web could easily exist via self-hosted Docker from home and a central index, to start, for anyone who wants to peer.

Web apps are not programming but dependency management these days.

Making a private tribal bubble online is easier than ever. ooh he said the quiet part out loud

  • grumbel a year ago

    Hosting is the easy problem. The main problem is discoverability on the user's end. Individual websites just don't show up on Google anymore; worse yet, even if you find one you like, your browser provides no tools to keep track of it. Bookmarks are completely static; they don't inform you when a site changes. While RSS exists, it fails to integrate with the Web: it's basically its own separate thing, requiring separate sites and clients. It no longer has any browser integration either.

    People might not like social media, but the ability to discover content on there and keep track of it is just lightyears ahead of the regular old Web.

    Another big problem is the permanence of the content. Hosting your own content today is easy. Ensuring that it is still online 20 years from now is hard. Worse yet, even if you maintain your site, URLs still change over time and links break. An IndieWeb built with regular Web tools is just going to be a lot of 404s and "DNS address could not be found".

    We've been there, done that. It doesn't work. There is no point in trying to repeat it without some key innovation.

    • est a year ago

      > The main problem is the discoverability on the users end

      The main problem is harassment, spam, maintenance & op cost and DDoS

      • marginalia_nu a year ago

        Most of these go away if you remove social features.

        Maintenance is not a problem. Most of the supposed maintenance burden of self-hosting a website is FUD from cloud-provider marketing.

        Like I spend probably like 60 minutes a month on maintenance, and I operate a dang search engine on my server. Sure, I'm not getting five nines, but who cares. (There is additional operational work, but that's orthogonal to how and where it's hosted.)

        • est a year ago

          > Most of the supposed maintenance burden of self-hosting a website

          Ever tried to host and then, years later, upgrade an indie WordPress site? It's painful for semi-technical people.

      • johnnymorgan a year ago

        Those are operational problems; he's talking about customer pain points, i.e. even with all those things listed, you still need to get people to the door for it to happen.

        • est a year ago

          If your "social" site gets some traction, you will get DDoS'd by some evil actor, sooner or later.

          • johnnymorgan a year ago

            100%, apologies I don't mean to downplay those issues, more helping comms between two parties.

            Those are legit operational issues to deal with, but he was speaking more to product pain points, things that will motivate a customer.

    • compiskey a year ago

      I was building packet schlepping hardware for Nortel from 99-02; I know “how the internet works”.

      I was suggesting a change in social norms not technical.

      There’s no reason I need all the engineering of Facebook to directly share with a few friends and family. And having seen one social site after another fill up with more noise than signal, there’s little value in engineering sorting algorithms that attempt to preserve context rummaging through it all looking for needles in haystacks (ML).

      Feynman suggested modeling things literally is an inefficient path to discovery. Physicists achieved their best results dispensing with imagining the world as literal pulleys and levers and weights. We don’t need to literally model how to sort it all, just the right amount to assist others.

      Software engineers need to pivot from sorting stupid meme posts and put those brains on truly interesting ideas: https://youtu.be/ZSddchIGNG0

    • dizhn a year ago

      From what you're saying, it sounds like what we need is a search engine for indie sites. There's already one for old-school sites, but this would work better as a directory type of deal (or webring) like in the old times, where the sites would still be crawled and indexed so the content stays fresh. One of your requirements was never a requirement for the big web either, namely that sites are up 20 years later. I don't think that's necessarily a problem.

      • wcedmisten a year ago

        Sounds like you may be interested in https://ooh.directory/

        • Ruthalas a year ago

          Note that ooh.directory only indexes blogs (that have RSS feeds). Personal/indie sites that have other structures are prohibited.

          Which is fine, but means it isn't a one-stop-shop for indie sites.

          • rchaud a year ago

            It's close enough. Ideally the RSS-friendly sites it links to will post their own links to non-RSS pages of interest.

        • thatoldfool a year ago

          https://wiby.me/ returns hits that "feel" like the old web, mostly plain HTML.

          • mxuribe a year ago

            I think said sites need to be submitted first, no? In any case - as a fan of the ol' web - i favor this type of search engine! :-)

          • dizhn a year ago

            That's the one I referred to as the old-school website one.

    • barbariangrunge a year ago

      The main engine of discoverability for small indie sites is Google, and Google lost the fight against SEO spam sometime in the last 5 years. Independent sites are hard to find now, so you have to promote them via the closed networks (at least, the ones that allow you to link outside themselves — many do not, or they restrict it)

  • hsn915 a year ago

    Docker is definitely not the answer for self-hosting on a mass scale. It's a thing that only technical people know about.

  • chrismorgan a year ago

    > self host … from home

    That requires an ISP that allows incoming connections, and preferably a static IP address. These are both increasingly uncommon.

    • codazoda a year ago

      You can use a dynamic IP with dynamic DNS easy enough. It works great for me.
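
      For anyone curious what that looks like in practice, here's a rough sketch of a cron-driven update using the widely implemented "DynDNS2"-style update URL. The endpoint, hostname, and credentials below are placeholders; check your DDNS provider's docs for their actual URL format:

```shell
# Fetch the current public IP, then push it to the DDNS provider.
IP=$(curl -fsS https://ifconfig.me)
curl -fsS -u "user:password" \
  "https://ddns.example.com/nic/update?hostname=home.example.com&myip=${IP}"
```

      Run it every few minutes from cron and the hostname tracks your IP as it changes.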

      • Nursie a year ago

        Not if you’re on the wrong end of CGNAT :/

        Which I recently discovered we are.

        • counttheforks a year ago

          Switch to an ISP which actually sells you an internet connection.

          • Nursie a year ago

            This is not always an option, and 99% of people will never care.

            • counttheforks a year ago

              You cared enough to post about it.

              • Nursie a year ago

                Yes, but I didn't make the claim that "it's not an internet connection", because clearly it is. The vast majority of people will never care that theirs can't do server stuff, because they will never try to run one, or even know what it means. It's a perfectly fine internet connection for such folks.

        • mandis a year ago

          I have used DDNS before, but totally unfamiliar with what you refer to in your comment. Care to elaborate a bit?

          • squarefoot a year ago

            With IPv4 scarcity, many carriers had to employ NAT (network address translation) so that many users are mapped behind a single IP at the same time. This of course makes it impossible to put a personal server on a local home network: although connecting to external addresses is still doable, any incoming packet wouldn't know which one of the users it should reach without explicit forwarding rules, which the users have no access to.

            • counttheforks a year ago

              It also means you will frequently be blocked by services such as cloudflare if anyone else you're sharing an IP address with is infected with a spammy virus.

              • chrismorgan a year ago

                I’ve been behind CGNAT once. It was a miserable experience for this reason. (No idea how many people there were at the one IP address, but https://iknowwhatyoudownload.com/ reported on average ten or twenty hours of video per day being downloaded via BitTorrent from the address, none of which was from my endpoint.)

          • Nursie a year ago

            Carrier-grade NAT. Generally rolled out due to IPv4 exhaustion.

            Under CGNAT your router does not get an externally reachable IP address from your ISP, as it sits behind an ISP-level NAT router that assigns addresses to subscribers much like your home router assigns addresses to your home machines.

            So you can’t run any sort of externally reachable service at all.

            • alephu5 a year ago

              My ISP gives a dedicated IPv4 to anyone that asks, everyone else goes on CGNAT. Hardly anyone asks so they don't mind.

            • nivenkos a year ago

              Usually the ISP will provide some sort of port forwarding though.

              • olyjohn a year ago

                Like... what ISP would go out of their way to do that? Call me cynical, but I doubt there's an ISP that uses CGNAT who would forward a port to you. Like, they all do the total absolute minimum necessary to get you Internet access. Why would they bother creating some way to let you forward a port to your computer? No average person needs to do that anymore now that everything is cloud-based. I could be way wrong on this, but I just have my doubts...

                • shric a year ago

                  My ISP let me turn off CGNAT in my account settings (defaulted to on) and let me turn on IPv6 in my account settings (defaulted to off).

                • nivenkos a year ago

                  Yeah, it really depends on the ISP.

                  It's a shame how it's become so hard, the old internet is gone forever really.

    • KronisLV a year ago

      > That requires an ISP that allows incoming connections, and preferably a static IP address. These are both increasingly uncommon.

      Not if you can afford a cheap VPS to tunnel the traffic through with WireGuard or another solution. Here's my vague blog post on the topic: https://blog.kronis.dev/tutorials/how-to-publicly-access-you...

      For example, along the lines of https://www.scaleway.com/en/stardust-instances/ or whatever is the cheapest plan that your provider offers, or something you can find on LowEndBox: https://lowendbox.com/

      Though some people also suggested Cloudflare Tunnel or similar solutions, like: https://eddiez.me/working-around-cgnat/
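
      For the curious, a minimal WireGuard pairing between VPS and home box looks roughly like this (keys, addresses, and the port are illustrative placeholders, not taken from the linked posts):

```ini
# /etc/wireguard/wg0.conf on the VPS
[Interface]
Address = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <vps-private-key>

[Peer]
# the home server; it dials out to the VPS, so CGNAT is not a problem
PublicKey = <home-public-key>
AllowedIPs = 10.0.0.2/32
```

      A reverse proxy on the VPS (nginx, Caddy, etc.) then forwards ports 80/443 over the tunnel to 10.0.0.2.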

      • senko a year ago

        > Not if you can afford a cheap VPS [...]

        At that point, you might as well host it on that cheap VPS instead of at home.

        • tecleandor a year ago

          I have both things.

          Tunneling has some advantages:

            - you don't expose your home IP
            - you can move your hosting easily (you could even have it on your phone)
            - you can host bigger stuff that you couldn't afford or would be too expensive "in the cloud" (your home NAS, a rack of servers, Home Assistant)
            - you could set up just one VPS gateway with nginx or traefik and then set up a bunch of tunnels for all your friends or several different sites...
          
          It has some interesting uses.
        • KronisLV a year ago

          > At that point, you might as well host it on that cheap VPS instead of at home.

          This is the better option sometimes.

          Buying multiple TB of storage in the cloud will typically be more expensive, whereas you can have ample storage locally on the cheap. The same goes for CPU/RAM, which you'll need for some applications (like local runners for CI servers) or games. Furthermore, you might prefer that the data stay on your local device for whatever reason (either there's a lot of it, like voxel video game world files, or something else).

          My current homelab setup would cost me hundreds of dollars per month from AWS/GCP/Azure, even though it's just devices that would otherwise be e-waste. The round trip time between my local devices and where the servers are is around 20-30 milliseconds, so not at all bad, even decent for gaming.

          It's also far more cost effective than buying a static IP from my ISP and if someone DDoSes me, the VPS will probably be the first to crumble (though this is not guaranteed), which is an acceptable failure mode.

          Of course, for things that need better uptime, just going for the VPS directly is generally a good idea.

        • dspillett a year ago

          If your site/app is not latency sensitive, then hosting this way can be better than the straight VPS.

          Maybe you need better resources than a cheap VPS can offer, i.e. an experimental app that likes a chunk of RAM and IO so won't be happy on an oversold & contended host, but doesn't need low latency or massive bandwidth¹ so is happy with the connectivity found on such a cheap service.

          Another consideration is that if the VPS host dies and you are just using it as a proxy/tunnel, all you need to do is sign up for another, change DNS pointers and perhaps VPN config, and you are back up and running: no extra app/stack configuration needed, no need to restore content from backups⁴ to the new location, etc., so time+faff to service restoration is short.

          ----

          [1] It doesn't apply to all locations, as some are lucky to be in places where there has been sufficient infrastructure investment for FTTP to be inexpensively available, but for a lot of people a key limit to self-hosting is an asymmetric link: I know many with 1.4mbit or noticeably less upstream from home; the best I can reliably get here² is ~14mbit³ upstream. Most VPS providers that aren't ridiculously oversold can give you that, so their network would not be the outgoing connectivity bottleneck.

          [2] York, UK, not far from the city centre. FTTP is available in parts of the city and outskirts, but rollout is glacial so won't be near me any time soon

          [3] g.fast is available from the exchange/cab I'm connected through, but it is hit-and-miss and there is a risk I'll get less out of the nominally 150/30 standard than I do out of the 80/20 one.

          [4] though obviously: still keep good backups!

          • senko a year ago

            I somewhat agree with you (and with stuff like Tailscale it's incredibly easy to set up), with one big caveat, which you hinted at in the fourth footnote: you still need to maintain your home server.

            I am confident that my VPS provider (DigitalOcean) can maintain VPSes better than I can (or want to), for example: automigration for hardware faults, cheap daily/weekly backups, and making sure it's up & running after power cycling the home (which is an infrequent but not unheard of situation where I live).

            Even for beefier setups, Hetzner (my go-to provider for dedicated hardware) still looks more appealing than setting it up at home, when I calculate hardware cost. If you have spare hardware sitting unused anyway, I'd agree.

            But on the main point we agree - if you know how to administer a server, it's easy to host, either home or on a VPS.

            The main drawback I see for common people is that they aren't gonna read up on nginx and postgres configs or set up minikube locally. That requires one-click setup for local use.

            • dspillett a year ago

              > more appealing than setting it up at home, when I calculate hardware cost

              That one can be complex. My home server does mainly private stuff so external bandwidth/latency isn't a big issue. Even accounting for cycling drives out¹² and other involuntary maintenance³ the cost of putting the machine together and keeping it running is lower than I'd have spent on the monthly costs of a similarly specced machine with a good provider.

              Power costs have made me reconsider that a bit in recent months, but hosting providers are putting up prices as they are subject to similar cost increases too so I'll wait a bit and see how that settles before making major changes.

              ----

              [1] I've had a few failures over the years, though thanks to RAID I've never actually lost data due to that⁴.

              [2] Also for space upgrades, but that is an issue for externally hosted options too.

              [3] The PSU went fzzzt early last year.

              [4] Actual (temporary) data loss due to human error has occurred, but backups resolved those occurrences.

    • uneekname a year ago

      I spend extra for a business connection. It's not for everyone, but if self-hosting is important to you it adds some assurance of reliability (and a static IP!)

    • CaptArmchair a year ago

      I can think of several good reasons to self host from your own home. Then again, those reasons are tangential to the core notion of self-publishing / independently publishing on the Web.

      There are two key goals here. First, being in full control of what you can put online and in which form. Second, being able to take whatever you've put online and move ship if you have/want to. Owning a domain name and having access to a VPS or shared hosting are great first steps.

      Even so, you're still relying on someone else to host your content for you. Depending on how much stock / trust you are willing to put in someone else's intentions and abilities - present and future - you may want to consider self-hosting on hardware you own. The trade-off being that you'll have to put your confidence in your own abilities, resources and skills. Which isn't a trade-off everyone is willing to make.

      As for the article. When it comes to the rise of social media over the past 15-20 years, well, Eternal September happened. The vast majority of modern-day influencers aren't interested in tinkering and self-hosting. They just took the opportunity of a platform which offered free - albeit limited - tools to share content with a growing audience. More content and a growing audience, in turn, attracted more people. Arguably, the vast majority of people today would have never really entered into massive online communities if it weren't for big tech, social media, cheap internet connections and cheap mobile devices.

      Sure, social media has eclipsed the comparatively small web of independent blogs and bloggers such as it was back in the mid-00s. That period of time is never going to come back. Despite all their flaws, vast social media networks aren't going to disappear either. Independent Web publishing may see a modest renaissance today with new protocols, the Fediverse and dirt cheap hosting.

      Then again, from a sociological perspective: the notion of online identity has always been evolving and has become ever more complex and fluid. People don't exclusively turn to one tool or one platform: they own several identities across many domains, platforms, accounts,... Age, socio-economic background, culture, personality traits, peers,... are all deciding factors in how people manifest that identity. In turn, collectively, this drives how the Web itself keeps evolving and morphing into something new in the future, which may or may not put a bigger focus on self-publishing.

    • tjpnz a year ago

      Even if you don't have those problems how do you deal with petty individuals who don't like what you have to say? A residential ISP would drop you in a heartbeat, especially if you're violating the boilerplate hosting clause in your contract.

    • 0xCMP a year ago

      not really. you can get a very cheap VPS and reverse proxy via a wireguard, ssh, or stunnel connection. just make the home side connect to the server side.

      too much work? use tailscale.
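
      The ssh variant can be a single command run from the home machine (hostname and ports here are illustrative):

```shell
# Expose the local web server (port 80) as port 8080 on the VPS.
# -N: don't run a remote command; -R: reverse port forward.
ssh -N -R 8080:localhost:80 user@vps.example.com
```

      For connections from the wider internet to reach the forwarded port, the VPS's sshd needs GatewayPorts enabled, or a local reverse proxy on the VPS forwarding to 8080.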

      • chrismorgan a year ago

        Such a proxying arrangement loses a significant fraction of the “from home”-ness; not all, by any means, but a good chunk.

    • simonw a year ago

      Tailscale to an external VPS instance, run a proxy there.

      • aendruk a year ago

        …at which point you might as well just host the website on the VPS?

  • chriswarbo a year ago

    > self host Docker from home

    No need for such cruft. My site ran off a single darkhttpd binary for many years (it's now served from S3)

  • imachine1980_ a year ago

    Why do you need Docker for a static site? You can use Caddy (written in Go) and static files from Hugo, for example, and skip Docker entirely.

  • ThrowawayTestr a year ago

    That's fine if your site never has more than 5 concurrent users

    • chrismorgan a year ago

      For people thinking this way, I say: look at your actual available bandwidth, processing power and requirements to figure out what the bottleneck will be and whether it’ll matter.

      If you have a 10 Mbps uplink and an average page size of 1 MB, that’s approximately one visitor per second before it becomes saturated. Ten visitors at once will find the page takes ten seconds to load instead of one; a hundred visitors, a hundred seconds, though by then many will give up and you may want more deliberate load management or shedding (either limiting the number of connections you’re willing to open at a time and delaying accepting new connections, or returning a brief response that says “sorry, too much load right now, try again a bit later”, or dropping some connections outright).

      But if you have a 100 Mbps uplink and serve simple content with no images, you might have an average page size of 20 KB, and then you could handle over 500 page loads per second, which is almost certainly several orders of magnitude more than you get (that’s a billion page loads per month). But at 500 per second, that means you can only spend 2ms of processing power (all cores, so with perfect parallelism the single-core CPU time available will be 2ms times the number of cores you have) per page served before processing power becomes the bottleneck.
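
      The arithmetic above can be sketched as a toy back-of-envelope calculation (not a benchmark; the inputs are the illustrative numbers from this comment):

```python
# Back-of-envelope bottleneck check for a self-hosted site.

def pages_per_second(uplink_mbps: float, page_kb: float) -> float:
    """Page loads per second before the uplink saturates."""
    bytes_per_second = uplink_mbps * 1_000_000 / 8
    return bytes_per_second / (page_kb * 1_000)

def cpu_budget_ms(pages_per_sec: float) -> float:
    """Total CPU time (summed across all cores) available per page, in ms."""
    return 1_000 / pages_per_sec

print(pages_per_second(10, 1_000))  # 10 Mbps uplink, 1 MB pages -> 1.25/s
print(pages_per_second(100, 20))    # 100 Mbps uplink, 20 KB pages -> 625.0/s
print(cpu_budget_ms(500))           # at 500 pages/s -> 2.0 ms per page
```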

    • JohnFen a year ago

      For $5/mo, you can have your site professionally hosted with quotas high enough to cover the needs of the vast majority of people. If you get into the habit of editing your site on a "real copy" on your machine at home, and copy it from there to the professionally hosted site, you're still very resistant to being deplatformed. You just hire a different host, upload your real copy there, and change where your domain name points.

      • chrismorgan a year ago

        But that’s nothing to do with self-hosting from home; that’s just regular hosting with mirroring (which is generally a good practice anyway, and nigh-ubiquitous among users of static site generators), or with the machines-as-cattle philosophy (as distinct from machines-as-pets).

        • JohnFen a year ago

          True, I was just trying to point out that running your own independent website is possible even if your home situation makes it unrealistic. I think the "indie web" is important, and nobody who is interested should be discouraged if they can't host at home. There are ways to do it even if you have terrible internet service or a very restrictive ISP.

    • selfhoster11 a year ago

      Why should it? If it's a small personal site, you may be getting maybe a hundred hits a month at most unless it gets popular or featured on a link aggregator.

    • compiskey a year ago

      Who asked you to provide for the world? I never did.

bloppe a year ago

I've recently found that GitHub Pages + VitePress (or similar) is an excellent way to host a static site. You can write articles in Markdown, push them to your repo, and within a few seconds it deploys the updated HTML to your site. GitHub Pages is totally free hosting that can handle high traffic, VitePress looks great and has a ton of plugins, AFAIK they'll only remove content if it infringes on copyrights and/or trademarks, and you can set up custom DNS and have your own domain. This all requires some basic coding skills, but could also easily be toolified if anybody here felt inclined to make it available to non-technical people.
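
The build-and-deploy step this describes is typically a small GitHub Actions workflow. Here's a rough sketch (the build script name and output path are assumptions based on VitePress defaults; check the current VitePress deployment docs for your setup):

```yaml
# .github/workflows/deploy.yml — build VitePress and publish to GitHub Pages
name: Deploy site
on:
  push:
    branches: [main]
permissions:
  contents: read
  pages: write
  id-token: write
jobs:
  deploy:
    runs-on: ubuntu-latest
    environment:
      name: github-pages
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci && npm run docs:build
      - uses: actions/upload-pages-artifact@v3
        with:
          path: docs/.vitepress/dist
      - uses: actions/deploy-pages@v4
```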

  • doublepg23 a year ago

    I’ve played with Cloudflare Pages as well and it’s been a similarly great experience.

tuukkah a year ago

There's even a bridge from Indie Web to Mastodon: https://github.com/snarfed/bridgy-fed

  • rpastuszak a year ago

    That looks really cool, have you tried using it?

    • mxuribe a year ago

      Bridgy is not new. I used to use it to publish my old blog site's posts to destinations like Facebook and other silos back in the day... and it worked amazingly well!!! I've not used it in a couple of years (since I killed off my blog), and back then, while the Fediverse existed, Mastodon software was not yet around... so I'm not sure how well it connects with Mastodon instances... but overall, Snarfed's code is rock solid. If it's anything like it was back in the day, then it should be great for more recent stuff. Try it, and see!

      Disclaimer: For what it's worth, I do not personally know Snarfed (Bridgy's main author), but I have followed him for years and, as stated above, have used his code in the past to great success.

      • tuukkah a year ago

        Bridgy Fed and Bridgy are separate projects, although they can work together. As I understand it, Bridgy is for connecting your website with your existing social media accounts, whereas Bridgy Fed is for using your website as your (federated) social media account.

        https://fed.brid.gy/docs

joeyjojo a year ago

I don't see anyone mention affiliate links when talking about the indie/old web. Affiliate links were such a crucial element to the early web 'surfing' experience. It's a pretty shitty web without those affiliate links, IMO, and the only means of discovery is generally through some giant platform.

  • spideymans a year ago

    Excuse me for my ignorance, but what are affiliate links? I never got to experience the early web.

    • aendruk a year ago

      I don’t recognize the term “affiliate links” but it used to be common to have a vaguely purposed Links section on a website.

      https://en.wikipedia.org/wiki/Link_page

      Or maybe they meant webrings?

      • tecleandor a year ago

        Yep, when talking about affiliate links I always think about commercial stuff: links with a referral that will give the site owner some commission if you buy something.

        It's either webrings or those widgets that people used to have in their blogs that were called things like "friends sites", "interesting sites" or things like that. (I used to have one but can't remember the name I gave it)

    • CM30 a year ago

      Ah, in the olden days quite a few hobbyist sites used to maintain a list of other recommended sites in the same niche in the sidebar, with the idea being that if you like the site you're reading, you'll probably enjoy these other sites too.

      This made them a very easy way to find new sites and communities, especially in fandoms like the ones for the Legend of Zelda, Pokemon, etc.

      You can see a working example in the right column of this longtime Pokemon site:

      https://www.dragonflycave.com/

      With all those tiny buttons and what not.

shp0ngle a year ago

I don’t understand what is IndieWeb

Their website seems like a MediaWiki instance.

Is it a software? A network of websites? A service? I cannot tell from the page and I am staring at it

MediaWiki is nice I guess (but kind of crap to maintain) but that’s not really it right?

  • mxuribe a year ago

    I would define "indie web" very briefly as a few things:

    - The indie *WEB* is a bunch of websites that leverage techniques (and yes, maybe open source software) to empower their site owners to be, well, more independent in publishing their content (no reliance, or less reliance, on silos like Facebook, etc.). This is not the same but akin to how there is the Fediverse for social media, etc. Not so different from "indie musician", these are just websites that are "indie" (i.e. "indie" from the corporate silos).

    - Then, there are techniques that are used to be independent. That wiki website you likely stumbled upon details lots of techniques, references software that can be used for your website, as well as tons of links to other documentation and general references that can help. One essential technique - or maybe *approach* is a better word - would be POSSE: Publish (on your) Own Site, Syndicate Elsewhere. See this page for its definition and context: https://indieweb.org/POSSE I have found that over the years, for folks wishing to homestead their content on websites that they control, starting with the concept of POSSE helps in consuming the ideas behind the "indie web".

    I should add that the indie web and the Fediverse (of which the Mastodon software has recently seen some limelight due to the Twitter migration) are not new efforts. The Fediverse has been around since like 2008, while the indie web - though more recent - has still been happily used by people on the web for many years.

barbariangrunge a year ago

Is Facebook's plan with Meta to replace the internet with their private internet, for practical purposes? Assuming VR and AR take over. Then use network effects to shut everyone else out?