This is the kind of outcome Section 230 (of the US Communications Decency Act of 1996) was intended to prevent.
It's an unpopular opinion in tech circles, but this right here is why I support the very maligned Section 230—in fact, I wish it went much farther. I think it's a core human right to talk on the internet and to create websites: a right that should be protected, in practice, by granting powerful immunities to third parties, like forum administrators. Otherwise you end up with chilling effects like this one, chilling things no reasonable free society would want to chill.
> This is the kind of outcome Section 230 (of the US Communications Decency Act of 1996) was intended to prevent.
I'm not so sure about that. Section 230 makes it so that if a user posts something illegal or tortious, it is the user, not the forum owner, who faces charges or a lawsuit.
But that doesn't mean that if you become aware of illegal material on your forum that was posted by a user you can just ignore it. You still have to remove it.
It sounds like it is the latter that is taking up too much of their resources.
With containerization and cheap cloud instances, people can much more easily deploy their own instance of a service like this such that only they (and maybe their trusted friends) can log in. My wife has a private paste bin that she runs on her own subdomain. There are no concerns about content moderation because all of the content is hers, and traffic is so minimal that one VPS can run a bunch of these little services.
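To make that concrete, here is a minimal sketch of what a single-user paste service can look like (hypothetical code, not what she actually runs; it assumes Node with the built-in http/fs/crypto modules and a PASTE_TOKEN environment variable as the upload secret):

```typescript
// Minimal single-user pastebin sketch: POST text with a bearer token to store it,
// GET /<id> to read it back. No moderation needed: only the token holder can post.
import { createServer } from "node:http";
import { mkdirSync, readFileSync, writeFileSync, existsSync } from "node:fs";
import { randomBytes } from "node:crypto";

const TOKEN = process.env.PASTE_TOKEN ?? "change-me"; // shared secret for uploads
const DIR = "./pastes";
mkdirSync(DIR, { recursive: true });

const server = createServer((req, res) => {
  if (req.method === "POST" && req.url === "/") {
    // Only the owner (whoever holds the token) may create pastes.
    if (req.headers.authorization !== `Bearer ${TOKEN}`) {
      res.writeHead(401).end("unauthorized\n");
      return;
    }
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const id = randomBytes(6).toString("hex"); // 12 hex chars
      writeFileSync(`${DIR}/${id}.txt`, body);
      res.writeHead(201).end(`/${id}\n`);
    });
  } else if (req.method === "GET" && req.url && /^\/[0-9a-f]{12}$/.test(req.url)) {
    const file = `${DIR}${req.url}.txt`;
    if (!existsSync(file)) {
      res.writeHead(404).end("not found\n");
      return;
    }
    res.writeHead(200, { "Content-Type": "text/plain; charset=utf-8" });
    res.end(readFileSync(file));
  } else {
    res.writeHead(404).end("not found\n");
  }
});

server.listen(8080);
```

Behind a reverse proxy that terminates TLS, one small VPS can run several of these side by side.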
I think reverting to only talking to your small social circle instead of the public at large might be one of those chilling effects the OP mentioned
The paste bin contents are visible to the public at large. The visibility of the content is no different than if she'd picked a public paste bin service--it just means nobody has to moderate anything because the uploader is the service owner.
maybe? but they also feel like completely different use-cases. It would be nice if we could solve the service discoverability problem and the service utilization problem independently. Just like the vast majority of businesses shouldn't be high-growth startups, most resources & communities should be intimate and relatively private.
> people can much more easily deploy
But not easily, so this still won't happen.
free society hehe
If it is a right to have a website, how far do you want to go?
* You are forbidden from turning off a shared server
* You are forbidden from removing a colocated server
* You are forbidden from disconnecting an ISP customer
* You are forbidden from closing a bank account
* You are forbidden from refusing payments from this bank
People who want to censor you just go further up the chain if they don't like what you're saying.
Huh? You're really confused. Nothing in the 1st Amendment or Section 230 obligates *private parties* to host content they don't like! (Actually, they have an affirmative 1st Amendment right not to!) Section 230 immunizes them from certain legal liabilities arising from hosting other people's content; but it obligates nothing.
Yes. And if you are immune thanks to that, the censors lean on your host, your ISP, and your bank instead.
The original post does not clearly specify the source of the infringement judgments. Was it a regulatory body contacting a hosting provider? Was it the hosting provider proactively policing content on their servers? Not clear to me.
Commenting on Section 230 seems reasonable, although I didn't catch what jurisdiction this hosting provider or the site owner are in.
Commenting on censorship power exercised by strong-arming the companies that facilitate access to the internet seems reasonable too. But why reply to the Section 230 post with it, as if you're refuting their point? It could just be a different thread.
The intention and the application of the law are very far apart. If you strengthen the intention, you're not actually reinforcing it, you're reinforcing the application. It would mainly help further consolidate the power of Big Tech, because they're the only ones who can afford the legal costs of actually invoking the law.
Section 230 needs to go because people get banned or shadow banned for all types of reasons. What's the point if it doesn't protect MathB.in and freedom of speech? Shouldn't we just start over?
>Section 230 needs to go because people get banned or shadow banned for all types of reasons.
What does people getting banned or shadow-banned have to do with this?
Section 230 is about protecting the host of 3rd-party content from liability. Beyond that, the host still has their rights to choose what they want to host, enforce what rules they want to enforce, and ban/shadow-ban whoever they want.
With or without Section 230, private companies can still ban you whenever they please, for whatever reason they please. As is their right to do so.
If anything, adding additional liability to hosts (via repealing Section 230) will likely make them even more zealous in who they decide to ban.
I'm very sorry to say this but you don't understand Section 230.
> It would mainly help further consolidate the power of Big Tech because they're the only ones who can afford to legally apply the law.
This is what would happen if Section 230 were repealed.
> Section 230 needs to go because people get banned or shadow banned for all types of reasons. What's the point if it doesn't protect MathB.in and freedom of speech?
You are yet another person who needs to read the "You Are Wrong About Section 230" TD article, because you're badly misinformed about what it does. Section 230 has nothing whatsoever to do with operators banning users: they can ban or not ban, with or without Section 230. It has basically nothing to do with freedom of speech either.
All Section 230 does-- and it's making me crazy that so many millions of people willfully misunderstand this very simple point-- is ensure that admins who choose to moderate face no additional liability compared to those who don't.
"You Are Wrong About Section 230":
https://www.techdirt.com/2020/06/23/hello-youve-been-referre...
You're also badly misinformed, and trying to gaslight me, if you think Section 230 has nothing to do with freedom of speech and content moderation. The article you're recommending is from 2020, before the Supreme Court heard the case in 2023. That means you're either actively spreading misinformation or just incompetent.
Here's a more recent source instead of the Big Tech propaganda you're trying to push, https://www.aclu.org/news/free-speech/section-230-is-this-th...
??? Your OP argues that 230 must go, but you're linking an unabashedly pro-230 article.
Third-party immunity creates a strangers-on-a-train loophole.
Safety concerns are, as predicted by cypherpunks, being used to censor the internet and crack down on all independent platforms that dare to operate outside big tech.
I think the bigger issues for independent platforms are 1) bad actors and 2) people staying on big platforms.
Bad content is more than a legal issue. If you have a public forum, you don't want people publicly posting hate and gore, being creepy, and DMing threats to others. Your viewers will see the hate/gore/creeps/threats, leave your site, tell others not to visit, and, if the content is bad enough, be scarred. You will be scarred. Your site will fill up with awful people. Then there's the issue of spam, which you must block simply because it will use up all your site's resources.
But even with those issues, there are many, many independent platforms that exist today. The problem is that most of them are quiet. Why would someone post on a tiny forum, when almost nobody would read it? Most people don't; they either write in their journal or post on a big forum where they're more likely to get attention.
I think reachability specifically is the biggest issue, because you can make a forum invite-only, and this effectively reduces the amount of bad content to whatever you can manage (too overwhelmed? Invite fewer people). But people are even less likely to end up on a forum that requires an invitation, even if getting one is as simple as sending an email.
When I think of creating a forum, I worry about strongly worded emails from law enforcement and/or my hosting provider. But I don't worry about prosecution; I worry about the emotional toll of just seeing those letters, the emotional toll of seeing the content, the effort of finding and removing it, and the effort of blocking (mundane) spam. Even then I'd still host a forum if I expected it to become popular, but I expect it would sit deserted.
The point I gathered was that while the security and compliance burdens mentioned in the article are legitimate, they have grown into a substantial enough burden to outweigh the benefits for a solo maintainer.
What does this have to do with the OP? It’s not reasonable for his server to be taken offline every time someone decides to post some dox on it. Small community administrators don’t have the means to do round the clock moderation. The purpose of these regulations is to make sure only big tech can run an online service.
“Your business model is not my problem” is a thing that is popular on HN/Reddit. The crows have come home from the pasture and the cows have returned to roost.
I know the law is often behind the trend because of the rate tech develops, but surely the old analogy for all technology is postal mail.
Last time I checked, the postal service had no responsibility or requirement to refuse to deliver certain messages or ideas, right? In some cases the government can ask to intercept them, but there's no regulation requiring them to scan letters for banned content.
Why don't these same rules apply to online technology?
> Last time I checked, the postal service had no responsibility or requirement to refuse to deliver certain messages or ideas, right?
Technically the Comstock Act hasn't been repealed in the US, and Republicans have been talking about enforcing it again. Democrats heard this and did nothing about it, because of course they did.
https://en.m.wikipedia.org/wiki/Comstock_Act_of_1873
It's even the opposite: the US Postal Service is *positively obligated* by the 1st Amendment to deliver content without discrimination. E.g. Lamont v. Postmaster General (1965) (US post can't create friction for US citizens subscribing to Soviet propaganda newspapers by mail).
https://en.wikipedia.org/wiki/Lamont_v._Postmaster_General
(It's also prohibited from opening mail without a warrant, but that's an orthogonal question).
Please read this article https://www.techdirt.com/2020/06/23/hello-youve-been-referre... then re-consider your comment.
I’m not going to discuss the merits of the rules, but postal mail seems a terrible comparison to the web. Global visibility, instant transmission of ideas, effectively free (as in beer) distribution.
Yeah, I think this is one of those domains where frictions (or lack thereof) matter. It is much faster, cheaper, and easier to send out a significant volume of material on the Internet than through the mail. In theory, somebody could send a bunch of obscene trolling letters, but having to lick all those envelopes seems to deter most of the people who'd be otherwise tempted.
An ISP is closer to the post office than a pastebin site. A pastebin site is closer to a factory that produces and ships content to all who order it, and is thus responsible for what it ships.
It's unfortunate that things are the way they are, but I'm not sure there's a better option. If you give an inch, abusers will take a mile.
I think AI is well suited to this role, especially with new models being capable of learning and updating their weights as they go without needing retraining/finetuning.
Because there are far more places to apply leverage, and they know they can get away with it precisely because online is "different". Why? Because they said so.
I think the results would be similar if anyone were allowed to create their own mail delivery service. The politicians would create regulations so that most people would not be allowed to run their own mail delivery, and eventually only a few regulated services like the postal service would remain.
> If you have any important posts that you would like to keep, now is the time to copy and save them for yourself.
Have you considered putting the site into "read-only mode"? Or are you afraid of skeletons in the closet?
Everything about this post makes sense until I reach the part where they are closing the site completely.
Why not just leave a read only, static, site up and running so that these links and paste don’t disappear?
The moderation was the hard part, and for already moderated posts this cost has already been sunk.
Further: if the site is just a time capsule, the operating cost would be near zero.
Why not just leave the history in place?
You’ve made a fatal assumption that what is “acceptable” content is immutable and can’t be retroactively changed so that safe content is now unsafe content.
They’ll never be free of the costs of content curation
The odds of that applying to a math site are extremely low.
But not zero
https://en.m.wikipedia.org/wiki/Indiana_pi_bill
That would likely require reviewing every item in the "time capsule", even the parts that have already passed moderation. Moderation is not perfect, and there is a real risk that something that was missed crops up later, with more legal trouble down the line.
Hence they suggest you make copies of whatever material you consider worth keeping before the shutdown. You could likely copy the entire site and keep it as a time capsule, public or not.
That still requires ongoing non-zero effort and time, with no guarantees of zero problems for them.
In case anyone is looking for an alternative - I launched https://latex.to here on HN last year. Since then I've added the option to save formulas in your browser via IndexedDB.
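For anyone curious how that browser-side saving works, the usual pattern is a small IndexedDB object store. The sketch below shows only the general approach; it is not the actual latex.to code, and the database and store names are made up:

```typescript
// Rough sketch of persisting formulas in the browser with IndexedDB.
// "formula-db" and "formulas" are illustrative names, not latex.to's real ones.
function openFormulaDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open("formula-db", 1);
    request.onupgradeneeded = () => {
      // Create the object store on first open (or after a version bump).
      request.result.createObjectStore("formulas", { keyPath: "id", autoIncrement: true });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function saveFormula(latex: string): Promise<void> {
  const db = await openFormulaDb();
  await new Promise<void>((resolve, reject) => {
    const tx = db.transaction("formulas", "readwrite");
    tx.objectStore("formulas").add({ latex, savedAt: Date.now() });
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

async function loadFormulas(): Promise<{ latex: string; savedAt: number }[]> {
  const db = await openFormulaDb();
  return new Promise((resolve, reject) => {
    const request = db.transaction("formulas", "readonly").objectStore("formulas").getAll();
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}
```

Data saved this way survives page reloads but lives only in that one browser profile, which is why it suits personal formula history rather than sharing.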
> After coding all through the night, as the sun rose on Sunday, 25 March 2012, the website was ready.
Every now and then I hear of people developing fully fledged software overnight, or over a single weekend... So far, I have not managed to do that.
Almost to a weird extent, this is the only kind of software most of us will ever be able to ship: code whose MVP we could write in a day, because anything else is too hard to see through to completion. There's something to be said for going for that MVP and then iterating.
You need to focus on a single feature and relegate other issues either to existing services or to the next iteration.
Having done some weekend projects, I find you get it done in a weekend with usually one of two approaches:
1. Use "do it all for you tools" that do all the hard parts (e.g. Ruby on Rails+lots of gems, Django+lots of libraries, Drupal/Joomla + plugins, etc)
2. Cut scope to a huge degree (don't have any of those things, no users, just "if you make this request then X data is stored and you get redirected to a view").
This math-oriented pastebin was probably the second.
Not "fully fledged" but "enough to be useful". After that you start having feedback that can guide gradual improvements.
Time to archive as many links as possible. I know some people who have used Mathb.in to draft (or publish!) short, insightful articles on various topics in mathematics. Since I last read these articles as an undergraduate student, it may take a while to find the links. Thankfully, there is at least two weeks' worth of time to find them.
Thank you susam for 13 years of continuous service.
I made a partial replacement that doesn't allow user-submitted content at https://davidlowryduda.com/static/MathShare/. It just stores the content in the URL, and is limited by URL size limits. In practice this means you have approximately one page of text.
I wrote about making this at https://davidlowryduda.com/mathshare/. I was trying out the LLM interaction that made it near the top of HN recently, and it worked very well.
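For anyone wondering how the URL-as-storage trick works in general, here is a rough sketch (not MathShare's exact code; it assumes a page with an element whose id is "content" and a math renderer such as KaTeX or MathJax to typeset the decoded text afterwards):

```typescript
// Sketch of "the URL is the database": encode the document into the fragment,
// decode it on page load. Browsers cap URL length, so this only fits short notes.
function encodeToFragment(text: string): string {
  // UTF-8 -> base64url so arbitrary math text survives the round trip.
  const bytes = new TextEncoder().encode(text);
  let binary = "";
  bytes.forEach((b) => (binary += String.fromCharCode(b)));
  return btoa(binary).replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

function decodeFromFragment(fragment: string): string {
  let base64 = fragment.replace(/-/g, "+").replace(/_/g, "/");
  while (base64.length % 4 !== 0) base64 += "="; // restore padding stripped for the URL
  const binary = atob(base64);
  const bytes = Uint8Array.from(binary, (c) => c.charCodeAt(0));
  return new TextDecoder().decode(bytes);
}

// Share: put the encoded text after "#" and hand the URL to someone.
function makeShareUrl(text: string): string {
  return `${location.origin}${location.pathname}#${encodeToFragment(text)}`;
}

// On load: if there is a fragment, decode it and let the math renderer take over.
window.addEventListener("DOMContentLoaded", () => {
  const fragment = location.hash.slice(1);
  if (fragment) {
    // Assumes an element with id "content" exists; typeset it with KaTeX/MathJax afterwards.
    document.getElementById("content")!.textContent = decodeFromFragment(fragment);
  }
});
```

Since the whole document rides in the link itself, the practical limit is however long a URL browsers and chat apps will tolerate, hence the "approximately one page of text".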
I would be very interested in hearing the types/examples of moderation concerns that users posted. This has been a worry of mine whenever I accept UGC on a website, and it would be a useful heads-up on what to screen for, for anyone who plans on hosting a similar service.
I am very sad about the loss of this service, and thank you for the work you have put in to keep it online.
> Another alternative would be to encode the content of a MathB.in post and embed it into the URL, which could then be distributed with others. Upon visiting the URL, the application would read the encoded content from the URL, decode it, and render it.
I made something like this a couple of years ago; it was only 60-something lines long. Very simple and a bit of fun to play with: https://github.com/ea935/maths
That link brings me to a 404 page.
Why not ask for a volunteer to transfer site ownership to? It's not ideal: the volunteers have to be vetted, and the selected one may still end up malicious, or nobody may be selected. But it's better than shutting down the site without trying.
Anyone know in what jurisdiction these regulators are sending shut-down threats from?
UK
No surprise, sadly.
the UK is probably the most dystopian western country now
the UK always was the most censor-happy western country. Nobody remembers Ulysses, Lolita, Last Exit to Brooklyn? Or their British Board of Film Classification?
https://en.wikipedia.org/wiki/Censorship_in_the_United_Kingd... or https://en.wikipedia.org/wiki/List_of_books_banned_by_govern...