This article appears to be recycling content from a 13-year-old top Stack Overflow question:
https://stackoverflow.com/questions/8318911/why-does-html-th...
Truly any interesting thing ever created on the internet will be exploited for marketing until the end of time. Unfortunate that old redditors and other forum contributors were not able to capture the millions of dollars of value created from their work.
EDIT: I did see it’s at least referenced at the end.
Marketing? I don't see anything for sale or even any ads on the blog page.
A ton of blog content is still "marketing" content, even if it doesn't directly display ads. It's "SEO".
> Unfortunate that old redditors and other forum contributors were not able to capture the millions of dollars of value created from their work.
It's all down to a lack of viable micropayment infrastructure. See also: ad tech.
I think a lot of forum posters wouldn't want to charge micropayments for it, though?
[dead]
Micropayments exist in apps, and yet that hasn't done anything to stop apps from having both micropayments while also being infested with ads.
It's time to dispense with the myth that micropayments were a missed opportunity for the web. Commercialization enshittified the web, and micropayments would have just accelerated that decline.
My favourite outcome is that 'chocolate' is reduced to '#c0c0a0'
'shiny' gives pitch black, and 'obscure' gives blue sky color.
I think the point was that chocolate -> cocoa :)
Yes, that's nearly a semantic f(x)=x. I'd add that we could also find an x such that f(x)=1/x and f(1/x)=x.
I love how forgiving the web is. These days you can get strictness with TypeScript and the like, which is great for businesses and work. But the forgiving nature of HTML and CSS and even JavaScript has contributed to so much adoption. Plus it grew one of the most important things in a platform: the ecosystem itself. Seeing Rust slow down under piles of crates and fall into the same issues JS and npm have, perhaps it's not a language problem but an ecosystem-size problem. Larger user base === larger problems.
If you really want pedantic strictness and perfection, native applications are the place to work. But that isn't always better.
And the web is fast, like really fast, at rendering heavily marked-up text layouts. Just because everyone uses a frontend framework for “maintainability” doesn’t mean the engine is slow.
I do appreciate all of what you’ve said but the author of this piece is actually mistaken.
The browser isn’t actually processing the string “chucknorris” and forgiving incorrectly provided hex codes, which is a common misconception.
What actually happens within the rendering pipeline is that the full literal string (in this case, “chucknorris”) is parsed, and the browser attempts to render the tag in the colour of blood in hope of receiving the mercy that Chuck Norris doesn’t have.
As said, it’s a common misconception and I’m glad I could clear it up before he reads this.
And, inevitably, the next generation [0] comes along and uses the colour without having any idea of who Chuck Norris even was.
[0] https://news.ycombinator.com/item?id=42461264
This will keep His spirit alive, like an Old God who is fed by our unknowing adherence to traditions, hanging glass baubles on Christmas trees in a pale reflection of hanging the heads of our sacrificed enemies, yet still providing a drop of sustenance to the sleeping Gods.
I'm sorry, but you're missing the spirit of this. It's a post about the spirit of Chuck Norris as represented as a concept on the Internet, rather than the actual person. Yes, the person may be reprehensible; however, that is not what's being addressed here. Rather, it's the concept of Chuck Norris.
And that's the whole spirit of this post in the first place. There is a certain sort of beauty to being so forgiving. The fact that HTML rendering can deal with this level of inanity is frankly an awesome aspect of the web that has enabled loads of creativity by non-programmer types (speaking as a programmer here). I think sometimes we need to pull our heads out of our collective asses and allow for this sort of thing to permit beautiful things to happen.
Homophobia and bigotry and sexual abuse and child molestation and pedophilia are not "beautiful things". And those are the things that Chuck Norris himself, the real person, in the real world, represents, condones, and endorses, and that he actively uses his powerful platform and fame and fortune and reputation to promote, by his own choice and free will. Face the hard cold facts of reality and pull your own head out of your own ass.
I'm the kind of person who calls that stuff out when I see it, and doesn't let it pass. The actual spirit and concept of Chuck Norris is that he's a homophobic bigot who endorsed a pedophile for the US Senate. It's so sad that you're the kind of person who thinks it's important to call out people who are calling out those facts, and you feel so compelled to defend their actions and reputations, and try to prevent people from knowing about the truth.
I'm sorry you're like that, and I don't understand what your motivation is for leaping to the defense of such a reprehensible man and his cruel, intentional words and actions that purposefully discriminate against and hurt other people. Who exactly are you trying to protect? Certainly not the people his actions harm. Did you work for Roy Moore's failed political campaign, or vote for him because Chuck Norris endorsed him, and I offended you or something?
You may believe I ruined your childhood by telling you the truth about Chuck Norris and Roy Moore, and you prefer believing and repeating only the lies and fantasies. But Roy Moore, from his powerful position as assistant district attorney, ruined a 14-year-old girl's childhood (and other children's too) by seducing and molesting her in the woods, and Chuck Norris knowingly endorsed that pedophile for the US Senate.
Have a nice day, but stop trying to cover up and suppress the truth about Chuck Norris, and the homophobic, sexually abusing pedophile he publicly endorsed, Roy Moore. It's too bad the truth hurt your feelings and spoiled your good mood, but you should find better heroes to worship, more deserving people to defend, and more constructive things to care about than defending the reputations of Chuck Norris and Roy Moore, because it reflects on what kind of a person you really are.
In Sex Crimes and Other Cases, Roy Moore Often Sided With Defendants:
https://www.nytimes.com/2017/11/17/us/roy-moore-judicial-rec...
>Roy S. Moore, the cowboy-hat-wearing Republican running for the Senate in Alabama, had been known for years as an attention-grabbing judge who blocked same-sex marriages and insisted on displaying the Ten Commandments in his courthouse.
>But as women have accused Mr. Moore of sexual misconduct and assault in recent days, and as politicians in both parties have urged him to step aside, attention has focused on another part of his judicial record: his writings on rape and sexual abuse.
Senate candidate Roy Moore's accuser: I was a 14-year-old child
https://www.bbc.com/news/world-us-canada-42054780
>A woman who says Alabama Senate candidate Roy Moore abused her when she was a 14-year-old girl has described the alleged encounter.
>Leigh Corfman told NBC News that Mr Moore, then a 32-year-old prosecutor, "seduced" her at his house in 1979. [...]
>Ms Corfman originally told the Washington Post how she was approached by Mr Moore outside a courthouse in Etowah County in 1979.
>She said she had been sitting with her mother on a bench awaiting a child custody hearing in her parents' divorce case.
>Her mother, the newspaper reported, was delighted when the assistant district attorney offered to sit with her daughter outside to spare her having to listen to the court proceedings.
>In the coming days, Mr Moore allegedly picked up Ms Corfman around the corner from her home, and drove her to his house in the woods where he sexually assaulted her, the newspaper claims.
>Ms Corfman told NBC's Today show: "Well I wouldn't exactly call it a date, I would call it a meet. At 14 I was not dating.
>"At 14 I was not able to make those kinds of choices. I met him around the corner from my house, my mother did not know.
>"And he took me to his home. After arriving at his home on the second occasion he basically laid out some blankets on the floor of his living room and proceeded to seduce me, I guess you would say."
>She alleged that he removed her clothing and stripped to his white underwear before molesting her and trying to get her to touch him.
>"At that point I pulled back and said that I was not comfortable and I got dressed and he took me home," Ms Corfman said.
>"But I was a 14-year-old child trying to play in an adult's world and he was 32 years old."
Roy Moore accuser: I got him banned from the mall. Eight women have accused Moore of sexual misconduct:
https://abcnews.go.com/Politics/roy-moore-accuser-banned-mal...
>An Alabama woman who has accused Republican U.S. Senate candidate Roy Moore of sexually harassing her in the late 1970s said he was banned from the mall where she worked after she complained about his repeated, unwanted advances toward her.
>“I went to my manager and talked to him about it and asked him, basically, what could be done,” Becky Gray told ABC News late Wednesday night. “Later on, he…came back through my department and told me that [Moore] had been banned from the mall.”
The web solved distribution, and that's why the web won. Click on a link, see a page. Click on a link, open an app. The alternative was downloading a .exe of questionable origin over an unreliable dial-up connection. App stores have become hugely successful since, and if Microsoft had invented the mobile app model in the 90s (sandboxing and a one-click install process, like Flatpak on Linux), that model would probably have won instead of the web app. The web was incredibly buggy and horrifically slow for two decades, but there was no alternative, so it won by default.
> I love how forgiving the web is
I have the completely opposite opinion. This "forgiveness" comes at a cost:
- unexpected behavior instead of an early crash
- hard to treat the platform seriously for mission critical tasks
- makes it common to have many ways to solve the same problem
Exactly, no bank or payment system would ever offer services on the internet. How can a store operate in this environment! Never gonna happen. Impossible for mission critical tasks.
The person you’re replying to said “hard to consider”, they didn’t say it was “impossible for”. There is an infinity of difference.
Presumably banks and payment systems aren’t using web technologies (one would hope) to do the actual payments and transfers. They use them as an interface to other systems written in other languages. And most of them tend to push you hard to use their apps; bank websites are often subpar.
Pigs can fly, given enough thrust.
> And the web is fast, like really fast in rendering highly markedup text layouts. Just because everyone uses a frontend framework for “maintainability” doesn’t mean the engine is slow.
Yes and no. Many non-trivial things become very slow if you stick to naïve approaches with larger numbers of visual elements. It bites hard, because even the naïve way of doing things takes quite some time for complex UIs, and very often the roadblock forces you to redo most if not all of the work.
I don't love how forgiving the web is. We would all spend far less time debugging painful HTML/CSS/JS issues if the web were stricter. I think several billion man-hours would have been saved and human civilization would probably be more advanced. Also, health-insurance costs would be reduced and web-developer lifespans likely increased, thanks to lower blood pressure.
(I don't think the Rust analogy really holds. npm is worse than Cargo. Of course things could be better - a blessed standard extensions library, `stdx`, with a quarterly update cycle would solve the crate nightmare.)
This was a good read, but the author is mistaken: chucknorris isn't rendered as red, red is rendered as chucknorris.
This reminded me of one of my first web-dev projects as a teenager - the first one I showed my (recently passed) father, in which you were prompted to ask "Chuck Norris" a yes or no question with a text box, and then would present you a yes or no answer with a suitable photo of the man himself.
I did some rudimentary string parsing on submit, and if the question started with a word like "Where", "How", "Why", "Who" and a few other words that signified the question couldn't be answered with Yes or No, it would show an angry photo of his face with the caption "Your question has angered chuck!" - I think I also gave it a 1% chance to randomly roll that result regardless.
My dad absolutely loved that little project, and reminded me of how funny he found it even this year shortly before we lost him, almost 2 decades after I'd made it.
Sorry this wasn't incredibly related to your submission, but I wanted to share a happy memory you just brought back to me with it.
This reminds me of the early web. So many fun websites like this.
Now it's just silos controlled by megacorps, trying to outdo each other in competition for attention.
Well yes and no. I assure you, many many forums exist outside of Facebook, and mailing lists too.
And endless, classic, non-https sites still exist too.
They are just buried pages deep in Google's search results, especially with the first several pages of scrolling taken up by its almost-always-wrong suggestions.
And it doesn't help that verbatim is completely broken now, and that Google aliases search terms and drops search terms, even when using quotes.
A horrible design, and the result is that all that goodness is harder to find.
It's not siloed, it's Google only
> Chuck Norris isn't a colour.
But the browser is too afraid to point that out.
(Sorry, I'll show myself out)
This means we can treat "o" as "0" (zero), because it gets automatically substituted like that anyway. Same with baobab (#ba0bab), decode (#dec0de), etc. (A quick console check of this is sketched below the next link.)
Interesting. There are also differences between HTML and CSS: an 8-digit hex value becomes a sort of red in HTML, but a mostly transparent pink in CSS (for 8-digit hex numbers, the last two digits encode the alpha value).
Color names are a strange thing; it is like giving names to numbers. I made a game out of it [0].
[0] https://colorguesser.com/
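For anyone who wants to poke at the "o" → "0" substitution and the HTML-versus-CSS split described above, here is a minimal browser-console sketch. It assumes a live page with a DOM; the element and the example word "decode" are just for illustration.

    // Legacy HTML attribute parsing vs. CSS parsing of the same token.
    const el = document.createElement('font');
    el.setAttribute('color', 'decode');      // the legacy rules turn "decode" into #dec0de
    document.body.appendChild(el);
    console.log(getComputedStyle(el).color); // expected: "rgb(222, 192, 222)"

    el.style.color = 'decode';               // CSS refuses it: not a keyword or a valid colour function
    console.log(getComputedStyle(el).color); // unchanged - the invalid declaration is simply dropped

The first log shows the forgiving legacy attribute parser at work; the second shows CSS quietly discarding the same value.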
> How is the score calculated? Oh man, this was harder than I thought.
Color is such a deceptively dangerous rabbit hole. A few months ago I entered a game jam intending to make a small pokemon-like where you collect colors rather than creatures. E.g. instead of fire>grass>water it would be red>green>blue, with randomly generated colors of any shade and about half a dozen 'types'. Trying to figure out the math of that nerd sniped me for a couple weeks, didn't even get started on the rest of the game before the jam ended.
I would love this with a limited, accurate color list. It would be a fun way to learn more obscure colors like chartreuse, ochre, and sienna. I'd particularly like to learn the colors that are likely to come up in literature (e.g., [0]). As it is, I got "Jaycey's Favorite" as one of the colors.
0. https://www.vocabulary.com/lists/141957
That was fun! It would be nice to show my colour next to the result, to compare a little more.
Hah, I played twice and did not meaningfully improve my score. Even after seeing the color I could not match it closely enough, and the first time I got pretty close to most colors.
> I've heard people quip that browsers should be less forgiving and enforce perfection. That allowing jank makes the web somehow 'bad'. I think a perfect web would be a boring web. I certainly wouldn't be here writing were it 'perfect'. It's about making the web work, no matter what we throw at it, and I wouldn't have it any other way.
It's probably less about "perfection" than precluding non-conformance to a standard from the beginning. The tale of imperfect beginnings to standards that haunt the world for decades repeats itself ad nauseam. In short, if one is clever enough to engineer a relatively future-proof standard, then one avoids (possibly substantial) wasted developer hours.
Note: This is NOT easy and things are sometimes obvious and unforgiving in hindsight.
XHTML tried this approach, but it failed for this and other reasons.
Or maybe it failed entirely for the "other reasons", and this particular part of its design was actually right.
Well, at the time, I do know that the vast majority of complaints I heard from colleagues were that it broke too easily and wouldn't render after that.
I can't remember exactly, but I had a list of unsolvable problems that just ruled it out. I think in one instance I wanted to populate a form after it failed validation but couldn't correctly escape the value attributes.
I believe the biggest design flaw is that the data is the first document it loads. That's not how it should work. The application should load first, then the UI logic, then the data, and then you should get to load other data sets: swap them, filter, join, merge, etc. You would get powerful applications with very little effort.
This was an enjoyable read
https://www.reddit.com/r/AskHistorians/comments/10vfgiq/what...
>I've heard people quip that browsers should be less forgiving and enforce perfection. That allowing jank makes the web somehow 'bad'.
Considering all the misery inflicted by computer crime that's enabled by the forgiving attitude, I 100% agree. And given the choice between "you can still visit the Space Jam website[0]" or "never worry about drive-by ransomware again", I'm pretty sure I'd be with the majority in choosing the latter. Security is a heavy price to pay for whimsy. Old-technology hobbyists could still run old web browsers in sandboxed VMs.
[0] https://www.spacejam.com/1996/
It's a false choice in any case. Web browsers can support backward compatibility and at the same time offer a strict mode that benefits correctness and performance. Some real opportunities were missed to clean up JS semantics when Modules got introduced, for instance.
And they did with XHTML, but unfortunately it didn't catch on.
> Considering all the misery inflicted by computer crime that's enabled by the forgiving attitude, I 100% agree.
Parsers should strictly adhere to a standard; if the standard says they should be "forgiving", and that "forgiving" is well defined, then all parsers should act this way. Inconsistent behavior between implementations opens its own can of worms that may leave a system in an inconsistent or insecure state.
I don't see how making browsers less forgiving would reduce drive-by ransomware. I don't think most browser vulnerabilities are related to the web being forgiving.
Have you seen how big the HTML spec is?
https://html.spec.whatwg.org/
A huge part of this complexity exists because of the forgiving attitude. With a strict attitude the spec could be much simpler, and that means web browsers could be much smaller. The most secure code is no code.
Vulnerabilities are related to the web being complex, and just this one quirk involves a parsing algorithm with 6 transformations of a user-provided value.
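For a feel of what those transformations look like, here is a deliberately simplified JavaScript sketch of the legacy colour parsing rules. It skips the named-colour table, the 3-digit "#rgb" shortcut and the surrogate-pair replacement step, and parseLegacyColour is just an illustrative name, not a real browser API.

    // A simplified sketch of HTML's "rules for parsing a legacy colour value".
    function parseLegacyColour(input) {
      let s = input.trim();
      if (s === '' || s.toLowerCase() === 'transparent') return null; // parse error
      s = s.slice(0, 128);                          // keep at most 128 characters
      if (s[0] === '#') s = s.slice(1);             // drop a leading octothorpe
      s = s.replace(/[^0-9a-fA-F]/g, '0');          // non-hex characters become zeros
      while (s.length === 0 || s.length % 3 !== 0) s += '0'; // pad to a multiple of three

      const third = s.length / 3;                   // split into three equal components
      let parts = [s.slice(0, third), s.slice(third, 2 * third), s.slice(2 * third)];
      parts = parts.map(p => p.slice(-8));          // keep only the last 8 chars of each
      while (parts[0].length > 2 && parts.every(p => p[0] === '0')) {
        parts = parts.map(p => p.slice(1));         // strip zeros shared by all three
      }
      parts = parts.map(p => p.slice(0, 2));        // finally, the first two chars of each

      const [r, g, b] = parts.map(p => parseInt(p, 16));
      return `rgb(${r}, ${g}, ${b})`;
    }

    console.log(parseLegacyColour('chucknorris')); // "rgb(192, 0, 0)" - a blood red
    console.log(parseLegacyColour('grass'));       // "rgb(0, 160, 0)" - a grassy green

Counting those steps up gives roughly the half-dozen transformations the parent comment refers to, all applied to a value an author may have simply typoed.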
I think the choice is more "writing a browser is a weekend project" vs "writing a browser is years of toil and you'll need a local copy of the internet to test it comprehensively"
Unintended rubrication, perhaps? – https://gwern.net/red
Thank you. This happens to be invaluable to me!
> After all, in a perfect web, "chucknorris" would just be another error message
Chuck Norris has only ever received one error message. He stared the computer down until it apologized and fixed the problem.
> The web is built on this foundation of resilience, both in technology and ethos. It's what allows a website from 1996 to still render in a modern browser. It's what lets a page load even when half the CSS is invalid. It's what makes it magic.
The concept of graceful degradation in web feature support is dead now. Presently, most of the web fails to render at all if you don’t execute, in full, a giant JavaScript blob that is responsible for putting the words on the page. It’s quite sad.
The parsing outline in the article omits that there are also 140 hard-coded color names in HTML: https://htmlcolorcodes.com/color-names/
The included codepen is unreadable on mobile until you remove -webkit-text-stroke from the CSS.
At the initial value of 0.5rem it gives a quite cool effect, as if the text has been badly redacted with a white highlighter. I initially thought that was intentional.
Works fine on Android too, with Firefox.
Works fine on iOS. It basically adds a white scribbly outline to the text which makes it easy to read the black text regardless of the background color.
It actually looks fantastic for such a low effort effect.
"smurf" gives a nice #0000f0, ain't that funny?
Back in the day I used a lot of d3.js for making visualizations.
To this day, my favorite color for backgrounds and secondary text is still d3d3d3.
> If an octothorpe (#) is located at the start of the value, it's removed.
An “octothorpe”! I never knew it was called that (and, apparently, neither does autocorrect). What a glorious name!
I wonder why it ended up being "hash-tag" on social media when it's quite common for North Americans to refer to the symbol as the "pound symbol"?
No idea, but it used to cause confusion that on a British keyboard Shift-3 produced the £ symbol, but a US keyboard produced the # symbol - while in the respective countries they were both referred to as "pound"
Perhaps some British influence somewhere? Where were # tags first popularised in the public imagination? Twitter I assume? Or something else?
---
Edit to add side-observation:
"Octothorpe" I quite like, but I've never heard it spoken and mostly only read it in quite nerdy internetty contexts. Even amongst nerds I don't think we really use it that much.
Meanwhile @ is occasionally referred to as "commercial at" and I've seen one or two uses of "ampersat" for it. Again, I don't think I've ever heard anyone use those in speech even though they might actually be useful for disambiguating the symbol from the word (but I think people say "at sign" when they want to be specific).
---
Another passing thought:
I remember being exceedingly irritated to get a Mac with a UK keyboard in the early 2000s and discovering that the # symbol was not shifted to another key (as on a PC keyboard) but had to be accessed via the Alt-gr key combo (possibly with shift thrown into the mix as well?) ... as a recovering C++ developer at that time it was super annoying.
Nowadays I have a Swedish keyboard and while # is fine, I have similar pain with {} and [] ... and now that I think of it, what is the proper name for what I exclusively call "curly braces"?
> I have similar pain with {} and []
in grade school I believe we were told those were "braces" and "brackets" respectively
> what is the proper name for what I exclusively call "curly braces" ?
Sideways moustaches.
Some people called it the hash symbol, and then they used it to mark tags (as in categories or keywords). So technically, the symbol is not the hashtag but the entire #tag construct is.
Perhaps influence from programmers, who would know the real "pound sign" is either ℔ or £, depending on what type of pound you prefer, and the number sign is №.
£ is a stylised L with a bar, incidentally.
Yeah, it’s an interesting one. I’ve always known it as the hash symbol from back in the 80s when I got my first microcomputer. This was in the UK but I don’t think hash is UK-specific terminology: “shebang” (“hash bang” for #!) has been common parlance amongst the UNIX-crowd since I don’t know when.
I know pound is incredibly common in the US, but in the UK it’s never referred to that way because here pound means £.
“Octothorpe” is entirely new to me though, and I guess is quite context specific?
> “shebang” (“hash bang” for #!)
I think the etymology of "shebang" isn't clear, some sources say it may come from "sharp bang" or "shell bang".
http://catb.org/jargon/html/S/shebang.html
Yeah, I don't think I even heard "pound" for # until I went online and started getting exposed a lot more to US stuff (also harking from the 80s UK micro world).
I feel like "octothorpe" might have been referenced in the jargon file or similar texts?
I checked, and it gets a passing mention (along with a snipe about annoying Britons!) in the Jargon File and the Hacker's Dictionary.
A while ago I made this app that lets you pick a color and find an English word that gives a similar color: https://g-dv.gitlab.io/color-namer
Feels like being back in 2012 all over again reading this
The moment I've been waiting for: for a topic on HN to devolve into Barrens chat. This is going to be a great day. Zug zug.
I’d prefer it if the discussion were about how homophobic Chuck Norris is.
I'm surprised this is well defined behavior.
It certainly is now (HTML5 retroactively standardized a lot of quirks), but it wasn't always well-defined: https://scrappy-do.blogspot.com/2004/08/little-rant-about-mi...
I already saw the Stack Overflow reference cited in the article (https://stackoverflow.com/questions/8318911) many years ago, so I wasn't surprised.
But I guess I can still be disappointed. Postel's Law has its limits.
Postel or Hyrum?
Postel. This is about me thinking it was a bad idea to make it work in the first place.
But the reason people keep it compatible with old versions of browsers is part of Hyrum's law.
Excellent!
"green" is green, but my favorite is "peace" actually.
"red" is blue, and "blue" is red!
<font color="chuck">Also just a name "Chuck" isn't a colour.</font>
This doesn't work with CSS, right? Only the color attribute of HTML.
From the article:
> CSS has its own set of fascinating peculiarities when it comes to handling invalid colour values. Most modern browsers will clamp values rather than reject them outright -– throw rgb(300, -50, 1000) at a browser and it won't fail; it'll helpfully transform it into rgb(255, 0, 255).
Yeah it addresses that towards the end of the article. CSS has different rules for parsing badly formatted colors. Unsurprising given that CSS colors can be in several different formats (hex, RGB, RGBA, HCL etc.).
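If you want to see that clamping happen, here is a quick console sketch (it assumes a live DOM; the probe element is throwaway):

    // CSS clamps out-of-range rgb() components instead of rejecting the declaration.
    const probe = document.createElement('div');
    probe.style.color = 'rgb(300, -50, 1000)';
    document.body.appendChild(probe);
    console.log(getComputedStyle(probe).color); // expected: "rgb(255, 0, 255)"
    probe.remove();

Contrast that with the legacy HTML attribute path sketched earlier, which never rejects anything at all: the two parsers are forgiving in very different ways.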
Could not help myself: Chuck Norris doesn’t see red; red sees Chuck Norris. Sorry, another one: the color red was invented to match Chuck Norris's intensity.
It’s amusing that some of the other (English) word colors match what one would consider suitable or kinda close to those words.
Has anyone created a more comprehensive list of such (unexpected) color words for English and other languages?
This is an old favorite of mine: http://bada55.io/
> Get out of here, < IE9 user! http://bada55.io/img/old-ie-fist.png
Rude.
That was amazing.
well, there's those plus a few more in the article
you can use your name as a semi-unique color to you
mine is crap beige
I can only think of one thing: the computational overhead of this parsing insanity. Given the size and scale of modern web pages, this must amount to a very significant cost.
Highly unlikely. The parsers are written in efficient languages, and nobody is setting <font> color attributes in a loop. Also, this is not how it is done today, and old websites are usually a lot faster than modern ones. I would even argue that parsing CSS's color syntax is way more expensive than this comparatively simple step-by-step algorithm.
> and nobody is setting <font> color attributes in a loop.
You would be surprised! For instance, MotionMark, one of the main web benchmarks, keeps mutating CSS colors directly on (thousands of) elements to animate them, every frame.
Not really. The color attribute isn’t commonly used these days - it’s literally legacy support - and, even on pages where it is used, it’s going to be a tiny portion of the overall markup.
Also worth bearing in mind that parsing markup is fast, and certainly not the reason web pages can feel slow. That’s much more to do with heavy assets, poorly constructed JavaScript, long-running web service calls, excessive DOM manipulation and re-rendering, bandwidth constraints, etc. Parsing the markup on its own isn’t going to be the root cause of a meaningful fraction of web performance issues.
The problem there is that there's a lot of sanity checking happening in browsers; I hope there will be a new technology where things like HTML, CSS and JS are all precompiled / pre-verified, so the page is executed as-is with minimal sanity checking.
There was a post yesterday about Python's random() vs randint; the former just spits out a random number, while the latter runs a list of sanity checks on the arguments passed, whose cost adds up over many invocations. But that's a runtime sanity check; if that could be done beforehand with e.g. a stricter type system plus a "let it crash / fail" attitude at runtime, all that sanity-checking overhead would be gone.
I mean this script / at-runtime checking is fine during development if it means shorter iterations and whatnot, but in production it shouldn't happen.
Hi,
I work on CSS parser performance, and error handling isn't really a big pain point; if you removed all error handling overhead, you would not be likely to notice any real performance increase in your web page loading. Most of the time you just hit the happy path anyway, and the error checks along it are an easily predicted branch.
A precompiled format (i.e., binary) _may_ help (I don't think anyone has really considered it), but “pre-verified” means it would come down to who you trust to do that verification, so it's a hard sell.