friggeri 5 years ago

I'm very uncomfortable with the current trend of trying to get big tech to self-police. They'll never get it right because of moving goalposts, and moreover I don't want a private company to have that kind of power to shape social discourse.

If we as a society deem that type of content to be improper and that it should not be allowed on such platforms, then let's make that explicit from a legal perspective and regulate them.

  • Macross8299 5 years ago

    >let's make that explicit from a legal perspective and regulate them.

    Most people would also be pretty uncomfortable with the government vicariously deciding what they can say through Facebook. I'm sure it would cut both ways, as well. It's not a stretch to see a president using an "anti-fake news" provision to suppress damaging news about one of his scandals spreading on social media.

  • daenz 5 years ago

    >that type of content

    Therein lies the problem. The video was slowed down and the audio slightly distorted. I don't see a set of metrics that can capture that "type" of alteration that doesn't also capture tons of content. Unless, of course, you go down the route of saying "it's illegal to post any altered video or image of any politician."

    • roenxi 5 years ago

      > Therein lies the problem. The video was slowed down and the audio slightly distorted.

      It isn't unusual to misrepresent a politician's message. It certainly isn't unusual for partisans to refuse to fairly represent their opponent's central point - whether through a sense of mischief, because they just don't understand it, or out of a genuine belief that there are more important issues at stake.

      This video is clearly a low blow, and deceptive, but it isn't at all obvious that it is different from lying about something that Nancy Pelosi said. 'Solving' that in any way will likely cause some spectacular partisan fireworks when a difficult case comes by. Whoever gets control of the body that gets to choose what is fact and what is fiction is going to have disproportionate power.

    • raxxorrax 5 years ago

      That standard would also not be met by traditional outlets. Cutting videos to underline the message of a text is pretty common. This is not a new problem.

    • devoply 5 years ago

      I don't see any such law surviving a First Amendment challenge. That's why the modern censorship for the right reasons, in support of centrists, is relegated to private corporations that are not beholden to the US Constitution.

      Ultimately the fact is that freedom of speech does not actually work if people have actual freedom of speech. It only works under certain circumstances where most of the media toes the line with the political establishment and is relatively controlled.

      • raxxorrax 5 years ago

        I disagree profoundly. Free speech only "works" if it isn't free? Would need additional arguments for that...

    • sgt101 5 years ago

      What would be the problem of making it illegal to alter video or images of politicians?

      • cr1895 5 years ago

        It plainly violates the 1st amendment.

        • AlexTWithBeard 5 years ago

          This and also sometimes a video has to be edited to make it watchable: remove the pauses, cut long sentences a bit...

      • oconnor663 5 years ago

        If a politician gives a 20 minute speech, and I edit together a 1 minute summary of it, is that "altered"? Does it matter whether I chose the clips to make the politician look foolish? Who gets to decide what counts as foolish?

      • mc32 5 years ago

        Throw parody and mashups out the window?

        Ban SNL, ban The Daily Show?

  • Yoric 5 years ago

    Well, many believe that "big tech" (social networks and search engines, I'd say) have essentially replaced traditional news networks. I don't have a clear opinion on this yet, but if that's the case, it makes sense to regulate "big tech" in the same manner as traditional news.

    • jpollock 5 years ago

      In the US, TV news is regulated because it uses public frequencies.

      Newspapers publish lies and propaganda all the time. Ref any supermarket checkout magazine.

      • qrbLPHiKpiux 5 years ago

        Social media takes anyone’s message right to the bottom - the end user - with no editorial gatekeeper.

        Today's problems with tech aren't tech problems. They're all human problems. People have forgotten, or never learned, how to sort through bullshit. If people really rely on Facebook to form the opinions they take into the voting booth, America, we have a problem.

        • raxxorrax 5 years ago

          An editor is certainly neither required nor wanted. If you like that, then please read established newspapers and let people have their fun.

          • codyb 5 years ago

            I feel like "having their fun" is a disingenuous way to describe discourse that touches billions of people and has contributed to some awful real-world outcomes around the globe: measles outbreaks, discrimination, damaged self-worth, even murder.

            • raxxorrax 5 years ago

              I don't really see any evidence that would establish a lack of content moderation as the root cause of these seemingly randomly selected issues.

              • codyb 5 years ago

                Are they randomly selected if they're all issues for which a root cause can be traced to ingesting Facebook content?

    • amelius 5 years ago

      Yes, and big tech should similarly be regulated like traditional telecommunications, where subscribers of telco A can call subscribers of telco B without problems (unlike most chat services).

      • zaroth 5 years ago

        This seems like a terrible idea which would merely entrench monopolies further.

        Phone number interoperability was meant to establish an unbranded phone network that could be serviced by any telco. Making every service deliver messages to and from Facebook would be an absolute nightmare.

        • MereInterest 5 years ago

          How would requiring use of an open API entrench a monopoly? If I could send messages from IRC to Facebook, that would be one less reason to have a Facebook account.

        • mises 5 years ago

          Phone lines are very expensive, and there is limited space. In other words, phone interoperability was regulated into place because no one can have five telephones. Many people have five messaging apps. It's inconvenient, but not in the way a phone would be.

        • amelius 5 years ago

          Why would every message need to end up on Facebook?

          Perhaps you should think of it more as e.g. e-mail.

          • gingabriska 5 years ago

            Economies of scale. Whoever can do it cheaper and at scale gets the job.

            • username444 5 years ago

              Perspective is important. Messaging is already so inexpensive as to be free, it's time to move towards the other end of the pendulum.

              We need privacy and accountability more than the ability to further reduce the cost of a message from $0.0001 to $0.00005

      • Yoric 5 years ago

        What do you mean?

        • fragmede 5 years ago

          I can't message my friends on Snapchat from Instagram, but I can call my friends on T-Mobile and AT&T from Verizon. Phone company shenanigans go back a long way, so it took more than a few lawsuits to get calling to its current state.

          • mises 5 years ago

            That's a nonsensical comparison. Phones all have one purpose and mechanism: voice calling. Instagram is based around a public photo-blog. Snapchat is ephemeral photo-messaging. You would be requiring social media sites to replicate each other's functionality. If I want to send a message to a Twitter user, can it be longer than 140 characters? What if I am john.doe on one platform but you have the same name on another? Who keeps it?

            No, telcos were interoperable by fiat because there can only be so many phone lines.

            • mindslight 5 years ago

              It's quite amazing how the purported "free market" viewpoint has unwavering faith in companies to get things right, except when it comes to imagining compliance with any sort of mandate.

              A good faith interpretation of interoperability is that any service simply has to publish an equivalent API that allows interacting with "its" users in the service's own data model/paradigm. The market of competitors will then happily sort out the details of how to smooth the gaps.
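
              As a toy sketch of what "publish an equivalent API" could mean in practice (every name and type here is hypothetical, purely for illustration):

                  # Each service exposes its own data model; third-party bridges
                  # adapt between models, so the mandate itself stays minimal.
                  from dataclasses import dataclass
                  from typing import Iterable, Protocol

                  @dataclass
                  class Post:
                      author: str
                      body: str          # whatever the service's native unit is
                      ephemeral: bool    # e.g. expiring story vs. permanent post

                  class InteropAPI(Protocol):
                      def fetch_posts(self, handle: str) -> Iterable[Post]: ...
                      def deliver(self, handle: str, post: Post) -> None: ...

                  class Bridge:
                      """A competitor's adapter that smooths the gaps between services."""
                      def __init__(self, service: InteropAPI):
                          self.service = service

                      def relay(self, sender: str, target: str, text: str) -> None:
                          # Truncation, identity mapping, etc. are the bridge's problem,
                          # not something the mandated API has to solve.
                          post = Post(author=sender, body=text[:500], ephemeral=False)
                          self.service.deliver(target, post)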

              • mises 5 years ago

                > "free market"

                Why the "scare quotes"? I know you weren't quoting me since I didn't use those words.

                I don't trust a free market to get everything perfectly right. I simply trust a government even less and believe they will screw it up even more.

                • mindslight 5 years ago

                  Are any quotes considered "scare" quotes these days? I used quotes because the philosophy is often referred to as such, but it generally gains legs only when it can be used to undermine market competition.

                  I describe myself as libertarian, and if you want to make the argument that regulation will be transmuted by the incumbents to calcify their own positions, I'm right there with you. But it's wholly disingenuous to assert that services could only be obtusely constrained to the lowest common denominator, rather than considering a sensible system of interoperability that would align incentives properly. Putting it in libertarian terms - instituting a demarcation point such that users' electronic data and identities actually belong to the users.

                  Personally I think the real answer is that we need to self actualize and move away from these surveillance companies, period. And in that context yes it is a worry that regulations could be applied to harass down Free software. But given that we're staring down a much worse censorship regime, from both de jure and de facto governments (eg payment processors), I think we're likely to end up needing to avoid packet censorship anyway. A few additional business-consumer regulations aren't going to appreciably worsen that burden.

                  • mises 5 years ago

                    Having government step in to fix things runs contrary to libertarianism. Firstly, whether "your data" is yours I would dispute. At most, it's produced by interactions between you and a company's software or server, so you own no more than half. You exchange that for free services. You also brought up two entirely different points: making services interoperable and making data produced by users belong to users. I addressed the data one, so next is interoperability.

                    I would say that indeed, we will end up with the lowest common denominator of services, because the services are all different. Also, have you thought about how giving all these companies more access to data would impact the whole problem? Regardless, my argument is more that I see no clause in the Constitution that justifies such an action, and certainly see no justification in morality.

                    Lastly, why are you writing in such a manner? It's clear you're trying to sound smart, but it just comes across as pompous. Just make good points. There's enough jargon on this site without people deliberately adding more. Also, a "demarcation point" does not convey what I think you wanted it to ("law"). Furthermore, you cannot "harass down" anything; that's bad English. Based on a quoted search, it looks like a piece of jargon specific to League of Legends gamers. I wouldn't pick on such things but for the insufferable tone. You've got some nice phrases, they're just in the wrong sentences.

                    • mindslight 5 years ago

                      Maybe I'm the output of machine learning, just mushing together popular online phrases. Or maybe I was just a tired human when I wrote that. "harass down" was merely a bad edit - it was "shut down", then I thought "harass" worked better. Sorry.

                      Responding substantively, the difference between a "government" and a corporation to which everybody has involuntarily contracted is in name only. In fact, much of the government overreach in the US isn't based on pure mandate, but rather justified by needing to "opt in" in some way - eg license plates are justified with the fiction that you're choosing to drive on government-administered roads.

                      We don't say that just because you have the choice to move out of state, individual states are libertarian-justified in making whatever laws they'd like. "Migrating" between online services today feels akin to the friction of moving between states - moving away from your social circle. And similar to how the states generally follow national leads, major online services have begun acting in concert, as in this current censorship push.

                      As I acknowledged, the productive libertarian answer is to seek to change the conditions that give rise to unnecessary power structures in the first place. But when addressing the situation as it stands, giving corporate power a pass is ultimately just deprecating individual freedom. This sector's companies are in the process of mingling with governments regardless (government gives regulatory capture, companies give enforcement legibility), whether or not the interests of the Individual are represented.

            • JetSpiegel 5 years ago

              > Instagram is based around a public photo-blog. Snapchat is ephemeral photo-messaging.

              > you would be requiring social media sites to replicate each other's functionality.

              This is ironic, considering Facebook's Instagram has already copied Snapchat's functionality wholesale without government intervention!

            • chillacy 5 years ago

              I'll add that laws like that are usually well-intentioned but make no sense 30 years later, when the law still exists and everyone is wasting time implementing something from the past. Like the IE6 browser requirements in South Korea.

          • tqi 5 years ago

            I think what you're proposing is essentially the Graph API?

  • mc32 5 years ago

    Especially with regard to politicians, regulating speech in cases like this video treads right on parody.

    Do they ban SNL impersonations too, late-night TV mashups? No more Dana Carvey, or Alec Baldwin and Tina Fey. Is that what they want?

    • kpU8efre7r 5 years ago

      Does Alec Baldwin ever try to pass as actual Donald Trump or is it obvious that it's parody? The Pelosi video was made with the malicious intent to deceive.

      • mc32 5 years ago

        Some late-night hosts have done mashups, not just skits.

        Would labeling this on FB as parody be sufficient? Should raw footage be labeled as such? What about taking quotes out of context?

        What about news outlets that edit and editorialize? What about political campaigns that pull out only the bad in an opponent and highlight only the good in their own candidate?

        What about news orgs highlighting something as a problem when it isn't? Maybe animal content in food that is actually within regulatory limits and isn't a threat to public health. Or what if they highlight the plight of a particular political prisoner when, in general, that country treats people well and doesn't otherwise have political prisoners, but the regime slighted the news org so they decide on a "smear" campaign?

        Where do we draw the line? Especially with regard to political discourse, we ought to grant a wide berth.

      • bensonn 5 years ago

        7 in 10 voters thought Palin said she could see Russia from her house, but it was Tina Fey on SNL.

    • thrusong 5 years ago

      The Nancy Pelosi videos tread on propaganda. It's being used to convince people "Wacky Nancy" or whatever Trump calls her is not fit for office, using actual news clips. It's clearly different from parody provided by a comedian on a show known for being a sketch comedy show.

      • mc32 5 years ago

        The same could be claimed of Kimmel videos on late night, but few (maybe some Trump supporters) clamor to shut those down.

      • zaroth 5 years ago

        It’s always propaganda when it’s pillorying someone you like, and parody when it’s pillorying someone you don’t.

        If the video would have been right at home on late night television, it should be right at home on Facebook.

        Do you think the late night shows aren’t being used to convince people to feel a certain way about Trump?

        This is the shoe on the other foot, and if it doesn't feel so right, the wrong lesson is to blame the foot.

        • moorhosj 5 years ago

          Comparing two things doesn't automatically make them the same. When you see a video on a late-night comedy TV show, you know you are watching a late-night comedy TV show. You even opted in by turning on that particular channel/show, and the schedule of when that content will air is published.

          Facebook is a mix of real news, fake news, and propaganda videos; it isn't nearly as clear.

          • mc32 5 years ago

            People [on the left] claim the Daily Show (and the like) are how the newer generations get their news... so your claim isn't as clear-cut as you imply.

            • moorhosj 5 years ago

              People claim lots of things, that doesn’t make them so. Watching the Daily Show is a programmed event on a comedy channel that people opt-in to. Something popping up in your Facebook feed because a friend posted it or an algorithm chose it is pretty clearly different.

              • mises 5 years ago

                Clips from such comedy shows pop up out of context in the recommended bar on YouTube. Did I turn on a channel?

                • moorhosj 5 years ago

                  And those clips clearly show the channel’s logo. So yes, you know what channel it’s from. Either way, this is a discussion of Facebook.

      • golergka 5 years ago

      I think that you're right in that statement, but I also think that this is an editorial decision, not a technological one. If Facebook takes it upon itself to make such a decision, then it acts as an editor of its users' content and should be liable as an editor.

        Not as a communication platform.

        • vuln 5 years ago

          Facebook curates and removes content all of the time. They are 100% an editor and should have their safe harbor revoked as they can and do control what is posted on their platform.

    • tus87 5 years ago

      Sudden break out of common sense on HN. What happened?

  • AlexandrB 5 years ago

    > I don't want a private company to have that power in shaping social discourse.

    They already have that power - they've had it for a while. All that's being debated is how they should use it.

  • TimJRobinson 5 years ago

    I just want more control over what appears in my feed. If all I saw were posts from my friends I never would have come across this video, but instead my feed is a combination of celebrities, pages, promoted content, etc. I could unfollow everything outside of my friend group, but then Facebook still shows random posts that friends (or sometimes barely even acquaintances) have liked or replied to.

    I rarely use Facebook anymore anyway; most of my chat happens in private groups or on Scuttlebutt, because at least there I have some control over what I see.

  • mgoetzke 5 years ago

    If it was a Disney movie being shared, they would be all over it. It would be down in minutes.

  • slg 5 years ago

    In the US that would require a constitutional amendment to alter the First Amendment. The chances of that happening are infinitesimally small compared to companies being shamed into doing the "right thing" through public pressure.

    • gjm11 5 years ago

      Until quite recently the FCC had a "fairness doctrine" requiring broadcasters to treat politics evenhandedly. That seems far harder to reconcile with the First Amendment than a "no faked propaganda videos" rule. The Supreme Court explicitly said it was OK.

      The US has truth-in-advertising laws, which forbid advertisers to make certain kinds of deceptive claims. I'm not sure whether the Supreme Court has explicitly considered whether they're constitutional, but it's certainly heard plenty of cases related to those laws and never taken the opportunity to say "oh, and by the way, laws against false advertising are unconstitutional".

      If there's some reason why a law forbidding the use of faked/manipulated videos for political propaganda would be more inconsistent with the First Amendment than those, I'm not seeing it at all.

      • AmericanChopper 5 years ago

        The FCC rules were conditions of broadcast licences, and truth-in-advertising laws relate to the prohibition of fraud. Neither of those is particularly relevant to concerns around free speech.

        Political propaganda on the other hand is entirely protected by the first amendment. The term propaganda covers nearly all forms of political speech.

    • rickycook 5 years ago

      I don't think it would, tbh... the US already places limitations on free speech: you're not allowed to advertise using false information, the "fire in a crowded theater" type thing.

      You might say that it's a slippery slope, but I'd argue it's likely (not that I know for certain) that there was just as much angst about those limits, which are now seen not only as common sense but as necessary.

    • mgoetzke 5 years ago

      I already can't post videos of movies I watched on my home TV, though. Hollywood is all over that already. Absolute freedom of speech does not exist anyway. And this is not "speech": it is not someone talking about a politician, it is someone manipulating something they stole and misrepresenting it.

      • raxxorrax 5 years ago

        I would like to see it anyway.

  • MarkMc 5 years ago

    You make it sound like the "current trend" is a new phenomenon, but before the rise of Facebook, newspaper and TV companies were self-policing, and it was essentially impossible to publish a message to millions of people without going through some sort of editorial filter.

    The new trend is that someone with a few thousand dollars can tell 10 million people that the Pope has endorsed a particular politician and a million people will believe it. I'd like to think that we are not so easily influenced, but then I find myself clicking on a screenshot I created just 30 seconds earlier and I realise the human brain just isn't designed to be constantly alert to fakery.

  • raxxorrax 5 years ago

    > If we as a society deem that type of content to be improper

    I don't think there is any consensus on this.

  • beaner 5 years ago

    Also the internet has always been this way and it's never been a problem. It was only once Trump was elected that the media started to promote the idea of censorship.

Nition 5 years ago

I don't fully understand why people think Facebook should be policing content, yet no-one thinks the phone company should delete SMS messages that are spreading misinformation, or the postal service should destroy false mail etc.

Is it because Facebook makes it much easier to share something with a wide audience? Is it because "the algorithm" means Facebook has more responsibility than if it were a simple chronological feed?

  • sgt101 5 years ago

    Facebook alters and manages the information flow to drive its commercial agenda. When it started doing that it lost the defence of being a neutral distribution platform like the mail service.

    • amelius 5 years ago

      But people clearly want to see what's popular in their own bubble. Facebook just gives people what they want.

      • sgt101 5 years ago

        Like heroin dealers? Just because people want something doesn't mean that it can be part of a stable society.

        • trickstra 5 years ago

          At this point, deciding what is or isn't part of a stable society is just anyone's opinion. We don't have much science describing which types of information make society stable or unstable, we can't even tell those types apart, and we don't have any science showing that living under censorship is better for society. So comparing the popularity of fake news with heroin is a stretch.

          Why are we focusing on the platform anyway? We know who doctored the video, we know who published it to facebook... why are we not punishing the real origin?

        • amelius 5 years ago

          I don't think the "data addiction" angle has much to do with the problem that is being discussed here.

  • scotu 5 years ago

    The algorithm is probably the biggest element of it. They are already deciding what you are going to see and what you won't, so they have put themselves in a position a lot like an editor's.

    On top of that, a phone call or an SMS is one-to-few (mostly one-to-one), while Facebook posts are one-to-many, which makes the potential damage far greater...

    • amelius 5 years ago

      Yeah, but a personal Wordpress website also is 1-to-many. Where will this end?

      • kaibee 5 years ago

        Facebook is a monopoly. There is an impossible inertia to starting a Facebook competitor because they have the network effect on their side (not to mention the money). Their monopoly is basically the same as rail monopolies, only on the social connections between people. As another user said here, they should be required to provide basic interoperability between competing social networks and to let you transfer your account and data to a competitor. Kind of like a federated model. Then Facebook would have to compete on features, instead of simply relying on the monopoly of "this is where all of my friends are". AT&T is required to interoperate with T-Mobile, etc. The same should be true for Facebook. Trying to equate Facebook and a single WordPress site is just a slippery slope fallacy.

      • AmericanChopper 5 years ago

        But you’re criminally responsible if you use your personal Wordpress site to host illegal content, or use it to facilitate a crime. Facebook would say they aren’t criminally responsible for how their users use their site, because they’re just a common carrier. However they also want to exert a huge amount of editorial control over the content, which could be described as having your cake and eating it too.

        • chillacy 5 years ago

          By “you are criminally responsible” you mean “Wordpress is criminally responsible”? I don’t think the analogy makes sense the other way.

          • AmericanChopper 5 years ago

            If you host your own Wordpress site, and use it to publish illegal content, then you are criminally responsible. Facebook has immunity from being held liable for the content published on its platform, granted by the Communications Decency Act:

            >No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

            The party who publishes the content is liable, not the service that distributes it.

            Often not discussed when this law comes up is that it has a Good Samaritan clause:

            >any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

            The question is whether Facebook, etc... are publishers of content, or mere distributors of content that others publish. Facebook will tell you that they are distributors of content. Their opponents will tell you that the editorial control they exert over their platforms is beyond the scope of the Good Samaritan clause of the law.

  • bryanrasmussen 5 years ago

    I think both. There is a point where an increase in the ease of doing something signifies a categorical change in the activity. And since the algorithm is meant to drive engagement for Facebook's benefit, without thought to harm, people may think Facebook has some sort of responsibility to reconsider that algorithm and mitigate the harm.

  • pimmen 5 years ago

    Yes, the algorithm makes all the difference. I've missed the news that two friends were expecting children, even though they posted it on Facebook, yet I see all kinds of conspiracy crap from my closest relatives. If this were done through mass SMS it would be different.

    You don't have to remove the content either; just lower its priority dramatically compared to content about family updates. If Facebook is supposed to be about connecting people, stop radicals from using it as a cheap publishing outlet. If Facebook is a news publishing platform, Mark Zuckerberg is the editor in charge. He has to pick a side, because it's clear this is not sustainable.

  • Miner49er 5 years ago

    People aren't calling for FB to start policing their content; they've been doing that since forever. FB has never been a neutral platform.

  • ptah 5 years ago

    If the postal service could look at people's responses to mail and then intentionally mail out copies of popular mail to more people, then this comparison would make sense.

  • khuey 5 years ago

    tl;dr: yes, it's "the algorithm"

    Facebook (and Youtube et al) have chosen not to be content-agnostic distribution platforms (unlike the postal service and SMS). They perform a variety of editorial functions to curate content and provide recommendations with the objective of maximizing engagement and hence advertising impressions that they can sell. This is popularly known as "the algorithm".

    Given that these platforms have chosen to selectively display content for profit, many people believe that they should have some sort of responsibility for the editorial decisions they make.

    • Macross8299 5 years ago

      Facebook isn't an editor, though, it's a curator. You use the terms interchangeably, but I think there's a crucial difference. An editor modifies a work to inject their own standards into it. A curator just includes or excludes it as-is.

      • sgt101 5 years ago

        Facebook does modify content at the level of the feed: first, it includes sponsored posts; second, it promotes posts to people, leading them to promote those posts to others.

      • jacobush 5 years ago

        Curating on a fine-grained enough level, you are not only an editor but even a storyteller. You can build a narrative by quoting only other people, and it has been done many times in books and movies.

        That this story is driven only by an advanced self-adjusting Skinner box to maximize advertiser revenue is very weird. You never expect sci-fi to be so mundane when it becomes reality, but it always is.

    • ng12 5 years ago

      But "the algorithm" is impartial. Editorialized content implies there's a message being driven.

      • archgoon 5 years ago

        The selection of the algorithm is not impartial. Algorithms are chosen to maximize some quantity for the benefit of the corporation.

        • ng12 5 years ago

          Impartial as in it's not treating certain types of content differently. It's one thing if "the algorithm" decided to push (or censor) an anti-Pelosi agenda across the board, another if it pushes anti-Pelosi content to people who already are anti-Pelosi.

          • archgoon 5 years ago

            The entire point of a content algorithm is that it handles different content differently.

            Regardless, the selection of the algorithm is not impartial. Content is pushed up or down to maximize engagement or whatever metric Facebook has decided it wants to maximize.

  • vneumanarc 5 years ago

    "People" don't. I don't think social media companies should be policing anything. They should be agnostic platforms.

    It's the establishment and the media/news companies that are demanding facebook censor according to their liking.

    And what of "dirty tricks"? There are "dirty tricks" in political ads/content all over TV, newspapers, radio, etc. Hell, CNN/NYTimes and much of the media outright colluded with Hillary and the Democratic party. Fox News backed Trump.

    If "dirty tricks" is this upsetting, then we might as well shut down all media.

  • dqpb 5 years ago

    I for one think it's outrageous that the USPS delivers so much unwanted and dangerous spam mail.

    • chillacy 5 years ago

      A huge waste of resources too. If the price of recycling, cost of growing and cutting down trees, carbon emissions from driving spam, etc aren’t factored into the cost then we’re all paying for it.

ar-jan 5 years ago

Interesting how they're sidestepping the actually rather disfluent speech from Pelosi that prompted questions. Compare https://twitter.com/robbystarbuck/status/1131242531503534080 with the C-SPAN video: https://www.c-span.org/video/?460990-1/speaker-pelosi-presid... (e.g. from around 3:00): not edited.

rlt 5 years ago

A bit ironic coming from Kara, who jumped on social media to condemn the Covington Catholic students before the whole story was out, but at least she admitted her mistake https://twitter.com/karaswisher/status/1087443815269584897

  • Simulacra 5 years ago

    The tech media is just as much about gotcha stories as the rest of the media. Worse, even, because the tech media is absolutely dependent on its access to the tech industry, so it must always be careful to curb its reporting so as not to lose that access.

manfredo 5 years ago

I don't think the idea that companies should host all content except for illegal content is a better approach. Porn is legal, as per the first amendment. Does that mean every social media platform is obligated to allow it? Furthermore it would eliminate the possibility of themed platforms. Things like pixiv and DeviantArt would have to allow content totally unrelated to art.

The whole discussion is moot, at least without a significant reinterpretation of the First Amendment. Remember, the First Amendment also protects against compelled speech. The government cannot force people or companies to make speech they do not want to make. There's some talk about whether making these social media platforms utilities could allow the government to compel speech, but that seems far-fetched at the moment.

  • ianai 5 years ago

    Actually, it’s entirely possible to have different legislation for different sized organizations. The tax bracket for individuals is an example.

    I'm more of the opinion that Facebook should be punished as an example. Don't grow so big that your unilateral action could have devastating effects on society. Don't do things with people's data that remotely rise to the 1984 totalitarian level. See how fast we end up with publicly owned holding companies of regional Wells Fargo/BoA-type banks, Facebooks, credit reporting agencies, etc. It's in line with the concept of having different forms of government at the local, regional, and federal levels.

    • manfredo 5 years ago

      > Actually, it’s entirely possible to have different legislation for different sized organizations. The tax bracket for individuals is an example.

      Still illegal as per the First Amendment. The government cannot compel speech, regardless of company size.

      > I’m more of the opinion that Facebook should be punished as an example. Don’t grow so big that your unilateral action could have devastating effects on society. Don’t do things with peoples data that remotely rises to the 1984 totalitarian level. See how fast we have publicly owned holdings companies of regional wellsfargo/BoA type banks, facebooks, credit reporting agencies, etc. It’s inline with concepts of having different forms of government at the local, regional, and federal levels.

      I'm not sure how any of this has to do with content moderation.

      • ianai 5 years ago

        Facebook has a huge audience. They make it easy for one entity to target a huge number of people with customized content per person. And yet people have some weird propensity to believe anything they see on it. If you instead had to work with several hundred Facebooks across the world to target everybody, it becomes more federated. The local Facebooks would have more knowledge, incentive, and resources to police their content for abuse.

        • ianai 5 years ago

          I can imagine a few ways this could be legally executed. One is to use the 1800s antitrust laws to forcibly break up Facebook, create the smaller entities I described in the process, and say "this is the standard for a social media company." Another would be to endow or reinforce citizens' private data as some form of personal property: give them the legal means to sue and demand payment for any storage of their PI, and make the costs of offenses steep. I am definitely not a lawyer, though, and not as up to date as I should be for this sort of spitballing.

          Edit: I think they could also tie a temporal component and a per-instance cost to storing someone's "PI". That value could then be a basis to tax the corporation and create an incentive to "forget".

          Edit 2: Make a person's likeness and social media posts part of this "PI". This would give Pelosi a legal right over the doctored content.

          • BaronSamedi 5 years ago

            I agree that breaking up these companies is the best option. I can understand the calls for regulation but that is only treating the symptom and will have negative side-effects. The size and position of companies like Facebook leads to a concentration of power which is almost never desirable.

        • chillacy 5 years ago

          I can’t think of any internet companies that naturally federate well. The nature of the internet being one place lends itself to winner take all mechanics, and this makes the world a smaller place geographically and a bigger place ideologically. Now we can chat on HN because we share interests (ideologically close) even if we live in different countries, but we may not chat at all with people outside our circles in our own towns.

      • archgoon 5 years ago

        > Still illegal as per the First Amendment. The government cannot compel speech, regardless of company size.

        The United States Government can't; however, the U.S. tends to have stronger free-speech laws than other countries, and as we've seen with GDPR, other governments (including some democratic governments) may choose to legislate additional restrictions that Facebook might end up just deploying everywhere. Not saying that this would necessarily be a good outcome.

  • rlt 5 years ago

    I completely agree when it comes to small to medium-sized websites, but Facebook is effectively a monopoly. At a minimum they should have clearly defined rules and transparent processes for censoring content and banning users.

    • manfredo 5 years ago

      > At a minimum they should have clearly defined rules and transparent processes for censoring content and banning users.

      They do their best to document this: https://www.facebook.com/communitystandards/

      The simple reality I see is that Facebook has to please the majority of people, but people are reluctant to confront the fact that some of their views are significantly out of line with the majority. So people react negatively when their own content is banned, and also when content they think should be banned is allowed.

      That, and either an unwillingness to accept or a failure to understand the limitations of automated content moderation. There was a post on the front page of Reddit about an artist who got their Facebook page taken down for making KKK-style MAGA hats and Nazi-style MAGA armbands (crucially, with a swastika). It's unambiguously anti-Trump in context, but of course it's going to get automatically flagged when it prominently features items strongly associated with hate speech.

      • gmueckl 5 years ago

        Any content moderation is bound to be imperfect, and at that scale the absolute number of false negatives and false positives is enormous no matter what. Human moderators aren't perfect either. Automated moderation is not always accurate, but it will at least consistently make the same decision for the same content. And if you can use it to hide the most excessive stuff from human moderators to protect their mental health, then I think this is not a bad way to run things.

      • whenchamenia 5 years ago

        It should not matter whether your views are far from the majority's; that is the whole issue.

        • chillacy 5 years ago

          If you have a plan to solve the aspects of human nature that are no longer useful in the modern world, let us know. Otherwise I assume we just have to accept that people are tribal by default, and only a small percentage of people have the time and desire to cultivate otherwise.

  • golergka 5 years ago

    > every social media platform

    The word "every" implies that there are a lot of them in more or less equal competition, but there aren't. The issue here is monopoly: Facebook and Twitter have too much influence.

    Social media should go through a process of federation and become like email and websites, so that every user can consume content hosted by different companies and each of those companies can determine its own rules.

    • chillacy 5 years ago

      You mean like Reddit and Voat? Or as if each subreddit were its own company with some open protocol for cross-posting? Or do Reddit, Twitter, FB, and HN all have to support cross-posting?

_cs2017_ 5 years ago

What should be the rule about what videos are banned? We don't want to ban parodies I suppose? What if the author of the Pelosi video claims it's a parody?

It would be great if we could allow parodies and jokes as long as a reasonable person can tell that this is indeed a joke. But that won't work since people disagree about what's "reasonable".

I guess we can require that parodies and jokes are tagged as such, so people don't confuse them with real stuff. Is that too overbearing?

Are there any better solutions that have been proposed?

  • freewilly1040 5 years ago

    It’s not as though you can use the magic p word and then what you did is parody. What’s parody and what’s not is indeed a difficult line to define, but it’s not a new problem or a problem that we lack prior art for.

    • _cs2017_ 5 years ago

      > it’s not a new problem or a problem that we lack prior art for.

      The prior art I'm aware of comes from copyright, trademark, and defamation law, but the definitions developed there are so vague that it can take a few months in court (and thousands of dollars) to decide whether something is a parody / satire. Here we need a rule that can be applied within hours, at a very low cost. So I'm not sure how we can adapt existing legal definitions to banning videos on the internet.

      Could you describe in a bit more detail what exactly you propose?

  • raxxorrax 5 years ago

    I just want people to accept the fact that not everything you read or see on the internet is the absolute truth. Younger people learn early how to navigate the web, and additional education can keep their data safe. So what is the problem?

    Facebook is too popular? I agree, but that is something everyone has to decide for themselves.

    I do think people who are frustrated about current global politics want to restrict the net in the hopes that people adopt "correct" positions.

  • MarkMc 5 years ago

    My preferred solution is for Facebook to assess whether a video is presented as accurate but has been altered with intent to deceive. If so, it is tagged "This video has been flagged as deceptive. Click here for details".

    Yes, from time to time I might disagree with Facebook about the accuracy or intent of a video, but it would be an improvement on the status quo.

    • _cs2017_ 5 years ago

      Then what would prevent people whose videos were banned from suing Facebook because it didn't judge their intent correctly? And similarly, people appearing in videos that aren't banned could sue because Facebook didn't see the evil intent.

      Such lawsuits can be numerous and costly enough to kill the entire concept of hosting user uploaded videos. How to avoid this?

      Maybe it's better to drop the intent test and just ban any videos that look real but aren't? Even if it means banning some parodies such as this https://youtu.be/cQ54GDm1eL0 Even then we'll need to figure out how to do it at scale, since a human reviewer will often not know if it's real.

      I don't think we can limit the ban to videos where you can see people's faces; you can create quite bad misinformation by showing a military boat of some country sinking a civilian ship etc.

      • codyb 5 years ago

        I'm pretty certain Facebook's terms of use would rule that out, and probably already include a clause saying Facebook can use, share, or remove your content at any time without your express consent.

kneel 5 years ago

Facebook didn't doctor the video; out of all the content they could be blasted for, this is pretty trivial.

This is part of a larger trend in MSM to pry eyeballs away from social media and back to traditional one way media.

  • Traster 5 years ago

    Facebook knew the video was doctored - it was a mainstream news story for days. They chose not to content-ID it, but instead to continue spreading it. That seems deliberate.

    • makomk 5 years ago

      It was also a mainstream news story for days that the White House spread a doctored video which was sped up to make it look like a CNN reporter attacked a White House intern. The thing is, it was trivial to verify that wasn't at all true: if you actually compare the speed with the original frame by frame, it's not sped up anywhere. The artifacts some reporters claimed came from it being sped up are actually artifacts of frame-rate upconversion from the exact, specific 15 FPS GIF its original poster claimed as his source.[1]

      On the other hand, there was a super-viral video about this on Twitter that was really obviously doctored. It "proved" that parts had been sped up by overlaying the video on a 7-frame-behind version of the original. Unless you paused the video, this desync was only noticeable in faster-moving sections, creating the illusion that those sections were sped up. Not one news reporter spotted this. People like Captain Disillusion happily spread it as fact.

      If we based our decisions about what's doctored and should be blocked from spreading and what isn't on the media reporting, it'd mean treating truth as lies and lies as truth.

      [1] Seriously. You can replicate the incredibly simple frame rate conversion (just blending the frames on either side) in a few lines of Python and get every single one of those artifacts out: https://www.makomk.com/2018/11/16/recreating-that-white-hous... From what I can tell this is the default setting in Sony Vegas, which is what the guy used.
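
      As a rough illustration of that kind of blend-the-adjacent-frames upconversion, here's a simplified sketch (not the exact script from the post; it assumes OpenCV and NumPy, and the file names are made up):

          # Naive frame-rate upconversion: each output frame is a linear blend of
          # the two nearest source frames. This is the kind of interpolation that
          # produces the ghosting artifacts described above.
          import cv2
          import numpy as np

          def upconvert(path_in, path_out, src_fps=15.0, dst_fps=30.0):
              cap = cv2.VideoCapture(path_in)
              frames = []
              ok, frame = cap.read()
              while ok:
                  frames.append(frame.astype(np.float32))
                  ok, frame = cap.read()
              cap.release()

              h, w = frames[0].shape[:2]
              fourcc = cv2.VideoWriter_fourcc(*"mp4v")
              out = cv2.VideoWriter(path_out, fourcc, dst_fps, (w, h))
              for i in range(int(len(frames) * dst_fps / src_fps)):
                  t = i * src_fps / dst_fps          # position on the source timeline
                  lo = min(int(t), len(frames) - 1)
                  hi = min(lo + 1, len(frames) - 1)
                  alpha = t - int(t)                 # blend weight between neighbours
                  blended = (1 - alpha) * frames[lo] + alpha * frames[hi]
                  out.write(blended.astype(np.uint8))
              out.release()

          # e.g. upconvert("original_15fps.mp4", "blended_30fps.mp4")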

      • mgoetzke 5 years ago

        How he made it appear does not matter though. The reporter still did not attack anyone.

    • Macross8299 5 years ago

      Why is Facebook obligated to play the role of referee in games of political fact?

      • sgt101 5 years ago

        Because that is the role they have chosen by becoming a non-neutral information distributor.

        • kneel 5 years ago

          The medium is the message; study the medium and its effects on how information is distributed.

          The genie is out of the bottle. Unless you want 1984-style propaganda, you have to accept that media is different from 10 years ago and it'll never be the same.

          • sgt101 5 years ago

            You are right, it will never be the same. However, I reject that the only alternative is 1984. The medium (Facebook, social media) is malleable; the current settings are turned up to "engage 10", but that is only one of a range of options. Manufacturing can be run to pollute rivers and the air and kill folks, but with some short-term costs manufacturing can be cleaned up. Social media is the same. Just as Shell and BP should be loaded with the cost of carbon hitting the atmosphere, Facebook should be loaded with the cost of destroying civil society.

        • raxxorrax 5 years ago

          Kind of true. The only way I see it is to remove content moderation. Yes, you will find something you don't like. Maybe your kids too, but they are probably more fluent in online spaces anyway.

  • Macross8299 5 years ago

    Yeah, I really have noticed an increase in "Facebook is bad" content from mainstream media, usually blasting them because people use the platform to spread fake news, as people have used every other platform to spread fake news since time immemorial. Where were these hand-wringing articles when Facebook was pioneering the degradation of privacy standards or pushing invasive tracking across the web? I guess those don't threaten the old media business model as much.

    • vharuck 5 years ago

      >Where were these hand-wringing articles when Facebook was pioneering the degradation of privacy standards or pushing invasive tracking across the web?

      Indifference to privacy becoming vogue often doesn't have immediate and noticeable consequences. Many on HN have the experience and technical knowledge to see the consequences, but they're an exception among the general population. But it's easy to notice when a friend or family member repeats a lie they saw on Facebook.

      I'll assume people are reacting to immediate and "obviously bad" consequences. A little more evidence is needed before I go with "hypocritical ulterior motive."

  • raverbashing 5 years ago

    What MSM media?

    Why is Fb not investigating the original sources and propagators of that video?

    Meanwhile the president can appear in any video saying whatever he likes, doctored or not, and it won't matter one bit.

hirundo 5 years ago

A partial distributed solution would be for trusted sites to vouch for a hash of the video file, and the browser to be configured to flag or filter untrusted content.

E.g. NYT, the DNC, Pelosi's own site, etc., would flag a hash of a file as valid. Those sites would be aggregated into a web of trust. Users subscribe to nodes of that web, e.g. I trust reporter A, celebrity B, newspaper C, advocacy group D ... and their trust webs out to a depth of X.
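
A rough sketch of the client-side lookup (node names and data here are hypothetical; this isn't an existing service or protocol):

    # Hash a video file, then walk a small web of trust out to a bounded depth
    # to see whether anyone the user trusts vouches for that exact file.
    import hashlib

    def file_hash(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Each trust node publishes hashes it vouches for and other nodes it trusts.
    TRUST_GRAPH = {
        "nytimes.example": {"vouches": {"abc123"}, "trusts": ["cspan.example"]},
        "cspan.example":   {"vouches": {"def456"}, "trusts": []},
        "pelosi.example":  {"vouches": {"def456"}, "trusts": []},
    }

    def is_vouched(digest, roots, max_depth=2):
        """Breadth-first search outward from the user's chosen trust roots."""
        frontier, seen = list(roots), set(roots)
        for _ in range(max_depth):
            next_frontier = []
            for node in frontier:
                info = TRUST_GRAPH.get(node, {"vouches": set(), "trusts": []})
                if digest in info["vouches"]:
                    return True
                for peer in info["trusts"]:
                    if peer not in seen:
                        seen.add(peer)
                        next_frontier.append(peer)
            frontier = next_frontier
        return False

    # A browser extension might flag any video where
    # is_vouched(file_hash("clip.mp4"), roots=["nytimes.example"]) is False.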

Is there anything like this in the works? Kind of a DNS service for propagating networks of trust ratings?

  • ng12 5 years ago

    Corporate censorship built in to my browser? I'll pass.

  • tqi 5 years ago

    Wouldn't something like this further polarize the web? ie what is to stop Alex Jones from setting up his own node? Or what if the RNC and DNC mark the same article about climate change as fake / not fake, respectively?

  • TimJRobinson 5 years ago

    I'm working on a design of a system that does almost exactly this. I'd also be interested if anyone else is working on it. Where you can not only trust sites but individual people too.

    The core issue I've come across is that it would make cults happen more easily.

  • chillacy 5 years ago

    I’ve thought of it but it doesn’t allow news sites to remix the content into edits. Maybe if you did some sort of content based hashing which is sensitive to playback speed and pitch but not added compression or cropping?

    • codyb 5 years ago

      It’d be interesting if there was a feature which would let you generate hashes from time series. You could specify what second or millisecond and duration and the trust web could provide the hashes for those segments.

      Memoization might help although if it’s to the millisecond maybe not.
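
      A hypothetical sketch of per-segment hashing along those lines (assumes OpenCV; an exact hash over decoded frames only matches bit-identical decodes, so a real system would want a perceptual hash to survive re-encoding):

          # Hash only the decoded frames inside a (start, duration) window, so a
          # trust node could vouch for, say, "seconds 180-185 of the original".
          import cv2
          import hashlib

          def segment_hash(path, start_s, duration_s, size=(64, 64)):
              cap = cv2.VideoCapture(path)
              fps = cap.get(cv2.CAP_PROP_FPS)
              cap.set(cv2.CAP_PROP_POS_FRAMES, int(start_s * fps))
              h = hashlib.sha256()
              for _ in range(int(duration_s * fps)):
                  ok, frame = cap.read()
                  if not ok:
                      break
                  # Crude canonicalisation: downscale and drop colour before hashing.
                  canon = cv2.cvtColor(cv2.resize(frame, size), cv2.COLOR_BGR2GRAY)
                  h.update(canon.tobytes())
              cap.release()
              return h.hexdigest()

          # e.g. segment_hash("pelosi_speech.mp4", start_s=180, duration_s=5)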

gopher2 5 years ago

So, what's the normative suggestion here? All content on user-generated websites should be moderated for truth and anything edited, editorialized, fictional, or not strictly representing reality should be removed?

Or is the idea that only when there is enough public pressure to take down a particularly popular and deceptive video ... then it should be removed?

It seems sort of ridiculous that we should engage in across-the-board censorship of anything mis-representing reality in any way. OTOH removing things only as they become popular enough to go viral and meet some agreed upon definition of "fake" enough seems like it would establish a permanent, weird shouting match with no clear rules.

Or is the idea that we should ban and penalize Facebook specifically? Because they're Facebook. I don't really get that, but okay. We could do that. I don't think it would solve the user-generated content problem or the fake news problem.

We could not allow any website with posts, images and comments that isn't filtered through some sort of sacred guardian of truth/editorial board. That sounds like a pretty locked-down version of the internet.

As some other commenters mentioned, the fact that the President/right-wing seem into this sort of approach and that this video became popular in the first place is a separate, sad issue.

I'd love for the NYTimes to spell out what they're advocating FOR as the solution. If it's just "delete facebook" and read our comments section instead, well-played I guess.

  • PixyMisa 5 years ago

    Farhad Manjoo asked that on Twitter. The response was "Hate speech comes down." which was remarkably unhelpful.

  • codyb 5 years ago

    Maybe there should be a threshold such that things seen by a million or more people could be flagged for a moderator to take note of and tag with certain contextual clues for the next however many viewers ("satire", "unproven", "doctored", "verified", "trustworthy source").

bArray 5 years ago

Firstly, if you watch both videos side by side, the effect isn't that great [1]. I thought she was drunk in the original.

Secondly, do you really want to make Facebook the great filter of political truth? Is this a role you want to give to a private company with its own agenda? Of course, for somebody working at The New York Times this kind of power is a dream scenario, where truth is simply whatever they write.

Criticism of the opinion article itself:

> Facebook decided to keep up a video deliberately and

> maliciously doctored to make it appear as if Speaker Nancy

> Pelosi was drunk or perhaps crazy.

Interesting wording, did Facebook decide to keep the video up or did it not decide to take it down? An assumption of malice is a dangerous place to start.

> The social media giant deemed the video a hoax and demoted

> its distribution, but the half-measure clearly didn’t

> work.

This is the crux of their opinion: only wiping all traces of problematic material from the internet is an acceptable solution; anything else is a half-measure. What if this video was a piece of art, a political statement, a meme? Is there anyone we can trust to make the decision without asserting their own bias?

> “We think it’s important for people to make their own

> informed choice for what to believe,” [..] This is

> ridiculous.

When did a company passing the responsibility of interpretation to the viewers become controversial? Why is it ridiculous to think of content consumers as having the ability to decide what is real and what is not?

> Would a broadcast network air this? Never. Would a

> newspaper publish it? Not without serious repercussions.

> Would a marketing campaign like this ever pass muster?

> False advertising.

Newspapers and serious repercussions? You mean a (comparatively) small monetary fine at most? The biggest consequence of bad journalism I've ever seen was the phone hacking scandal for News of the World, where Rupert Murdoch was basically told to reorganize his assets and make the name "News of the World" disappear [2].

[1] https://www.youtube.com/watch?v=sDOo5nDJwgA

[2] https://en.wikipedia.org/wiki/News_International_phone_hacki...

Macross8299 5 years ago

>so are New York Times articles, because classy journalism looks good on the platform

Interesting that a journalist thinks that Facebook is optimized for what "looks good" rather than what drives eyeballs. Facebook is just giving the people what they want. (To say nothing of how self-congratulatory that sentence is)

  • Despegar 5 years ago

    There's nothing self-congratulatory about that sentence at all. Facebook's algorithm used to promote a certain kind of clickbait until they changed it in favor of real news (which killed a bunch of media startups that were optimizing for Newsfeed traffic). Facebook has specifically courted media organizations with Instant Articles or video for Facebook Watch.

warp_factor 5 years ago

That article is pushing the idea that social networks should be better at policing speech on their platforms.

As said in other comments, pushing for more self-policing is the worst thing that could happen to democracy and free speech. The last thing I want is for a small clique of Silicon Valley tech execs to decide what I should or should not see on my feed.

I see two solutions to this:

- Censorship is coming from an elected government body

- No censorship and we let people decide what is news and what is fake news.

But having Zuck and his team manipulate newsfeed to push a political or social agenda is a terrible idea. This article is a shame.

  • bfdm 5 years ago

    How are people supposed to make that decision if they are not armed with the information needed? If they only see the fake version, how will they know to even question it?

    Truth matters. News distributors are expected to fact-check and should be held accountable when they spread falsehoods.

    Removing provably false manipulated media is not itself a manipulation. It is a correction.

    • ng12 5 years ago

      Agreed, except Facebook is not a news distributor.

      • anonymousab 5 years ago

        They are a de-facto distributor, even if it wasn't their intention to be one.

  • vharuck 5 years ago

    But isn't "self-policing" the failsafe of free speech? Anyone can share their ideas, no matter how dumb or wrong. Except in certain cases, even deception is allowed. Then everyone else decides what is a good idea worth sharing, or just drivel.

    Facebook is a non-government entity. They're "people," too. They can totally decide to stop sharing junk when they notice it. And other people, like the article's author, can ask them to do just that.

StanislavPetrov 5 years ago

Thousands and thousands of videos and pictures are edited and altered to mock famous people and politicians every day, so why is this suddenly an issue? It's astounding to me that anyone would support having overseers like Facebook (and their government partners at the Atlantic Council and elsewhere) deciding what is okay for people to post and view.

neilv 5 years ago

I think part of the problem here is that Facebook doesn't want to be an impartial common carrier, so they have to take responsibility for "content".

Then you have one of the most influential venues (which has taken over much of the use cases of the original open Web and Internet) having to answer to politicians about what speech it should censor.

  • raxxorrax 5 years ago

    You forget that the most influential venues all have business ties to Facebook for visibility reasons. That is a huge problem, since Facebook might indeed be interested in keeping these venues influential and in suppressing people's ability to meet on other platforms. Hard regulation would fortify their dominant position.

    I remember that some years ago papers wrote articles about how they would keep being critical of Facebook despite their cooperation. That was sad to read.

ddffre 5 years ago

I don't see any problem with posting some humor-related content.

merpnderp 5 years ago

“No other media could get away with spreading anything like this...”

Meanwhile several major media sites are claiming the video Trump shared was this same doctored video, when it was merely a montage of regular video clips.

We can’t even begin to have a conversation about censoring private companies without it immediately being used as a political weapon.

DyslexicAtheist 5 years ago

I wonder if it would have taken FB that long to demote the video if it had depicted Sandberg or Zuckerberg slinging racial slurs.

  • tomek_zemla 5 years ago

    That is an interesting question and could be empirically determined with an experiment.

    • Gibbon1 5 years ago

      That's my suggestion.

  • m0meni 5 years ago

    Hate speech and a slowed down video are two very different things.

    • DyslexicAtheist 5 years ago

      slowed down to make the person look ... well, sloooooooooow and senile. The fact is, the campaign was successful in fulfilling its goal: disinformation.

      do we have to discuss if ageism is worse (more evil) than racism or vice versa? who will be the judge? isn't a spade a spade and "toxic" is simply toxic?

      • darkpuma 5 years ago

        Changing two things at once (the target of the slander and the nature of the slander) is bad science.

  • dominotw 5 years ago

    It can be argued that Sandberg isn't a public figure. Very few people know her outside tech/Lean In circles.

hsnewman 5 years ago

Propaganda is not new, especially to fascist-run governments. I, for one, am very concerned about our republic.

  • nailer 5 years ago

    Please don't use Hacker News for political or ideological battle. It destroys intellectual curiosity, which the site exists for.

m0zg 5 years ago

Established, "old media" news sources routinely spread misinformation as well either deliberately made up, or by omitting facts that don't fit the narrative of their owners. News outlets haven't been about "news" for several decades now, it's all about, to quote Chomsky, "manufacturing consent". It's just that now anyone with a webcam and a video editor can do the same thing on social media. Some, in fact, get more viewers than "traditional" media, too.

Tsubasachan 5 years ago

America right now is the best show on television. Emmys for Nancy and Donald.

bayesian_horse 5 years ago

We have to kill Facebook. They simply know too much! [irony/joke]

Proven 5 years ago

Haha that is funny. Gotta watch that video!

To the critics of free speech: I don’t think it will make me think less of Nancy Pelosi - in my book she is already ranked at the very bottom.

williesleg 5 years ago

I believe everything on the internet. It doesn't polarize me either.

davidw 5 years ago

I think several things can be true at the same time:

* It's pretty worrying that our fascist-adjacent president is spreading doctored videos with essentially no repercussions.

* Asking FB/Twitter/etc to police videos could be quite a mess in its own right.

* This is a pressing problem for democracies around the world.

(Note: I don't use the 'f word' lightly. Too many on the left use it freely with "people on the right they don't like". I think a person who loathes the free press, loves dictators, chafes at the idea of the rule of law, and threatens violence against political opponents fits the bill, though).

  • jnbiche 5 years ago

    > Note: I don't use the 'f word' lightly. Too many on the left use it very freely with "people on the right they don't like". I think a person who loathes the free press, loves dictators, chafes at the idea of the rule of law, and threatens violence against political opponents fits the bill, though)

    You forgot "hates an independent judiciary" and "uses office for personal enrichment". Some may quibble with the use of fascist to describe him, but he's very clearly authoritarian and anti-democratic. What has surprised me is the relatively large percentage of my compatriots who are also authoritarian and anti-democratic. All that stuff about the constitution was just lip service.

    • js2 5 years ago

      > What has surprised me is the relatively large percentage of my compatriots who are also authoritarian and anti-democratic. All that stuff about the constitution was just lip service.

      To the extent that the constitution maintains the status quo, it is embraced. To the extent that it upsets existing social order, not so much.

      According to Stenner's theory, there is a certain subset of people who hold latent authoritarian tendencies. These tendencies can be triggered or "activated" by the perception of physical threats or by destabilizing social change, leading those individuals to desire policies and leaders that we might more colloquially call authoritarian.

      ...

      Authoritarians prioritize social order and hierarchies, which bring a sense of control to a chaotic world. Challenges to that order — diversity, influx of outsiders, breakdown of the old order — are experienced as personally threatening because they risk upending the status quo order they equate with basic security.

      https://www.vox.com/2016/3/1/11127424/trump-authoritarianism

    • manfredo 5 years ago

      > Some may quibble with the use of fascist to describe him, but he's very clearly authoritarian and anti-democratic.

      That's pretty strange, seeing as he ran under the party that generally strives for lower government control in our lives - the direct opposite of fascism. And he has lessened government control in a variety of ways. One example (albeit one that is probably unpopular on HN) is deregulating telecoms.

      The claim that Trump is even fascist-aligned is really only defensible if one uses the modern definition of fascist: a person that holds political views one deems offensive or bad.

    • yspeak 5 years ago

      In Comments

      Be kind. Don't be snarky. Comments should get more thoughtful and substantive, not less, as a topic gets more divisive...

      When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."..

      Eschew flamebait. Don't introduce flamewar topics unless you have something genuinely new to say. Avoid unrelated controversies and generic tangents...

      Please don't use Hacker News for political or ideological battle. It destroys intellectual curiosity, which the site exists for.

  • dominotw 5 years ago

    I was watching the Bill Maher show on HBO and there was some older female guest (I don't remember her name), and she was like, maybe we should contact our UAE friends and have them give our president the Khashoggi treatment. Everyone in the room started clapping and cheering. wtf.

    • suby 5 years ago

      I can't really comment on that without seeing the clip myself, but in general, I find that show and others like it not to be conducive to good conversation.

      The audience is always extremely biased, and stops the flow of conversation every other sentence to clap and cheer. People in turn seem like they're just trying to get off a soundbite that gets a rise out of the audience. And then on top of this, the panels are 5 people, which feels like entirely too many.

  • nailer 5 years ago

    Please don't use Hacker News for political or ideological battle. It destroys intellectual curiosity, which the site exists for.

    • davidw 5 years ago

      I generally agree strongly with that, but FB is caught up in this - and social media is in general.

  • drilldrive 5 years ago

    Trump didn't spread the video, his lawyer Giuliani did. Trump pinned a video of spliced moments of Pelosi stumbling on her words: each segment was raw footage.

    • lazugod 5 years ago

      How is pinning the video not spreading it?

      • PixyMisa 5 years ago

        It's a different video. The one Trump retweeted is undoctored footage of Pelosi stumbling over words, because she actually does that.

        • zaroth 5 years ago

          It’s amazing how hard it is to get someone to believe something counter to their baseline assumption.

          People “know” Trump spread the video, even though it’s simple irrefutable fact that he did not.

          The media is often responsible for amplifying and distorting an act to the point where the original act and actor are indistinguishable. Kind of exactly like what happened in the doctored video.

          This is perhaps an incredible example of the pot calling the kettle black. Just yesterday, a TIMES reporter tweeted a fake quote of the President, only admitting he made it up after it went viral, and claiming, “it was plausible”.

          IMO the media does this against Trump pervasively and highly effectively, to the point where it is nearly impossible to distinguish what Trump may have actually said, forget about what he meant.

          I’d say the Charlottesville comments are probably the most famous example of this.

          • makomk 5 years ago

            It's worth thinking, too, about the role this New York Times opinion piece played in convincing people so strongly of something that isn't true that the untruth was upvoted whilst the comment pointing it out is greyed out. They make it sound like the perception she was "drunk or perhaps crazy" was entirely the result of the deliberately and maliciously doctored video. Clearly it wasn't. Anyone who didn't investigate things themselves is left with the false impression that everyone mocking this Pelosi appearance was using a fake doctored video to do so - including the President of the United States - even though the author of this never technically lied.

            • drilldrive 5 years ago

              Currently my comment is at +4, so I wouldn't be so quick to say it is greyed out. Regardless, claims based on vote totals are not useful and distract from discussion: vote totals consider only the subset of the population willing to participate in site-based voting, which pulls from the polemics much more than the moderates. And besides, the structured refutation format of my comment above is ripe for false corrections, in that such comments are frequently only a distortion or a deflection of the truth. People learn to blind themselves to the content of the message over time, furthering the partisan lock on their minds.

          • jccalhoun 5 years ago

            >IMO the media does this against Trump pervasively and highly effectively, to the point where it is nearly impossible to distinguish what Trump may have actually said, forget about what he meant.

            >I’d say the Charlottesville comments are probably the most famous example of this.

            Could you explain why this is an example?

        • mgoetzke 5 years ago

          And he never does :)

          • PixyMisa 5 years ago

            Happy covfefe to you too. :)

      • drilldrive 5 years ago

        You are misreading my comment: there were two major videos of Pelosi that came to the fore recently. The one that Giuliani posted on his Twitter was the one mentioned in the article; its footage and audio were distorted and slowed down. Trump did not post this video. Trump posted the second video, comprising spliced segments of Pelosi speaking, each segment raw footage of the House Speaker.

        • masonic 5 years ago

            the footage and audio was distorted and slowed down
          
          Jimmy Kimmel has done this hundreds of times in his recurring "Drunk Donald Trump" segment over the past 3.5 years, usually with no disclaimer. Should he be censored?

          Broadcast television has much stronger legal standards than Facebook would ever consider, too.

          • plugger 5 years ago

            No, because it's comedy. If people are getting what they consider "news" from late-night comedy, they don't understand the purpose of the show. I'd say the same thing about the Daily Show, Last Week Tonight, etc. They're not news; they're comedy shows that focus on current events. The larger elephant in the room is the news network opinion shows like Hannity and Carlson. These shows open with something to the effect of "this is just my opinion" but then spend the rest of the show speaking as if everything they're saying is fact, when in fact it's opinion and 100% not news.

            • ng12 5 years ago

              Why do we expect people to make this distinction for TV but not for things they read on the internet?

              • plugger 5 years ago

                I don't. I expect to make this distinction no matter where I consume media. If anything the Internet is even worse though as the lines are much less well defined.

          • davidw 5 years ago

            This is why it's kind of tricky. I think most people get that comedy people do one thing and are ok with that as long as it's clearly labeled. Perhaps the president and people who work for him ought to be held to a different standard though?

  • PixyMisa 5 years ago

    Is it a pressing problem, though?

    Google William Randolph Hearst. Or look up the antics in some of the US elections in the early 19th century.

    Also, Trump DID NOT SPREAD THE DOCTORED VIDEO. The video he retweeted was undoctored, merely a collection of clips from C-Span. The reason there were no repercussions is because he did absolutely nothing wrong.

ntetsuo 5 years ago

This whole thing seems retarded (go ahead and knee jerk). To argue that social media should be regulated because ignorant masses may be politically swayed is not “progressive”, it’s just another way to control information.

Barrin92 5 years ago

The most baffling quote from the article, citing Zuckerberg:

>“I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive,” Mr. Zuckerberg said. “But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.”

Ah yes, the very common good faith nazi. You seriously have to wonder if Mark Zuckerberg would have such nonchalant views about speech and misinformation if he didn't have a tail of private security and a fenced in mansion.

What's the Muslim store owner in Myanmar going to do when the mob comes knocking on the door after some lie about him got the local village riled up? The degree to which these people live in an alternate reality isn't even funny any more. Nothing has consequences for Zuckerberg et al and it has distorted their worldview.

  • gfodor 5 years ago

    I'm assuming people railing on Zuck here don't like Zuck, so it's odd to me they would prefer he become the world's arbiter of permitted speech.

    • Barrin92 5 years ago

      I don't prefer he become the arbiter of speech. I'd prefer the government would regulate his business and hold him accountable for the damage that lies and misinformation cause on his platform. Given that he recently called for more regulation, surely he'll agree.

      • chias 5 years ago

        > I don't prefer he become the arbiter of speech. I'd prefer the government would [...] hold him accountable for the damage that lies and misinformation cause on his platform

          If I'm understanding you correctly, you believe that Zuckerberg should be held liable for untruthful content hosted on his platform, but you also believe that he should not be allowed to choose what content is hosted on that platform.

        How do you harmonize those two fundamentally contradictory viewpoints? Or do you feel that the only "correct" way forward is to excise Facebook in its entirety?

        • Barrin92 5 years ago

            I think the platform should be held liable if it causes damage to individuals by spreading false information about them. For example, we could start by handing out fines to platforms if they do not remove damaging or false information in a timely manner.

            I don't think the platform needs to go entirely; I just think it needs an incentive, a financial one for starters, to adhere to standards set by legislators.

          • chias 5 years ago

              In your ideal case, let's say that Facebook identifies some content that it believes is spreading false information about somebody. Should Facebook be allowed to remove it?

            If no: Facebook is being fined for not taking an action that it is not allowed to take, which I can't imagine you are advocating for.

            If yes: Facebook is now the arbiter of what information gets to be seen (on Facebook), which you didn't want.

            A third option, of course, would be to let the government (all governments?) decide when something should be removed from Facebook, but frankly I think this is the worst option of the three.

              I suspect that you meant liability in cases when something is "clearly" untrue, so my comments likely come off as pedantry. In cases like this video it's very easy to make that determination. But when it isn't so clear cut, there will have to be judgement calls, and if we're talking actual liability then someone has to be empowered to make them. Who gets to make these judgement calls? Either Facebook, the US Government, any government including the ones that you don't like, or maybe some supposedly non-biased third party. None of these options sound at all good to me. I don't know what the answer is; I just want to point out that whatever it is, if it even exists, it isn't as simple as that.

            • gfodor 5 years ago

              We all know that when people talk about coercion of removal of online content, they're talking about content they deem worthy of removal.

              You can avoid cognitive dissonance if you assume you have the "right" perspective on how to bucket content into two groups, acceptable and unacceptable, and that the bucketing made by regulators will be identical to yours, since it's clearly the only objectively correct one. Of course, this is so far from reality if such a thing were to be implemented that it's not worth taking seriously.

      • darkpuma 5 years ago

        When Zuckerberg calls for more regulation, he's actually requesting regulatory capture that ensures future competitors to facebook will be fighting an uphill battle.

      • mynameishere 5 years ago

        Man, I'd prefer the government would regulate you. See how easy that is?

        • Barrin92 5 years ago

          I already work in a heavily regulated industry because my job has security implications, I have absolutely no issue with it.

  • dominotw 5 years ago

    Those aren't two consecutive statements. Totally ridiculous to put them together and make it out to be something else.

    Quite an ironic comment in a thread about doctored video.

IlegCowcat 5 years ago

Facebook invades privacy at minimum. It threatens our democracy.

frequentnapper 5 years ago

Why can't Nancy Pelosi sue facebook for $100 billion for allowing a doctored video about her to spread? They can't use the Napster defense of "we're only a platform."

  • Despegar 5 years ago

    Yes they can. Section 230 immunizes them from liability and allows them to moderate (or not) as they please.

  • azangru 5 years ago

    I am confused. Facebook is a person-to-person(s) interaction. If someone wants to show their followers a doctored video, why should Facebook prevent it?

    (The other day, someone on Reddit pointed out that in Russia it’s illegal to repost the Putin gay clown meme [1]. And of course there is the famous Chinese ban on Winnie-the-Pooh images. Is this in any way worse than removing the doctored video of Nancy Pelosi would have been?)

    [1] https://www.washingtonpost.com/news/worldviews/wp/2017/04/05...

    • frequentnapper 5 years ago

      There's a difference between satire and downright slander when viewers can't tell that something is doctored.

      • Macross8299 5 years ago

        An average facebook user being able to ascertain the authenticity of something is a very poor barometer for its authenticity or legality.

        Average FB user probably couldn't tell you if Panama papers were authentic or not, so should those be banned from dissemination as "fake news"? Plenty of people would like to make those go away as slander, I'm sure.

        • sockpuppet999 5 years ago

          Please. Don't fall into the trap of underestimating others' intelligence. Even that typical Facebook user from another state has valid reasons, and their opinions and values are no better than mine (or yours). I'm not trying to argue one bit, okay? But how exactly do you know the Panama papers are legitimate? Do you have an inside source, or did you hear about this issue the same way I did? I agree with you that it fits perfectly into my own world view, and I caution you and myself not to take stuff like that as a golden source of true info. No matter how much I'd like to think it's all true, in this day and age one must be more careful about accepting sources of info as God's honest truth, because there are folks out there who spend their lives, fortunes, and passions just to mislead or misinform allegedly informed people like me (and you).

      • sockpuppet999 5 years ago

        Isn't it up to each viewer to determine the difference? Having some authority telling me what is or isn't real is not a world I'd choose to participate in. Let the user decide for themselves: one person's propaganda == another person's trusted source.

        • frequentnapper 5 years ago

          This is where intent comes in. That's why we have judges and courtrooms and laws against slander and libel. Because those things can do real harm.

  • tacosx 5 years ago

    They have a very strong defense thanks to the CDA and the DMCA, which these large tech platforms wrote. They wrote themselves blanket immunity for nearly any type of content, in any type of context.

    https://www.copyrightcontentplatforms.com/2019/03/the-scope-...

    • rlt 5 years ago

      Facebook and Google didn't even exist when CDA and DMCA were passed.

tarcyanm 5 years ago

The correct way to counter disinformation is with information. This does assume a sufficiently aware populace able to process its own confirmation bias (which is actually a learnable skill). Sometimes it takes incrementally more information and that's okay.

Widespread availability of high-quality media capture and technologies such as facial manipulation will inevitably lead to a future with many, many unverifiable snippets that go viral. This video is unique in that verification is possible, but the future will likely hold far more unverifiable content than verifiable. It is more constructive to expect that humans will gain the ability to weigh disinformation against further information than to expect that benign dictators will censor their way to perfect knowledge.