nonwifehaver3 5 years ago

The dragon with its hoard of gold (private user data) is a great description of Facebook and Google's regulatory goals. As Maciej has also put it, "privacy is an essential right — your most intimate moments should be kept strictly between you and Google".

The terror that could be unleashed by their already existing hoard is almost unimaginable. For all the concern about rising authoritarian politics, few have figured out how bad it would be for the future equivalent of the Gestapo or the Red Guard (pick your country) to get hold of a decades-long dump of medical and financial data, metadata on relationships, posting history, browsing and location history, behavioral fingerprints, stylometry, device IDs, reading history, private photos, and on and on. The tech industry's "vow" to not create a Muslim registry for the US government had a touch of absurdity to it, since there are already companies that could plausibly give you a list of almost all Muslims in the US with a few queries, along with far more personal information than any census or registry has ever asked for.

  • simonebrunozzi 5 years ago

    "Brave New World" is about a coercion that you don't see as malevolent; what you are depicting is more in line with "1984".

    I think the "Brave New World" approach is the most likely to happen / is starting to happen / has been happening for a while now.

  • cm2012 5 years ago

    Google and FB don't matter for this stuff, since the government goes straight to the ISPs (which covers incognito mode, etc.)

    • hoseja 5 years ago

      But ISPs have access to much less detailed data. Or do you think encryption is trivially broken by them?
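
      (It isn't, of course.) With TLS on the wire, the ISP sees who you connect to and when, not what you send. A minimal sketch of that split, assuming nothing beyond the Python standard library - the hostname is invented, and a loopback socket stands in for the ISP's tap on the wire:

        # The hostname a user connects to is visible to anyone on the path, because
        # it travels unencrypted in the TLS ClientHello's SNI extension (pre-ECH).
        # We play the ISP by capturing the client's first TLS bytes on a local socket
        # and pulling the server name out of them.
        import socket, ssl, threading

        def read_sni(client_hello):
            """Extract the SNI hostname from a raw TLS ClientHello, if present."""
            # record header: type(1) version(2) length(2); handshake header: type(1) length(3)
            p = 5 + 4
            p += 2 + 32                                            # client_version + random
            sid_len = client_hello[p]; p += 1 + sid_len            # session_id
            cs_len = int.from_bytes(client_hello[p:p+2], "big"); p += 2 + cs_len  # cipher_suites
            cm_len = client_hello[p]; p += 1 + cm_len              # compression_methods
            ext_total = int.from_bytes(client_hello[p:p+2], "big"); p += 2
            end = p + ext_total
            while p + 4 <= end:
                ext_type = int.from_bytes(client_hello[p:p+2], "big")
                ext_len = int.from_bytes(client_hello[p+2:p+4], "big")
                if ext_type == 0:                                  # server_name extension
                    # layout: list_len(2) name_type(1) name_len(2) name
                    name_len = int.from_bytes(client_hello[p+7:p+9], "big")
                    return client_hello[p+9:p+9+name_len].decode()
                p += 4 + ext_len
            return None

        # "ISP": a plain TCP listener that just reads whatever the client sends first.
        listener = socket.create_server(("127.0.0.1", 0))
        port = listener.getsockname()[1]

        def fake_client():
            try:
                raw = socket.create_connection(("127.0.0.1", port), timeout=2)
                ctx = ssl.create_default_context()
                ctx.check_hostname = False
                ctx.verify_mode = ssl.CERT_NONE
                # The handshake will fail (our "server" never answers), which is fine:
                # the ClientHello has already left the machine by then.
                ctx.wrap_socket(raw, server_hostname="embarrassing-diagnosis.example.com")
            except Exception:
                pass

        threading.Thread(target=fake_client, daemon=True).start()
        conn, _ = listener.accept()
        print("observer on the wire sees:", read_sni(conn.recv(4096)))
        # ...but not the URL path, query string, cookies, or page contents.

      So the content is opaque, but the metadata - hostnames, timing, traffic volume - isn't, which is why "much less detailed" is the right way to put it rather than "nothing at all".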

throwanem 5 years ago

"Liberty" might not be completely off the mark. You can get into the habit of looking over your own shoulder all the time, interrogating your every expression and action for what an unsympathetic stranger might make of it and how it might be used against you. And that consciousness doesn't only extend to spaces where you know you're being monitored, because you can't know where and when and how you're being monitored. So you eventually experience it all the time.

  • skybrian 5 years ago

    This isn't intended as a shallow dismissal, but this sounds strikingly similar to the experience of paranoia. Nobody would say that the feeling that the Invisible Other is watching us is a good feeling. But unless something happens, it's also "just" a feeling. Might the feeling be doing more harm than the reality?

    On the other hand, our ancestors would probably wonder at our obsession over the danger of invisible entities getting into our food or water, but germ theory is still real.

    The problem with guarding against invisible dangers, even when they're real, is that they can be hard to distinguish from superstition. It's easy to point at some religious customs and say they are irrational, but is today's folk understanding of nutritional science all that much better? Badly understood science can create especially virulent memes.

    When the harms are hard to demonstrate, privacy disputes sometimes feel similar.

    • throwanem 5 years ago

      It’s funny: to you this reads as paranoia, but if anything I’m drawing on my experience of social anxiety. Which, yes, is irrational, because no one is watching. The point of the discussion at hand is that, in an environment pervaded by automated, recorded invasion of privacy, such feelings may cease to be irrational.

    • 50656E6973 5 years ago

      The harms are not hard to demonstrate, if you study history, or have programming knowledge.

  • Sophistifunk 5 years ago

    Liberty is the best way to describe it. It's not that we're necessarily having any autonomy taken from us right now, but surveillance does increase the power others have over us.

  • m463 5 years ago

    Unfortunately words like "Liberty", "Freedom", "Justice" and others have been ambiguous for so long that they can be twisted on a whim.

    The idea is that two people can be talking about the same thing while each holding different ideas (and plausible deniability) in their heads.

    Even privacy policies have their own form of ambiguous polite euphemisms, such as "advertising" or "telemetry" or "analytics", which would benefit from clarifying translations.

kennethfriedman 5 years ago

Great points here. We definitely don't have the language to discuss these problems currently. "ambient privacy" is a good start.

"I’ve lost something by the fact of being monitored." is very true, but deniers will say: "what's the something?" and we just don't have the language to explain it yet.

  • tobr 5 years ago

    That’s where the comparison to environmental protection is so powerful. For the longest time, if you said “We lose something by exploiting natural resources”, people would have just pointed out that there’s a seemingly infinite supply of unexploited nature left.

    We’ve always been monitored in some situations, but never before around the clock, in the bedroom and the bathroom, at the doctor, etc. There’s no way to know what we lose, when we don’t even know how the information will be used. It is a passive blackmail situation. These companies have compromising information about all of us, and we really don’t have any idea who can or will be able to see it.

    • skybrian 5 years ago

      Perhaps ironically, environmental protection often requires sophisticated monitoring of nature. If nobody is watching, you don't know what's lost.

    • hoseja 5 years ago

      Part of it is of course that moral panics and societal attitudes make every human blackmailable. Maybe it would be nice if we could learn to just not care.

AlphaWeaver 5 years ago

The nature analogy is useful, but it's also worth thinking about where this new problem differs.

Nature is a large interconnected system that, in some cases, has the benefit of being self-correcting. Population control and food chains are examples of this. I worry that, in the case of privacy, there are far fewer natural "self-correcting" aspects. How will this impact our response to this problem?

(My personal prediction is that it will simply make it possible to do more damage before we begin seeking a solution en masse.)

emtel 5 years ago

> The large tech companies point to our willing use of their services as proof that people don’t really care about their privacy. But this is like arguing that inmates are happy to be in jail because they use the prison library.

I used to really enjoy Maciej as a writer, but I've become really bothered by how bad some of his recent arguments are. This is just an amazingly bad analogy. Structurally, it's on the level of the standard libertarian "proof" that income tax is morally equivalent to slavery. Even though there seems to be some analytical validity in both cases, they are bad analogies, because slavery is obviously not the same as income tax and ad tracking networks are obviously not the same as being in prison. This is a cute rhetorical flourish, but nothing more.

The truth is, most people don't care about this stuff. They really just don't. You might want them to, and you might think they would if they had the same understanding of the issues as you do, but as of today, they don't. And I don't see any benefit to the mental gymnastics people go through to avoid accepting this obvious truth.

  • idlewords 5 years ago

    I am glad you used to enjoy my writing!

    I think in this case I may be trying to draw out a narrower point than you think. My argument is that you can't infer consent from how people adapt to circumstances in a situation where they are not given a choice.

    The whole issue of consent in online privacy is fascinating, because I don't think even experts in the field could understand how their data is used (or will be used) enough to give meaningful consent. I certainly couldn't.

    I also agree with you that there is an open empirical question of how much people actually care about this stuff. One way I would like to see it tested is having a legal basis for competitors to sites like Google to give binding privacy guarantees. Then at least we'd be able to put a dollar value (positive or negative) on privacy with a market test.

    • emtel 5 years ago

      > I am glad you used to enjoy my writing!

      And I am sorry for being a bit of a dick!

      > My argument is that you can't infer consent from how people adapt to circumstances in a situation where they are not given a choice.

      I see what you mean, but I don't buy the premise that people don't have a choice, at least not in the same way in which inmates can't choose to leave prison. Technically savvy people can avoid a lot of (if perhaps not all) tracking today, if they care to put in the effort - many don't. The technology for doing this has been productized and is available for purchase to less savvy users, if they choose to buy it - hardly any do.

      I don't mean to suggest that there's a silver bullet that fixes this problem, that you can buy today for $19.99. My point is that when there is real demand for something (e.g. privacy protection), even imperfect solutions succeed in the market and grow in capability over time. Lots of people are attempting to satisfy this demand, duckduckgo being one good example. It looks to me like most of these products are having limited success, which I think is exactly what you'd expect if only a small number of people care about this issue.

  • rtpg 5 years ago

    I think the problem is that, on certain points, you can't tell what people's internal state is from their external acts.

    I may _want_ to be nice to you, but am bad at expressing myself, so end up saying something mean to you. You can then interpret that as me not wanting to be nice to you. But that is in a sense "double counting" the final result of me saying something mean.

    Similarly, people might use these services despite the privacy qualms. It _could_ be that most people don't care. It could be that people care but are accepting the tradeoff. And it could be that people are _forced_ to accept the tradeoff to participate in civil society (or at least that's their impression of things).

    I'm sympathetic to this interpretation because I used to not really care about privacy w/r/t Google in particular, and enjoyed the services I got from putting my data in their platforms. I still use some of them, but over time the privacy thing (or more specifically the advertising thing) has made me feel more and more frustrated. I still use GMail! But I kinda hate that I have to at this point (because I _really need_ emails to be delivered and received consistently).

    Similarly, I used FB despite the privacy concerns up to a point, and only fairly recently finally moved almost all my messenger usage to different platforms. But the day before I did it, I still didn't like the privacy tradeoff! But I liked "being able to talk to friends".

  • jodrellblank 5 years ago

    > you might think they would if they had the same understanding of the issues as you do, but as of today, they don't.

    How many ordinary people had a good understanding of the neurological effects of leaded petrol before it got legislated against?

    How many ordinary people had a good understanding of which furniture padding materials give off noxious fumes when burned, before that was legislated against?

    How many ordinary people understood Thalidomide or Vioxx - or had even heard of them - before they got removed from the market?

    Why is “it has to get to the level where everyone understands it before experts can recommend action” any more valid regarding tech surveillance?

    • emtel 5 years ago

      I never said it did, nor do I think that. I said people don't care.

      If we want legal privacy protections even though people don't care, that's a reasonable thing to want. What I'm reacting against is the claim I see so often which is basically "people actually care about this but they have no choice". I think people care much less, yet have much more choice than is normally claimed.

  • pseudalopex 5 years ago

    Facebook and Google track you even if you don't use their services.
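
    The mechanics are mundane: pages embed third-party scripts, pixels, and like buttons, and every such embed hands the third party your IP address, user agent, and the page you were reading, account or no account. A rough sketch of checking a page for this, using only the standard library (the tracker-domain list is a tiny illustrative sample and the suffix match is deliberately crude):

      # Fetch a page and list embedded resources that point at known tracker hosts.
      import urllib.request
      from html.parser import HTMLParser
      from urllib.parse import urlparse, urljoin

      TRACKER_HOST_SUFFIXES = (
          "facebook.net", "facebook.com", "doubleclick.net",
          "google-analytics.com", "googletagmanager.com",
      )

      class EmbedCollector(HTMLParser):
          def __init__(self):
              super().__init__()
              self.embeds = []

          def handle_starttag(self, tag, attrs):
              if tag in ("script", "img", "iframe"):
                  src = dict(attrs).get("src")
                  if src:
                      self.embeds.append(src)

      def third_party_trackers(page_url):
          """Return (host, src) pairs for embeds on page_url that hit tracker domains."""
          html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
          collector = EmbedCollector()
          collector.feed(html)
          hits = []
          for src in collector.embeds:
              host = urlparse(urljoin(page_url, src)).hostname or ""
              if host.endswith(TRACKER_HOST_SUFFIXES):   # crude, but good enough for a demo
                  hits.append((host, src))
          return hits

      # Hypothetical target; substitute any news or shopping site you like.
      for host, src in third_party_trackers("https://example.com/"):
          print(host, "<-", src)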

kzrdude 5 years ago

More choice is misleading; we need to gain more freedom by raising the bar of what is protected, and this minimum should not be possible to negotiate away.

komali2 5 years ago

I always wondered why Google or Facebook or hell Comcast didn't start just Owning senators - "Mr. Senator, we want to be the fiber contractor for the new Vet building. Here's pages of logs of you attempting to learn how to use the darknet to access child pornography."

Any anti-choice politician whose mistress or daughter has had an abortion, some tech company probably knows about it.

Any anti-LGBT politician that's actually gay, Google, Facebook, hell maybe even Grindr is aware.

Any White Knight with a heinous secret fetish, Pornhub knows.

People were able to map out military bases using smartwatch GPS data. Imagine if you had straight up access to the databases of this information.
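
That mapping trick needed nothing clever, just volume. Here's a toy sketch, with entirely synthetic coordinates, of the kind of grid aggregation involved: bin the fixes coarsely and the densest cells are where the tracker-wearers congregate.

  from collections import Counter
  import random

  random.seed(0)
  # Pretend these are GPS fixes uploaded from fitness trackers. The tight cluster
  # around (34.25, 64.10) stands in for a facility that appears on no public map;
  # the rest is background noise spread over a 1-degree square.
  fixes = [(34.25 + random.gauss(0, 0.002), 64.10 + random.gauss(0, 0.002)) for _ in range(500)]
  fixes += [(random.uniform(34.0, 35.0), random.uniform(64.0, 65.0)) for _ in range(500)]

  CELL = 0.01  # grid resolution in degrees
  grid = Counter((round(lat / CELL) * CELL, round(lon / CELL) * CELL) for lat, lon in fixes)

  for cell, count in grid.most_common(3):
      print("hotspot near", cell, "with", count, "fixes")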

I guess individuals at these companies are caught within rigid corporate power structures? Maybe the internal tooling prevents that sort of thing (it didn't prevent someone from deactivating Trump's Twitter account that one time), or the rule of law in the USA is still strong enough that the risk of being annihilated by the legal system is too great.

Still, I bet the temptation is strong.

  • nostrademons 5 years ago

    It's unprofitable. The cost to user trust (and hence future data collection and revenues off that data) is more than they stand to gain from blackmailing any one person - even a U.S. Senator or President, or Chinese Premier. So they don't. The only thing that would potentially justify this from a corporate strategy perspective would be an existential threat. The politicians in question know this, and so they don't bother trying to get rid of Google or Facebook, only rein them in enough to please their constituents.

    When I was working at Google (which was close to a decade ago now, before tech overreach became a household buzzword), I'd say "People regularly underestimate how much data Google has on them and overestimate how interesting they are." Everybody's first fear is that Google's going to look up their search history and blackmail them with their porn fetishes. They never stop to think that their porn fetishes, no matter how hardcore, are boring, and shared by millions of other people. As a Google engineer you get desensitized very quickly to the fact that 10% of search queries are porn-seeking (17% on mobile), and looking at other people's kinky pastimes is about the very last thing you would want to be doing.

    Similarly, it's significantly more profitable to advertise to people than it is to kill them. Every dead person is one less potential customer in the global economy.

    • meruru 5 years ago

      This is a point I've made repeatedly on discussions about the value of privacy. It's not about protecting the fact you're gay or have some weird fetish or cheated on your wife or whatever. That stuff doesn't matter in the slightest. The important thing is that the establishment shouldn't be able to quickly pinpoint and disable every potential whistleblower and every other kind of threat to the establishment itself.

      • TheSpiceIsLife 5 years ago

        I’d argue both things are a concern.

        Corporo-government overreach is a concern for obvious reasons.

        But it’s also concerning that any individual rogue employee could have malicious intent toward you for no other reason than you have something you might not want publicly known.

        • nostrademons 5 years ago

          Outside of certain highly privileged roles, like high-level SREs (whose job description requires them to have root access on the boxes), individual rogue employees do not have this power. There are various access controls that prevent individual engineers from looking up data by PII, other than for their own corporate GMail account or accounts they've been specifically authorized to look into by the account holder (usually for customer support reasons). In general, engineers are only running aggregate analysis on a large number of anonymized records, and the logging & user info services enforce this.
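
          To give a feel for the shape of that policy (this is a toy sketch, not Google's actual system): aggregate analysis over anonymized records is the default path, and per-user lookups require an explicit grant tied to the account holder.

            class AccessDenied(Exception):
                pass

            class LogStore:
                def __init__(self, records, grants):
                    self._records = records    # [{"user": ..., "event": ...}, ...]
                    self._grants = grants      # {user_id: {engineer_id, ...}}

                def aggregate(self, metric_fn):
                    """Aggregate queries over the anonymized event stream are allowed."""
                    return metric_fn(r["event"] for r in self._records)

                def lookup_user(self, engineer_id, user_id):
                    """Per-user lookups need an explicit grant from the account holder."""
                    if engineer_id not in self._grants.get(user_id, set()):
                        raise AccessDenied(engineer_id + " has no grant for " + user_id)
                    return [r for r in self._records if r["user"] == user_id]

            store = LogStore(
                records=[{"user": "u1", "event": "search"}, {"user": "u2", "event": "search"}],
                grants={"u1": {"support_eng_7"}},   # e.g. an open customer-support case
            )
            print(store.aggregate(lambda events: sum(1 for _ in events)))   # fine: 2
            print(store.lookup_user("support_eng_7", "u1"))                 # fine: authorized
            # store.lookup_user("random_eng", "u2") would raise AccessDenied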

          Like I said, individual people are not interesting to Google.

        • meruru 5 years ago

          Sure, but one is comparatively minor vs the biggest threat to democracy and human freedom.

          • TheSpiceIsLife 5 years ago

            That's a good point.

            It could be argued that, for a person being blackmailed right now, one is a very real present threat, while the other is a hypothetical future.

            Fortunately there are enough of us to go round, so we can collectively advocate for protections against both possibilities.

    • nonwifehaver3 5 years ago

      In other words, as long as nobody ever puts ideology, religion, or just power over money, we don't have to worry about what's going to happen with the data.

  • idlewords 5 years ago

    The answer to that is kind of boring and simple—because everyone involved would go to jail for a very long time. No employee would do this to benefit a corporation, because it's not the corporation that is going to be doing a decade or two of Federal time as a result of being caught.

    The problem with trying to extort Congress is not just that they have subpoena and investigatory power, but that the crime is inherently political and will (quite rightly) bring the hammer down on you from the entire apparatus of the state.

    • meruru 5 years ago

      That does not apply to intelligence agencies. They are probably able to wield that kind of power against corporations.

  • skybrian 5 years ago

    Someone tried that on Jeff Bezos and it didn't work out so well. Why would someone comfortably in a position of power take that sort of risk, given the likelihood of it backfiring? For the movie plot to work, they need a good motive.

  • YUMad 5 years ago

    How do you know they haven't?

  • dv_dt 5 years ago

    What can Google or Facebook do that Verizon, Comcast, or other traditional telecoms couldn't have done already, as the entities that own the physical mobile and broadband equipment the data flows over on its way to Google or Facebook?

    • glangdale 5 years ago

      End-to-end encryption isn't perfect, but it would at least make it considerably harder to track people on someone else's service - especially as compared to the owner of the service. Plus, Google or Facebook can cross-correlate tons of different users' behavior rather than narrowly spying on one single link (or several links, but still).
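
      To make the cross-correlation point concrete, here's a toy sketch with invented users and events: an observer on one link sees a slice of one subscriber's (increasingly encrypted) traffic, while the platform that runs the services can join application-level events across products, for everyone at once.

        from collections import defaultdict

        search_log = [    # events from a hypothetical search product
            ("user_17", "query", "divorce lawyer near me"),
            ("user_23", "query", "flights to Berlin"),
        ]
        location_log = [  # events from a hypothetical maps product
            ("user_17", "visit", "family court, 9:14am"),
            ("user_23", "visit", "airport, 6:02am"),
        ]

        profiles = defaultdict(list)
        for log in (search_log, location_log):
            for user, kind, detail in log:
                profiles[user].append((kind, detail))

        for user, events in profiles.items():
            # a cross-product profile no single-link observer could assemble
            print(user, events)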

  • hoseja 5 years ago

    That's stepping on the CIA's turf, and I'd wager the CIA still has the bigger stick when it comes down to the physical world.

  • weq 5 years ago

    This is how capitalism works. At some point, companies need to buy laws to enforce their growth strategy. This is exactly what Google and Facebook are doing right now. Buying laws. And if people don't think stuff like this is part of negotiating, they are not even having the same conversation.

    Real laws get created over lunch. They don't come out of steering committees.

  • noob_slayer 5 years ago

    Where in the USA is rule of law strong?

    • bliteben 5 years ago

      Everywhere. Have you personally ever had to bribe a state or federal employee? Justice might not always be equal or fair (and these are subjective), but the rule of law is strong in the US.

burlesona 5 years ago

Submitted this previously: https://news.ycombinator.com/item?id=20188689

How does HN de-dupe stuff? In the past when I’ve submitted something that was a dupe it took me to the already posted link instead of making a new one. Curious why that didn’t happen here.

Really good article, gave me a lot to think about. I think the nature metaphor is a good one, though it doesn’t fill me with optimism for the future of ambient privacy.

  • dang 5 years ago

    This is in the FAQ: we don't count submissions as dupes if the story hasn't had significant attention yet. In other words, reposts are ok until the article has had a significant discussion, or at least significant upvotes. This is to give good articles more than one chance at getting attention, since it can be pretty random which articles get noticed on /newest.

    In this case, we actually boosted the submission into the second-chance pool (see https://news.ycombinator.com/item?id=11662380), which lobbed it on the front page. The reason we picked this post rather than yours is that it was the first submission of the article. (It doesn't look like it right now, because the timestamp was adjusted to be the resubmission time, but you can tell from the ID in the URL.) In the future we want to do some sort of karma sharing so multiple submitters can get credit, but that's not implemented yet.

    • tobr 5 years ago

      If I may, I would like to request a little more transparency when this happens, as it can be confusing. For example, my comment elsewhere in the thread was written days ago, and I only happened to notice that it had bubbled back up with a modified timestamp. Someone who replies to a comment that appears to have been posted mere hours ago might expect to get a reply.

      • dang 5 years ago

        I'm not sure how to do it in any way that wouldn't be too complicated. The status quo leads to confusion, which is bad, but other things we've tried have led to more, or to a lot of distracting meta comments.

      • floatingatoll 5 years ago

        Does your comment show in the correct time order in the timeline ("mere hours ago") in your personal comment feed here:

        https://news.ycombinator.com/threads?id=tobr

        Or does it show in the original ("days ago") timeline location way further down than "a few hours ago"?

    • burlesona 5 years ago

      Oh, very cool. That makes total sense, and I agree it’s nice to give articles another chance to get up the page. Thanks for letting me know!