Some of privacy’s thorniest questions
Say we “fix” Facebook, making it possible for you to take your data and go to a rival service, one that respects your privacy, pays its taxes, and isn’t bent on enclosing all digital spaces in its pervasive surveillance walled garden.
It’s an idyllic vision, but our problems are just getting started.
When you take a Facebook post with you to a new platform, do you get to take other people’s comments, too? On the one hand, it sure feels like “things people said to me about my stuff” is part of “my data,” but at the same time, “things I said to other people about their stuff” is also “my data.” Do you need to get all your friends’ consent before you can take their comments? What if they’ve left Facebook already? What if they’re dead? What if the comment you want to take with you is from your enemy, who left a comment so exquisitely stupid that you want to make sure it’s preserved for all eternity? Do you need your enemy’s permission to preserve a copy of their insults?
These aren’t just good, chewy questions for privacy advocates — questions smart people have been pondering for a long time — they’re also fast becoming a favored talking point of Big Tech, its shills, simps, and lobbyists.
“You can’t make us give people their own data back,” Big Tech says, “because it’s not their data! It’s data whose title is so entangled that we alone are entitled to control it.”
That’s an awfully convenient argument. But it raises an inconvenient question: If this data is so gnarly that no one can hope to untangle it, how did Big Tech come to take possession of it in the first place?
Big Tech has an answer, of course: You agreed.
Everyone agreed! Everyone clicked “I agree.” Everyone saw the notice that said, “By using this service, you agree… .” Everyone tore into the box that said, “By breaking this seal, you agree… .”
You agreed.
Obviously, that’s not how “agreement” works. I can prove it. For more than a decade, every email I’ve sent has ended with this:
READ CAREFULLY. By reading this email, you agree, on behalf of your employer, to release me from all obligations and waivers arising from any and all NON-NEGOTIATED agreements, licenses, terms-of-service, shrinkwrap, clickwrap, browsewrap, confidentiality, non-disclosure, non-compete and acceptable use policies (“BOGUS AGREEMENTS”) that I have entered into with your employer, its partners, licensors, agents and assigns, in perpetuity, without prejudice to my ongoing rights and privileges. You further represent that you have the authority to release me from any BOGUS AGREEMENTS on behalf of your employer.
Apart from a couple of tight-asses on a mailing list, no one has ever acted as if this were an “agreement.” If I owned a store with a sign under the doormat that read, “By entering this store you agree that I’m allowed to come over to your house, wear your underwear, punch your grandmother, make long-distance calls, and eat all the food in your fridge,” it wouldn’t entitle me to do any of these things.
You didn’t agree to anything Facebook did to you. You didn’t agree to anything in any of those “agreements.”
Indeed, there’s a name for these agreements: “Consent theater.”
Consent has had a K-shaped trajectory: On the upward swing is the movement to strong, explicit consent in personal relationships. On the downward slide is “consent” in the digital realm, where “agreement” has been reduced to anything from “clicking a link” to running in the opposite direction shouting, “No, no, no, I do not agree!”
The pretense of consent was key to the creation of the modern tech industry, and not just when it comes to privacy. Every tech company that finds itself hauled in front of Congress or at the center of a publicity nightmare leads with consent: “Why should we let you install apps of your own choosing? You consented to giving up your right to a competitive marketplace.”
Imagine if we held commercial consent to the standard demanded of our personal and intimate relationships—consent that was only valid if the party giving it understood the full scope of the activity under discussion, only valid if no coercion or pressure was brought to bear on the consenter, and revocable at any time, without penalty.
We don’t really have to imagine it. It’s European law.
The General Data Protection Regulation (GDPR) passed in 2016. It’s a big ol’ gnarly hairball of rules, some good and some bad, and its impact is still being assessed.
One aspect that the industry and EU regulators are still coming to grips with is consent (naturally). In theory, the GDPR demands consent that bears far more resemblance to the intimate, personal kind that campus freshmen attend mandatory workshops on than to the gabby, grabby fine print we’ve been ignoring (to our detriment) since the 1990s.
The GDPR’s rule boils down to this: Before you can collect or retain data from users, you must first obtain their informed, opt-in consent for every use you intend to make of that data.
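To make that rule concrete, here’s a minimal sketch in Python of per-purpose, opt-in, revocable consent as a data structure. It’s purely illustrative (nothing here comes from the regulation’s text or any real platform’s code): every intended use gets its own entry, every entry starts at “no,” and a grant can be withdrawn at any time.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class PurposeConsent:
    purpose: str                        # plain-language description of one specific use
    granted: bool = False               # opt-in: every use defaults to "no"
    granted_at: datetime | None = None


@dataclass
class ConsentRecord:
    user_id: str
    purposes: dict[str, PurposeConsent] = field(default_factory=dict)

    def request(self, purpose: str) -> PurposeConsent:
        # Every intended use gets its own entry, and it starts out denied.
        return self.purposes.setdefault(purpose, PurposeConsent(purpose))

    def grant(self, purpose: str) -> None:
        consent = self.request(purpose)
        consent.granted = True
        consent.granted_at = datetime.now(timezone.utc)

    def withdraw(self, purpose: str) -> None:
        # Withdrawal at any time, without penalty.
        self.request(purpose).granted = False

    def may_use(self, purpose: str) -> bool:
        # No entry, or no explicit grant, means "no."
        entry = self.purposes.get(purpose)
        return bool(entry and entry.granted)


# Hypothetical usage: nothing is permitted until the user explicitly opts in.
record = ConsentRecord(user_id="alice")
record.request("sell browsing history to a data broker")
print(record.may_use("sell browsing history to a data broker"))  # False
record.grant("sell browsing history to a data broker")
print(record.may_use("sell browsing history to a data broker"))  # True, and revocable
```

The data structure is the easy part; the friction comes from the dialog you have to show, and default to “no,” for every one of those entries.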
If you want to collect my data and sell it to a data broker who has 400 customers who plan on making 6,000 uses of it, you are supposed to show me 6,000 dialog boxes (all of which default to “no”) in which you plainly and simply explain each of those 6,000 uses.
You may be rolling your eyes at this. No one is gonna click “no” 6,000 times (and certainly no one is gonna click “yes” 6,000 times). Forcing a website to throw up 6,000 dialog boxes before you can use it will render that website unusable.
Well, yeah, that’s the point. A transaction that only works when the other person doesn’t consent is abuse.
The point of the GDPR is to call tech’s bluff. If you really think all your customers understand and agree to everything you do to them, well, let’s ask them and see if that’s true.
You can’t consent to any use of your data that you don’t understand. You can’t consent to any use of your data you haven’t been informed of. You can’t consent to any use of your data you don’t know about.
How do you get someone’s consent to 6,000 speculative data uses? You don’t. But that means you never did. It means that any business model that required that degree of consent never actually had consent. That’s the point.
And here’s the kicker: The fact that the tech companies made you click “I agree” to all those uses before the GDPR is a tacit admission that they think they need your consent to make those uses. I don’t ask for consent before I take a picture of a cool flower in a park because I don’t need it. I do ask for consent before I include someone’s toddler in that picture because even if I have the legal right to take that picture, I know it’s unethical to do so without consent.
Consent theater is a sociopath’s charter: “Yes, I stabbed you 11 times, but you agreed that I could when you came close enough to read my ‘By reading this sign, you give consent for me to stab you’ sign.”
Companies don’t have good character or bad character; they just have incentives. When product designers, marketers, and finance people gather around a table to plan strategy, the person who pipes up and says, “I don’t think we should do that obviously terrible thing to our users” is easy to shut down. Just point out that doing terrible things has few downsides, provided you “obtain consent” by shoveling a few thousand more words into the “agreement” that no one reads. When the cost of taking people’s data is nil, why not do it all the time, just in case you can figure out a way to “monetize” that data in the future?
One way to understand the GDPR’s consent regime is as an intervention in that product design meeting. The GDPR gives every Jiminy Cricket who objects to abusive practices a rebuttal: “If you want to make five uses of this data, we’ll have to show our users five dialog boxes. We know that we lose a third of our users for each dialog box we make a user click before they get to the product, which means that these five speculative uses you just pulled out of your butt are gonna cost us close to 90% of our users. Do you really think we can make enough money off the remaining 13% to make it worthwhile?”
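The arithmetic behind that rebuttal is just compounding attrition. Using the hypothetical one-third drop-off per dialog from the meeting above (an assumed figure for the scenario, not a measured one), five dialogs leave you with roughly 13% of your users:

```python
# Back-of-the-envelope attrition math for the hypothetical meeting above.
drop_per_dialog = 1 / 3   # assumed: a third of remaining users bail at each consent dialog
dialogs = 5               # one dialog per speculative data use

retained = (1 - drop_per_dialog) ** dialogs
print(f"Users remaining after {dialogs} dialogs: {retained:.0%}")  # ~13%
print(f"Users lost along the way: {1 - retained:.0%}")             # ~87%
```

Each additional speculative use compounds the loss, which is exactly the deterrent the rule is meant to create.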
The GDPR isn’t the EU’s first crack at ending consent theater. The previous iteration required companies to get your permission before dropping a cookie in your browser. That gave rise to an epidemic of idiotic “cookie walls” — dialog boxes you had to click “I agree” on before you could use a site.
This was only possible because the earlier ePrivacy Directive had been watered down by corporate lobbyists, who ensured that strong privacy rights were “balanced” by the ability to trade away those rights by clicking “I agree” once, irrevocably granting “consent” to everything that might be done to you and your data for the rest of time.
The GDPR set out to cure this defect by requiring opt-in dialogs for every data use, and not long after it came into effect, the EU’s top court ruled that prechecked “consent” boxes weren’t gonna do the trick.
Now, a law is only as good as the state’s willingness to enforce it. I’m no lawyer, but I have used Big Tech’s products while in the EU, and there is no way they are in compliance with the GDPR. So far, EU enforcers haven’t held them to account for this.
Think back to Facebook’s anti-data-portability argument: If we let you take other people’s comments on your posts along with the posts themselves, if we let you take your address book and your private messages and all that other stuff that other people created, we will be letting you invade someone else’s privacy. Only we, the platforms, are legitimately able to hold and use and display and share all of this data. We are able to do this because we obtained consent.
They didn’t obtain consent.
Do tech companies need consent to do all that stuff? I’m not talking about mining your data or showing you ads; I’m talking about really obvious, basic functionality. Like, if you use my messenger service to send a message, do I need your consent to deliver it? Do I need your recipient’s consent? If I offer you a drink, you don’t need to ask my consent before you drink it, do you?
That’s the other side of consent theater. Tech companies don’t just seek your blanket consent to keep themselves out of trouble — ironically, they also claim your consent to do normal things that no one has ever asked consent for before, changing the social rules so that you need their permission to do anything.
The tech companies say they can get your blanket consent to do anything and everything just by waving an “I agree” statement under your nose — but if you actually read that “agreement,” you’ll find that if you want to do anything with the company’s data, use its trademarks, copy its source code, or even sue the company, you must first get its explicit, written, negotiated, for-real consent.
The privacy rules among friends (and enemies) are a complicated mix of law and norms. Forwarding someone else’s confidential email is generally a terrible thing to do but not if the “someone” is your boss and the “confidential email” is a vicious act of sexual harassment. Taking your address book with you when you leave Facebook is totally legit—unless one of the entries in that address book is someone you’ve been stalking, and Facebook helpfully kept their details current as they moved around trying to avoid you.
Personal information is complicated. It’s nothing like property. That doesn’t make it not valuable. The most valuable things we have are nothing like property. Take babies. We have a whole bunch of laws about what you can and can’t do with a baby without treating the baby as anyone’s property. We don’t exclude babies from property relations because babies aren’t valuable enough to treat as property. We exclude babies from property relations because they’re too valuable to treat as property.
Figuring out what the rules are for personal information among personal relationships is going to require a lot of work. Consent — real consent — will be a part of it. But some things don’t require consent. I don’t need your consent to think about you. I don’t need your consent to mention you to a mutual friend (except when I do, because of some prior agreement between you and me—it’s complicated).
There’s an old joke from Ireland whose punchline is “If I wanted to get there, I wouldn’t start from here.” If I wanted to figure out what the rules should be for managing information with overlapping claims like address-book entries and message-board posts, I wouldn’t start by first piling up trillions of these inside the silos of privacy-invading consent-theater impresarios with billions of dollars and a longstanding culture of both demolishing consent (when it’s them taking our data) and reifying it (when it’s us taking their data).
But if we’re going to let the platforms continue to treat our data as theirs thanks to the fiction of consent while we figure this stuff out, it feels like a serious mistake to also give them the ultimate say over how we negotiate those rules among ourselves.
Image: Cryteria (modified), https://commons.wikimedia.org/wiki/File:HAL9000.svg