Pluralistic: 18 Nov 2020

Today's links

Race, surveillance and tech (permalink)

Today, on the Attack Surface Lectures – a series of 8 panels at 8 indie bookstores that Tor Books and I ran to launch the third Little Brother novel in Oct: Race, Surveillance, and Tech with Meredith Whittaker and Malkia Devich-Cyril, hosted by The Booksmith.

You can also watch this without Youtube surveillance on the Internet Archive:

or listen to the audio as an MP3:

Earlier instalments in the series:

I. Politics and Protest (with Eva Galperin and Ron Deibert, hosted by The Strand):

II. Cross-Media Sci-Fi (with Amber Benson and John Rogers, hosted by the Brookline Booksmith):

Here's a master post with all the media as it goes live:

And you can also get this as it's posted on my podcast feed – search for "Cory Doctorow podcast" in your podcatcher or use the RSS:

The Mounties lied about social surveillance (permalink)

When you think of the RCMP, you probably imagine the romantic sight of guys in archaic red brocaded uniforms doing close-order drill on horses while waving Canadian flags.

The reality is that the RCMP is a police force grounded in racial violence and genocide, and one that has not improved noticeably in the century since, adding dirty tricks, antidemocratic political oppression and domestic surveillance to its portfolio.

The RCMP lie about this. It's not just the official lie of good-guy Mounties patrolling the hinterlands for bandits and American gunrunners – it's a string of ongoing, highly specific, contemporary lies about the force's illegal conduct.

In 2019, The Tyee broke the story of "Project Wide Awake," the Mounties' social media surveillance op. On the record, the Mounties denied it, saying they were only using off-the-shelf commercial analytics tools, not spy gear.


18 months later, after an FOI request and a complaint to the Information Commissioner, The Tyee's got 3,000 pages of internal docs on Project Wide Awake, revealing lawless mass surveillance, sweetheart contracts, and state-sponsored hacking.

The RCMP isn't using off-the-shelf commercial analytics tools to watch social media. They're using Babel X, a tool marketed for law enforcement, whose use requires judicial authorisation under Canadian law.

The contracts for Babel X and other spy tools were issued without competitive bidding, on the grounds that the existence of these procurements could compromise the surveillance op.

That may seem anodyne, but consider: the reason the RCMP says it doesn't need a warrant to spy on all Canadians is that it is doing something "ordinary" – that Canadians have no expectation of privacy or due process on social media.

But (as Citizen Lab's Kate Robertson points out) its argument for handing out these fat, no-bid contracts to cybermercenaries in secret is that Canadians don't know this stuff is in use, and if they did, they wouldn't like it and would change their behaviour.

When it comes to warrants, in other words, this stuff is ordinary. When it comes to transparency, this stuff is completely extraordinary. You'd think they'd have to pick one, but this is the Mounties. They always get their self-serving rationalisation.

Project Wide Awake encompasses both social media and "darknet" surveillance and casts a wide net; according to the RCMP's docs, they're looking to scoop up "private communications" including those related to "political protests."

The docs reveal that the Mounties bought a Facebook-hacking tool that lets them see users' private friends lists. The ability to enumerate the private friends of FB users puts Canadians in jeopardy.

For example, the women who are private friends of a shelter can be unmasked by their violent intimate partners. Once the RCMP learned that a tool exists that puts Canadians' safety at risk, they had a duty to report it and help FB close the hole.

Instead, they bought a license from the tool's developer and used it to hack Facebook.

The Mounties knew they were committing crimes. To hide their operations from social media companies, they used global proxies to disguise the origin of their hacking activities.

They also created social media accounts under false identities, acting in secret, without warrants, and against the policies of the social media platforms.

All of this appears to have been controversial within the RCMP. When it was first under discussion, the RCMP's CIO Pierre Perron blasted it, prompting a flurry of memos about his outrage over the program's goals and methods.

Shortly thereafter, Perron quit and went to Huawei (!), bringing with him all his proprietary national security knowledge; Christopher Parsons from Citizen Lab speculates that hiring a top Mountie might have been part of Huawei's charm offensive to win Canadian 5G infrastructure bids.

The controversy didn't end with Perron: in the training docs, Mounties who may have questions about the legality of all this off-the-books spying are advised, "You have zero privacy anyways, get over it."

The Mounties have a long history of authoritarian policing of democratic dissent. During the period of martial law in 1970, the Mounties used the cover of the October Crisis in Quebec to engage in a nationwide wave of burglaries of antiwar, labour and racial justice groups.

The goal of these burglaries was to steal membership lists so that people could be put under surveillance on the basis of their political beliefs.

With that in mind, it can't be a coincidence that the current surveillance op is called Project Wide Awake.

The name comes from an X-Men story arc in which a fascistic police force hunts down mutants and puts them in concentration camps, denying them the most basic of human and civil rights.

Sometimes, it's hard not to say the quiet part out loud, huh?

The Tyee's story by Bryan Carney is a bombshell. For more context, don't miss Cynthia Khoo's thread breaking it down.

Telehealth chickenizes docs (permalink)

The "shitty technology adoption curve" describes the arc of oppressive technology: when you have a manifestly terrible idea, you can't ram it down the throats of rich, powerful people who get to say no. You have to find people whose complaints no one will listen to.

So our worst tech ideas start out with prisoners, asylum seekers and mental patients, spread to children and blue collar workers, and ascend the privilege gradient to the wealthy and powerful as they are normalized and have their roughest corners sanded down.

For example: If you ate your dinner under the unblinking gaze of a networked, remote-monitored video-camera 20 years ago, it was because you were in a supermax prison. Today, it's because you've been unwise enough to buy home cameras from Amazon, Google, or Apple.

I'm skeptical of prediction (fortune tellers are charlatans), but I do believe in leading indicators. If you want a look at your likely techno-oppressive future, just look at how we treat kids, blue-collar workers, and prisoners:

Another important concept: the quantitative fallacy. If you want to do computer work on a complex issue, the qualitative elements are daunting. Say you want to do exposure notification – it's easy to use Bluetooth beacons to tell whether two people are close to each other.

But it's hard to know what's actually going on with the people those beacons represent: are they in adjacent, sealed cars stuck in traffic, or are they college kids attending an eyeball-licking party?
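The quantitative residue this leaves behind can be sketched in a few lines. This is a hypothetical toy model, not any real exposure-notification system's scoring: an encounter is reduced to two numbers (signal strength and duration), and the qualitative context is discarded before scoring ever happens.

```python
# Toy sketch of the quantitative fallacy in beacon-based exposure
# notification. The model sees only signal strength and duration; the
# qualitative context (sealed cars vs. an eyeball-licking party) is gone.

def exposure_score(rssi_dbm: float, minutes: float) -> float:
    """Score an encounter from Bluetooth signal strength and duration.
    Stronger signal (less negative RSSI) and longer contact => higher score."""
    proximity = max(0.0, (rssi_dbm + 90) / 30)  # crude 0..1 proximity proxy
    return proximity * minutes

# Two qualitatively opposite encounters, quantitatively identical:
traffic_jam = exposure_score(rssi_dbm=-60, minutes=15)  # adjacent sealed cars
party = exposure_score(rssi_dbm=-60, minutes=15)        # indoor close contact

assert traffic_jam == party  # the model literally cannot tell them apart
```

The assertion is the point: once the qualitative stuff is incinerated, the computation runs beautifully and answers the wrong question.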

Rather than address chewy, irreducible, hard-to-compute qualitative stuff, quants are prone to just incinerating it, leaving behind a quantitative residue of dubious value, which is nevertheless easy to do computation on.

It's not just contact tracing: think of the urge to reduce fair use (a complex, qualitative doctrine hinging on an artist's intent and the resulting aesthetic effect) into a set of hard and fast rules: if you quote two lines of poetry, you're cool. Three lines? Piracy.
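The bright-line version of fair use fits in one function, which is exactly what makes it seductive and wrong. This is a toy illustration, not actual law:

```python
# Toy illustration of collapsing fair use -- a qualitative doctrine of
# intent and aesthetic effect -- into a bright-line quantitative rule.

def naive_fair_use(lines_quoted: int) -> str:
    """Bright-line rule: two lines of poetry are 'cool', three are 'piracy'."""
    return "fair use" if lines_quoted <= 2 else "piracy"

assert naive_fair_use(2) == "fair use"
assert naive_fair_use(3) == "piracy"
# The rule is trivially computable, but it scores a scathing two-line
# parody and a verbatim two-line lift identically, because intent and
# effect were incinerated on the way in.
```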

This powers Goodhart's Law: "when a measure becomes a target, it ceases to be a good measure." Discarding the qualitative leads us to extend the lives of suffering, terminally ill people even when they beg for release: they're living longer!

Connected to this: Chickenization, a term from the poultry industry, describing the misclassification of workers as contractors, allowing employers to shift all the risk onto workers and all the benefits to themselves.

Think of Uber drivers, paying their own insurance, gas, depreciation, etc, but not being able to set their prices, not being able to decline a fare, not being able to form a union, having no guaranteed minimum wage, no disability benefits and no workplace safety guarantees.

Put it all together: the shitty technology curve, the quantitative fallacy, and chickenization, and what do you get?


As Oliver Kharraz writes in Techcrunch, telemedicine is here to stay, and while there are many ways telemedicine could benefit doctors and patients, that's not the system we're getting.

Instead, we're getting doctors-as-commodities, paid for piecework (chickenization), with outcomes measured in patients-per-hour not long-term health (quantitative fallacy).

Doctors are powerful, wealthy white-collar workers, but the pandemic has replaced their working conditions with those of a Pacific Rim outsource call-center worker (shitty technology adoption curve).

An exploited call-center worker who can only fill in online forms – not change policies, make exceptions, or even relay your dissatisfaction – rarely solves your problem. That's not what the system's for – it's there to neutralize your ability to negatively impact profits.

If they end up helping you, it's incidental to containing the risk you present.

Likewise, shareholder-first telehealth isn't designed to make you well, it's designed to respond to your immediate complaint and get you off the line.

This is medical worst-practice: replacing a personal relationship with a medical professional (one that develops over time and treats you as a whole person) with hastily jotted EHR notes. The literature and the practice of medicine are unequivocal: this doesn't make people well.

Kharraz provides two fixes that are critical to a qualitative, non-shitty, de-chickenized telehealth system:

  • The ability to choose a doc

  • The ability to specify a nearby doc

The fact that these modest goals are out of reach of contemporary telehealth really tells you all you need to know about who it serves.

(Image: Cryteria, CC BY, modified)

Canada's GDPR (permalink)

Yesterday, Canadian Innovation Minister Navdeep Bains introduced the Digital Charter Implementation Act, which proposes a national privacy standard for Canada akin to Europe's GDPR.

The law is complex and will undergo many changes, but its two most salient features are:

I. The right to refuse to have your data collected and used; and

II. The right to have your data deleted if you change your mind.

With stiff penalties for companies that don't comply.

The latter is self-explanatory, but the former is really interesting. Since the early days of packaged software, the tech industry has operated on the basis of a fictional consent: "By being stupid enough to be my customer (open this box, click this link, etc), you agree that I'm allowed to come over to your house, punch your grandmother, make long distance calls, wear your underwear, and eat all the food in your fridge.

"You agreed!"

Once a company decides it can declare that its customers have given consent to non-negotiated, unconscionable contracts, the product-design equilibrium shifts dramatically.

Features that benefit shareholders (but harm customers) get greenlit, with a note to get legal to add more text to the sprawling novella of garbage legalese that no one reads before the new version is released.

Think of the original iPod, a stevejobsian curve-cornered slab of plastic and chrome, stripped of ornamentation in favor of one button, two ports, and a wheel – whose packaging included the world's shittiest zine: its unreadable terms and conditions.

What the GDPR did, and what the new Canadian rule proposes, is that the fiction of consent must be replaced with true consent. If you want to get permission to do one million things with my data, then you have to ask me one million plain-language, separate questions.

There can't be an "Accept All" button. The default has to be "no," and it can only change to "yes" if I manually toggle it. And you can't deny me access for refusing, so your product needs a million contingencies for how it interacts with me.
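The mechanics of that consent regime can be sketched in a few lines. This is a minimal hypothetical model (the purpose names and API are invented for illustration), not any real implementation of the GDPR or the Canadian bill:

```python
# Minimal sketch of explicit, per-purpose consent: every purpose defaults
# to "no", each grant requires its own manual toggle, and there is no
# accept-all shortcut. Refusal cannot be used to deny service.

from dataclasses import dataclass, field

PURPOSES = ["ad_targeting", "location_tracking", "data_resale"]  # hypothetical

@dataclass
class ConsentRecord:
    # Default-deny for every purpose; no bulk acceptance exists.
    grants: dict = field(default_factory=lambda: {p: False for p in PURPOSES})

    def grant(self, purpose: str) -> None:
        if purpose not in self.grants:
            raise ValueError(f"unknown purpose: {purpose}")
        self.grants[purpose] = True  # one manual toggle per purpose

    def may(self, purpose: str) -> bool:
        return self.grants.get(purpose, False)

user = ConsentRecord()
user.grant("location_tracking")

assert user.may("location_tracking")
assert not user.may("ad_targeting")  # never granted, so: no
```

Note what's absent: there is no `grant_all()` and no way to gate service on the record's contents. The product has to carry a working code path for every "no" – which is exactly the design-equilibrium shift described above.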

If you think about this for half a second, you'll realize that its purpose isn't to allow companies to continue producing the kinds of products you can only field if you can maintain the sham of consent. It is to prohibit those products by raising the bar on consent.

It's the state saying, "You tell us that all the shady stuff is undertaken with consent. OK, let's see if anyone actually consents to this. If not, you gotta cut it out."

The idea is to shift the product-design equilibrium: "If we do this terrible thing, we're going to have to add 15 more consent questions to the onboarding process. We predict that 25% of potential users will bail if we do this.

"What's more, we predict that 85% of the customers who do finish onboarding will say no to five or more of these questions, which means an extra year of development time to ensure compliance with their preferences."

That's the real purpose of these explicit consent rules: to annihilate the fiction of consent and expose the underlying reality – no one has ever agreed to these terms and no rational person ever would.

This day in history (permalink)

#15yrsago Brit backpackers take Indian call-centre jobs

#10yrsago Canadian Heritage Minister inadvertently damns his own copyright bill

#5yrsago CEOs are lucky, tall men

#1yrago Podcast: Jeannette Ng Was Right, John W. Campbell Was a Fascist

#1yrago Beyond antitrust: the anti-monopoly movement and what it stands for

Colophon (permalink)

Today's top sources: Evan Kirstel, Slashdot.

Currently writing: My next novel, "The Lost Cause," a post-GND novel about truth and reconciliation. Yesterday's progress: 502 words (85254 total).

Currently reading: The Ministry for the Future, Kim Stanley Robinson

Latest podcast: Someone Comes to Town, Someone Leaves Town (part 23)

Upcoming appearances:

Recent appearances:

Latest book:

This work licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to

Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.

How to get Pluralistic:

Blog (no ads, tracking, or data-collection):

Newsletter (no ads, tracking, or data-collection):

Mastodon (no ads, tracking, or data-collection):

Twitter (mass-scale, unrestricted, third-party surveillance and advertising):

Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):
When life gives you SARS, you make sarsaparilla -Joey "Accordion Guy" DeVilla