Pluralistic: 23 Jun 2022

A Black baby playing with alphabet blocks; the blocks have creepy staring eyes and XML tags on their faces.

Today's links

A line of kindergartners horsing around in a toddler-sized institutional bathroom, looking into the mirror. Out of the mirror is the glaring eye of HAL 9000 from 2001: A Space Odyssey. The kids' reflection is color-inverted, and their reflected, inverted faces are traced with facial recognition geometry lines.

Daycare apps are insecure surveillance dumpster-fires (permalink)

When my EFF colleague Alexis Hancock signed her baby up for daycare, she was told that she had to download a childcare management app – to monitor and specify "feedings, diaper changes, pictures, activities, and which guardian picked-up/dropped-off the child."

This was during the lockdown, and the app was a way to comply with social distancing and contact tracing rules, but it was also designed to help with "separation anxiety of newly enrolled children and their anxious parents."

Alexis wasn't the only EFFer with a newborn encountering these apps. As digital privacy and security experts, she and her colleagues started to pick apart these apps and to seek dialogue with the companies that made them. They discovered a nightmare of bad security practices, worse privacy practices, and yawning indifference to the digital wellbeing of very small children and their parents.

First of all, there was the matter of account security. When Alexis and co started looking into these apps, they all shared a glaring defect: none of them implemented two-factor authentication, "one of the easiest ways to increase your security."

They contacted Brightwheel, a leading childcare app vendor, who proudly announced that they were rolling out 2FA – and that this would make them the only childcare app to support it. Incredibly, this is true. As Alexis writes for Wired: this is "bullshit."

Assessing whether an app has 2FA is easy: you just have to poke around in the settings. But a more comprehensive look at app security requires a more sophisticated investigation. EFF undertook static analysis and network analysis of childcare apps and turned up some disturbing results.

For example, the Android Tadpoles app exfiltrates a ton of data to Facebook's Graph API, and sends extremely granular device info to a second analytics firm (neither company is mentioned in Tadpoles' privacy policy).

Then there's HiMama, which stores user data in Amazon's cloud – and misleadingly describes that cloud as "suited to run sensitive government applications and is used by over 300 U.S. government agencies, as well as the Navy, Treasury and NASA." None of that applies to the Amazon cloud HiMama actually uses – that description belongs to AWS GovCloud, a completely separate product.

There's an industry-wide gap in disclosing which data is collected and how it is used, and the disclosures these companies do make are misleading or incomplete.

Worse, the companies have been largely indifferent to these problems. In "'We may share the number of diaper changes': A Privacy and Security Analysis of Mobile Child Care Applications," a peer-reviewed paper presented at the 2022 Privacy Enhancing Technologies Symposium, a research team lays these problems out in eye-watering detail:

Writing for EFF, Hancock makes a series of recommendations to the childcare app industry:

  • 2FA for all admins and staff

  • Address known security vulnerabilities in mobile applications

  • Disclose and list any trackers and analytics and how they are used

  • Use hardened cloud server images, and put a process in place to continuously update out-of-date technology on those servers

  • Lock down any public cloud buckets hosting children’s videos and photos

She also strongly recommends implementing end-to-end encryption between schools and parents: "There’s no need for the service itself to view communication being passed between schools and parents."
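The principle is simple: parent and school hold the keys, and the vendor's server relays only ciphertext. Here's a toy sketch of that flow – the cipher construction is hand-rolled purely for illustration (a real deployment would use a vetted protocol such as Signal's or libsodium, and would still have to solve key exchange, assumed here to happen out of band, e.g. via a QR code at enrollment):

```python
# Toy end-to-end encryption sketch: the vendor's server relays opaque
# blobs; only holders of the shared key can read them. Hand-rolled
# HMAC-based stream cipher for illustration only -- never use in production.
import hashlib
import hmac
import secrets

def _keystream(key, nonce, length):
    # Derive a keystream by running HMAC-SHA256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # authenticate
    return nonce + ct + tag

def decrypt(key, blob):
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("message was tampered with in transit")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

# The relay server only ever sees `blob` -- without `key`, it learns
# nothing beyond the message length.
key = secrets.token_bytes(32)
blob = encrypt(key, b"3 diaper changes, napped 12:30-1:15")
assert decrypt(key, blob) == b"3 diaper changes, napped 12:30-1:15"
```

The point of the sketch is architectural, not cryptographic: nothing in the diaper-change pipeline requires the vendor to be able to read the messages it carries.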

The irony here is that all of this is happening in the context of apps, which were sold to us as "curated computing." We were promised that if we ended the practice of software authors providing code to their users, and instead let Apple and Google decide what code we were allowed to run, all the evils of software would go away:

In reality, apps are some of the dirtiest code we use. Muslim call-to-prayer apps harvest their users' data and sell it to ICE and other domestic spy agencies:

Period-tracking apps share their users' sex lives, fertility data, location and other sensitive info with all comers, and will be a bonanza for bounty-hunting forced-birth advocates seeking to turn in people who have abortions for cash rewards:

Why are apps such a consistent dumpster-fire? Well, for one thing, apps have to be built to the app stores' specs, which are billed as imposing rigor on software authors. In reality, the overhead this imposes has driven app makers to use software development kits that sneak privacy-invading data-collection onto users' devices.

Because apps come from "app stores" and not as standalone software, app vendors can "update" their code with new, malicious behaviors and users can't "downgrade" to the earlier, superior versions. For example, when Audacity was taken over by dickheads who announced that the program would soon come with built-in tracking, users responded by announcing that they wouldn't install the new versions, and the company backed down:

That's not how it works for apps. A couple years ago, a trivial app I used to specify Bluetooth priority (so my phone wouldn't connect to my kid's speaker when I walked past her room) was updated to include intrusive adware that popped up ads every time I unlocked the device. Eventually I figured out what was going on and uninstalled the software, but because this was from an app store, I can't roll back to the superior, pre-adware version.

The revelations about bad data-handling in childcare apps are disturbingly predictable. These are the very same bad practices that Elizabeth Warren, Cory Booker and Ron Wyden have raised with mental health apps like Betterhelp and Talkspace:

It's hard to say which is more disturbing: privacy-invading, insecure mental health apps, or privacy-invading, insecure apps that track your toddler's play, sleep, location and diaper changes.

Neither is acceptable.

All of this should be viewed against the backdrop of legislative and regulatory initiatives to force tech giants to give their customers more say over which apps they run, and how. In response, Big Tech companies insist that allowing software developers to directly transact with device owners will expose the public to bad privacy and security practices – claiming, against all evidence, that "mobile" is a synonym for "secure."

One intriguing way out of this mess is by forcing the mobile platforms to fully support Web Apps, or at least to get out of the way of developers who want to give mobile users the tools to make Web Apps fully functional:

A Web App is just what it sounds like: an app that is delivered into your browser, and runs inside of it. The Web App experience could be (but isn't) pretty much identical to installing app store apps: choose your app, click install, grant or refuse permissions, get an icon on your home screen:

But because Web Apps run in browsers, they can be modified by browser plugins – like ad- and tracker-blockers. And because Web Apps are defined by open standards – not by corporate fiat handed down by monopolists whose own products compete with app developers – anyone can make a Web App development toolkit:

Regular software can spy on users and steal their data, too, of course. But turning "programs" into "apps" didn't solve this problem – it just limited users' ability to defend themselves, making them reliant on two companies to decide what protections they deserve.


(Image: Cryteria, CC BY 3.0; German Federal Archives, CC BY-SA 3.0 German; modified)

(Image: EFF, CC BY 3.0)

Hey look at this (permalink)

This day in history (permalink)

#15yrsago Open Source Consortium to regulators: Stop the BBC’s DRM!

#15yrsago Broadcast Treaty wounded and dying!

#15yrsago Podcasting Bruce Sterling's "Hacker Crackdown"

#10yrsago St Colin and the Dragon: great torn paper kids’ comic

#10yrsago EFF joins the defense in Charles Carreon v. The Whole Goddamned Internet

#10yrsago Bruce Sterling interviewed about the New Aesthetic

#10yrsago Top US drug cop can’t tell the difference between marijuana and heroin

#10yrsago Science fiction in Africa

#5yrsago A Chinese vitamin MLM cult is replacing healthcare for poor Ugandans

#5yrsago For the first time, Jeremy Corbyn overtakes Theresa May in UK polls

#5yrsago US Copyright Office recommends sweeping, welcome changes to America’s DRM laws

#5yrsago Crowdfunding a pro-vaccination bus to follow the anti-vaxxer bus

#5yrsago Canada: Trump shows us what happens when “good” politicians demand surveillance powers

#5yrsago A DRM-locked, $400 tea-brewing machine from the Internet of Shit timeline

#5yrsago John Oliver dared a coal exec to sue him, and now he’s being sued

#5yrsago Tumblr is now owned by a phone company, so it’s stopped fighting for Network Neutrality

#5yrsago A history of artist Anish Kapoor and his assholic mission to own the color black

#1yrago Peloton bricks its treadmills

#1yrago Juul's junk science

#1yrago Improving the ACCESS Act: Six ways to make the most important tech law of the legislative season even better

Colophon (permalink)

Currently writing:

  • Some Men Rob You With a Fountain Pen, a Martin Hench noir thriller novel about the prison-tech industry. Friday's progress: 554 words (18464 words total)

  • The Internet Con: How to Seize the Means of Computation, a nonfiction book about interoperability for Verso. Friday's progress: 508 words (15175 words total)

  • Picks and Shovels, a Martin Hench noir thriller about the heroic era of the PC. (92849 words total) – ON PAUSE

  • A Little Brother short story about DIY insulin. PLANNING

  • Vigilant, Little Brother short story about remote invigilation. FIRST DRAFT COMPLETE, WAITING FOR EXPERT REVIEW

  • Moral Hazard, a short story for MIT Tech Review's 12 Tomorrows. FIRST DRAFT COMPLETE, ACCEPTED FOR PUBLICATION

  • Spill, a Little Brother short story about pipeline protests. FINAL DRAFT COMPLETE

  • A post-GND utopian novel, "The Lost Cause." FINISHED

  • A cyberpunk noir thriller novel, "Red Team Blues." FINISHED

Currently reading: Analogia by George Dyson.

Latest podcast: Monopolists Want to Create Human Inkjet Printers

Upcoming appearances:

Recent appearances:

Latest book:

Upcoming books:

  • Chokepoint Capitalism: How to Beat Big Tech, Tame Big Content, and Get Artists Paid, with Rebecca Giblin, nonfiction/business/politics, Beacon Press, September 2022

This work licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to

Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.

How to get Pluralistic:

Blog (no ads, tracking, or data-collection):

Newsletter (no ads, tracking, or data-collection):

Mastodon (no ads, tracking, or data-collection):

Medium (no ads, paywalled):

(Latest Medium column: "Reasonable Agreements: On the Crapification of Literary Contracts")

Twitter (mass-scale, unrestricted, third-party surveillance and advertising):

Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):

"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla