Today's links
- Supreme Court saves artists from AI: Just because you're on their side, it doesn't mean they're on your side.
- Hey look at this: Delights to delectate.
- Object permanence: KKK x D&D; Martian creativity; Scott Walker's capital ringers; UK v adblocking; Shitty jihadi opsec.
- Upcoming appearances: Where to find me.
- Recent appearances: Where I've been.
- Latest books: You keep readin' em, I'll keep writin' 'em.
- Upcoming books: Like I said, I'll keep writin' 'em.
- Colophon: All the rest.
Supreme Court saves artists from AI (permalink)
The Supreme Court has just turned down a petition to hear an appeal in a case that held that AI works can't be copyrighted. By turning down the appeal, the Supreme Court took a massively consequential step to protect creative workers' interests:
https://www.theverge.com/policy/887678/supreme-court-ai-art-copyright
At the core of the dispute is a bedrock principle of copyright law: that copyright is for humans, and humans alone. In legal/technical terms, "copyright inheres at the moment of fixation of a work of human creativity." Most people – even people who work with copyright every day – have not heard it put in those terms. Nevertheless, it is the foundation of international copyright law, and of copyright in the USA.
Here's what it means, in plain English:
a) When a human being,
b) does something creative; and
c) that creative act results in a physical record; then
d) a new copyright springs into existence.
For d) to happen, a), b) and c) all have to happen first, and each of these three requirements has been hotly contested over the years. Remember the "monkey selfie," in which a photographer argued that he was entitled to the copyright after a monkey pointed a camera at itself and pressed the shutter button? That image was not copyrightable, because the monkey was a monkey, not a human, and copyright is only for humans:
https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute
Then there's b), "doing something creative." Copyright only applies to creative work, not to mere labor. It doesn't matter how hard you labor over a piece of "IP" – if that work isn't creative, there's no copyright. For example, you can spend a fortune creating a phone directory, and you will get no copyright in the resulting work, meaning anyone can copy and sell it:
https://en.wikipedia.org/wiki/Feist_Publications,_Inc._v._Rural_Telephone_Service_Co.
If you mix a little creative labor with the hard work, you can get a little copyright. A directory of "all the phone numbers for cool people" can get a "thin" copyright over the arrangement of facts, but such a copyright still leaves space for competitors to make many uses of that work without your permission:
https://pluralistic.net/2021/08/14/angels-and-demons/#owning-culture
Finally, there's c): copyright is for tangible things, not intangibles. Part of the reason choreographers created a notation system for dance moves is that the moves themselves aren't copyrightable:
https://en.wikipedia.org/wiki/Dance_notation
The non-copyrightability of movement is (partly) why the noted sex-pest and millionaire grifter Bikram Choudhury was blocked from claiming copyright on ancient yoga poses (the other reason is that they are ancient!):
https://en.wikipedia.org/wiki/Copyright_claims_on_Bikram_Yoga
Now, AI-generated works are certainly tangible (any AI output necessarily exists as data fixed on digital storage media). The prompts for an AI output can be creative and thus copyrightable (in the same way that notes to a writers' room or from an art director are). But the output from the AI cannot be copyrighted, because it is not a work of human authorship.
This has been the position of the US Copyright Office from the start, when AI prompters started sending in AI-generated works and seeking to register copyrights in them. Stephen Thaler, a computer scientist who had prompted an image generator to produce a bitmap, kept appealing the Copyright Office's decision, seemingly without regard to the plain facts of the case and the well-established limits of copyright. By attempting to appeal his case all the way to the Supreme Court, Thaler has done every human artist a huge favor: his weak, ill-conceived case was easy for the Supreme Court to reject, and in so doing, the court has cemented the non-copyrightability of AI works in America.
You may have heard the saying, "Hard cases make bad law." Sometimes, there are edge-cases where following the law would result in a bad outcome (think of a Fourth Amendment challenge to an illegal search that lets a murderer go free). In these cases, judges are tempted to interpret the law in ways that distort its principles, and in so doing, create a bad precedent (the evidence from a bad search is permitted, and so cops stop bothering to get a warrant before searching people).
This is one of the rare instances in which a bad case made good law. Thaler's case wasn't even close – it was an absolute loser from the jump. Normally, plaintiffs give up after being shot down by an agency like the Copyright Office or by a lower court. But not Thaler – he stuck with it all the way to the highest court in the land, bringing clarity to an issue that might have otherwise remained blurry and ill-defined for years.
This is wonderful news for creative workers. It means that our bosses must pay humans to do work if they want to be granted copyright on the things they want to sell. The more that humans are involved in the creation of a work, the stronger the copyright on that work becomes – which means that the less a human contributes to a creative work, the harder it will be to prevent others from simply taking it and selling it or giving it away.
This is so important. Our bosses do not want to pay us. When our bosses sue AI companies, it's not because they want to make sure we get paid.
The many pending lawsuits – from news organizations like the New York Times, wholesalers like Getty Images, and entertainment empires like Disney – all seek to establish that training an AI model is a copyright infringement. This is wrong as a technical matter: copyright clearly permits making transient copies of published works for the purpose of factual analysis (otherwise every search engine would be illegal). Copyright also permits performing mathematical analysis on those transient copies. Finally, copyright permits the publication of literary works (including software programs) that embed facts about copyrighted works – even billions of works:
https://pluralistic.net/2023/09/17/how-to-think-about-scraping/
Sure, you can infringe copyright with an AI model – say, by prompting it to produce infringing images. But the mere fact that a technology can be used to infringe copyright doesn't make the technology itself infringing (otherwise every printing press, camera, and computer would be illegal):
https://en.wikipedia.org/wiki/Sony_Corp._of_America_v._Universal_City_Studios,_Inc.
Of course, the fact that copyright currently permits training models doesn't mean that it must. Copyright didn't come down from a mountain on two stone tablets. It's just a law, and laws can be amended. I think that amending copyright to ban training a model would inflict substantial collateral damage on everything from search engines to scholarship, but perhaps you disagree. Maybe you think that you could wordsmith a new copyright law that bans training without whacking a bunch of socially beneficial activities.
Even if that's so, it still wouldn't help artists.
To understand why, consider Universal and Disney's lawsuit against Midjourney. The day that lawsuit dropped, I got a press release from the RIAA, signed by its CEO, Mitch Glazier. Here's how it began:
There is a clear path forward through partnerships that both further AI innovation and foster human artistry. Unfortunately, some bad actors – like Midjourney – see only a zero-sum, winner-take-all game.
The RIAA represents record labels, not film studios, but thanks to vertical integration, the big film studios are also the big record labels. That's why the RIAA alerted the press to its position on this suit.
There are two important things to note about the RIAA press release: how it opened and how it closed. It opens by stating that the companies involved want "partnerships" with AI companies. In other words, if they establish that they have the right to control training on their archives, they won't use that right to prevent the creation of AI models that compete with creative workers. Rather, they will use that right to get paid when those models are created.
Expanding copyright to cover models isn't about preventing generative AI technologies – it's about ensuring that these technologies are licensed by incumbent media companies. This licensure would ensure that media companies would get paid for training, but it would also let them set the terms on which the resulting models were used. The studios could demand that AI companies put "guardrails" on the resulting models to stop them from being used to output things that might compete with the studios' own products.
That's what the opening of this press release signifies, but to really understand its true meaning, you have to look at the closing of the release: the signature at the bottom, "Mitch Glazier, CEO, RIAA."
Who is Mitch Glazier? Well, he used to be a Congressional staffer. He was the guy responsible for sneaking a clause into an unrelated bill that repealed "termination of transfer" for musicians. "Termination" is a part of copyright law that lets creators take back their rights after 35 years, even if they originally signed a contract for a "perpetual license."
Under termination, all kinds of creative workers who got royally screwed at the start of their careers were able to get their copyrights back and re-sell them. The primary beneficiaries of termination are musicians, who signed notoriously shitty contracts in the 1950s-1980s:
https://pluralistic.net/2021/09/26/take-it-back/
When Mitch Glazier snuck a termination-destroying clause into legislation, he set the stage for the poorest, most abused, most admired musicians in recording history to lose access to money that let them buy a couple bags of groceries and make the rent. He condemned these beloved musicians to poverty.
What happened next is something of a Smurfs Family Christmas miracle. Musicians were so outraged by this ripoff, and their fans were so outraged on their behalf, that Congress convened a special session solely to repeal the clause that Mitch Glazier tricked them into voting for. Shortly thereafter, Glazier was out of Congress:
https://en.wikipedia.org/wiki/Mitch_Glazier
But this story has a happy ending for Glazier, too – he might have been out of his government job, but he had a new gig, as CEO of the Recording Industry Association of America, where he earns more than $1.3 million/year to carry on the work he did in Congress – serving the interests of the record labels:
https://projects.propublica.org/nonprofits/organizations/131669037
Mitch Glazier serves the interests of the labels, not musicians. He can't serve both interests, because every dime a musician takes home is a dime that the labels don't get to realize as profits. Labels and musicians are class enemies. The fact that many musicians are on the labels' side when they sue AI companies does not mean that the labels are on the musicians' side.
What will the media companies do if they win their lawsuits? Glazier gives us the answer in the opening sentence of his press release: they will create "partnerships" with AI companies to train models on the work we produce.
This is the lesson of the past 40 years of copyright expansion. For 40 years, we have expanded copyright in every way: copyright lasts longer, covers more works, prohibits more uses without licenses, establishes higher penalties, and makes it easier to win those penalties.
Today, the media industry is larger and more profitable than at any time, and the share of those profits that artists take home is smaller than ever.
How has the expansion of copyright led to media companies getting richer and artists getting poorer? That's the question that Rebecca Giblin and I answer in our 2022 book Chokepoint Capitalism. In a nutshell: in a world of five publishers, four studios, three labels, two app companies and one company that controls all ebooks and audiobooks, giving a creative worker more copyright is like giving your bullied kid extra lunch money. It doesn't matter how much lunch money you give that kid – the bullies will take it all, and the kid will go hungry:
https://pluralistic.net/2022/08/21/what-is-chokepoint-capitalism/
Indeed, if you keep giving that kid more lunch money, the bullies will eventually have enough dough that they'll hire a fancy ad-agency to blitz the world with a campaign insisting that our schoolkids are all going hungry and need even more lunch money (they'll take that money, too).
When Mitch Glazier – who got a $1m+/year job for the labels after attempting to pauperize musicians – writes on behalf of Disney in support of a copyright suit to establish that copyright prevents training a model without a license, he's not defending creative workers. Disney, after all, is the company that takes the position that if it buys another company, like Lucasfilm or Fox, it only acquires the right to use the works we made for those companies, but not the obligation to pay us when it does:
https://pluralistic.net/2021/04/29/writers-must-be-paid/#pay-the-writer
If a new, unambiguous copyright over model training comes into existence – whether through a court precedent or a new law – then all our contracts will be amended to non-negotiably require us to assign that right to our bosses. And our bosses will enter into "partnerships" to train models on our works. And those models will exist for one purpose: to let them create works without paying us.
The market concentration that lets our bosses dictate terms to us is getting much worse, and it's only speeding up. Getty Images – who sued Stability AI over image generation – is merging with Shutterstock:
https://globalcompetitionreview.com/gcr-usa/article/photographers-alarmed-gettyshutterstock-merger
And Paramount is merging with Warners:
https://pluralistic.net/2026/02/28/golden-mean/#reality-based-community
This is where this new Supreme Court action comes in. A new copyright that covers training is just one more thing these increasingly powerful members of this increasingly incestuous cartel can force us to sign away. That new copyright isn't something for us to bargain with, it's something we'll bargain away.
But the fact that the works that a model produces are automatically in the public domain is something we can't bargain away. It's a legal fact, not a legal right. It means that the more humans there are involved in the creation of a final work, the more copyrightable that work is.
Media bosses love AI because it dangles the tantalizing possibility of running a business without ego-shattering confrontations with creative workers who know how to do things. It's the solipsistic fantasy of a world without workers, in which a media boss conceives of a "product," prompts a sycophantic AI, and receives an item that's ready for sale:
https://pluralistic.net/2026/01/05/fisher-price-steering-wheel/#billionaire-solipsism
Many bosses know this isn't within reach. They imagine that they'll get the AI to shit out a script and then pay a writer on the cheap to "polish" it. They think they'll get an AI to shit out a motion sequence, a still, or a 3D model and then pay a human artist pennies to put the "final touches" on it. But the Copyright Office's position is that only those human contributions are eligible for a copyright: a few editorial changes, a few pixels or vectors rearranged. Everything else is in the public domain.
Here's the cool part: the only thing our bosses hate more than paying us is when other people take their stuff without paying for it. To achieve the kind of control they demand, they will have to pay us to make creative works.
What's more, the fact that AI-generated works are in the public domain leaves a lot of uses that don't harm creative workers intact. You can amuse yourself and your friends with all the AI slop you can generate; the fact that it's not copyrightable doesn't matter to that use. I happen to think AI "art" is shit, but you do you:
https://pluralistic.net/2024/05/13/spooky-action-at-a-close-up/#invisible-hand
This also means that if you're a writer who likes to brainstorm with a chatbot as you develop an idea, that's fine, so long as the AI's words don't end up in the final product. Creative workers already assemble "mood boards" and clippings for inspiration – so long as these aren't incorporated into the final work, that's fine.
That's just what the Hollywood writers bargained for in their historic strike over AI. They retained the right to use AI if they wanted to, but their bosses couldn't force them to:
https://pluralistic.net/2023/10/01/how-the-writers-guild-sunk-ais-ship/
The Writers Guild were able to bargain with the heavily concentrated studios because they are organized in a union. Not just any union, either: the Writers Guild (along with the other Hollywood unions) are able to undertake "sectoral bargaining" – that's when a union can negotiate a contract with all the employers in a sector at once.
Sectoral bargaining was once the standard for labor relations, but it was outlawed in the 1947 Taft-Hartley Act, which clawed back many of the important labor rights established with the New Deal's National Labor Relations Act. To get Taft-Hartley through Congress, its authors had to compromise by grandfathering in the powerful Hollywood unions, who retained their right to sectoral bargaining. More than 75 years later, that sectoral bargaining right is still protecting those workers.
Our bosses tell us that we should side with them in demanding a new law: a copyright law that covers training an AI model. The mere fact that our bosses want this should set off alarm bells. Just because we're on their side, it doesn't mean they're on our side. They are not.
If we're going to use our muscle to fight for a new law, let it be a sectoral bargaining law – one that covers all workers. You can tell that this would be good for us because our bosses would hate it, and every other worker in America would love it. The Writers Guild used sectoral bargaining to achieve something that 40 years of copyright expansion failed at: it made creative workers richer, rather than giving us another way to be angry about how our work is being used.
(Image: Cryteria, CC BY 3.0, modified)
Hey look at this (permalink)

- Preface to Designing Secure Software: A Guide for Developers https://designingsecuresoftware.com/text/ch0-preface/
- Publish Your Threat Models! https://arxiv.org/pdf/2511.08295
- What You Won’t See at the Live Nation–Ticketmaster Trial https://prospect.org/2026/03/02/justice-department-live-nation-ticketmaster-antitrust-trial/
- Why I'm running to be Director General of the BBC https://www.absurdintelligence.com/why-im-running-to-be-director-general-of-the-bbc/
Object permanence (permalink)
#20yrsago Cornell University harasses maker of Cornell blog https://web.archive.org/web/20060621110535/http://cornell.elliottback.com/archives/2006/03/02/cornell-university-nastygram/
#15yrsago Explaining creativity to a Martian https://locusmag.com/feature/cory-doctorow-explaining-creativity-to-a-martian/
#15yrsago Scott Walker smuggles ringers into the capital for the legislative session https://www.theawl.com/2011/03/in-madison-scott-walker-packed-his-budget-address-with-ringers/
#15yrsago Measuring radio’s penetration in 1936 https://www.flickr.com/photos/70118259@N00/albums/72157626051208969/with/5490099786
#10yrsago Rube Goldberg musical instrument that runs on 2,000 steel ball-bearings https://www.youtube.com/watch?v=IvUU8joBb1Q
#10yrsago KKK vs D&D: the surprising, high fantasy vocabulary of racism https://en.wikipedia.org/wiki/Ku_Klux_Klan_titles_and_vocabulary
#10yrsago UK minister compares adblocking to piracy, promises action https://www.theguardian.com/media/2016/mar/02/adblocking-protection-racket-john-whittingdale
#10yrsago Some ad-blockers are tracking you, shaking down publishers, and showing you ads https://www.wired.com/2016/03/heres-how-that-adblocker-youre-using-makes-money/
#10yrsago ISIS opsec: jihadi tech bureau recommends non-US crypto tools https://web.archive.org/web/20160303095904/http://www.dailydot.com/politics/isis-apple-fbi-congressional-hearing-crypto-international/
#10yrsago Apple v FBI isn’t about security vs privacy; it’s about America’s security vs FBI surveillance https://www.wired.com/2016/03/feds-let-cyber-world-burn-lets-put-fire/
Upcoming appearances (permalink)

- Victoria: 28th Annual Victoria International Privacy & Security Summit, Mar 3-5 https://www.rebootcommunications.com/event/vipss2026/
- Victoria: Enshittification at Russell Books, Mar 4 https://www.eventbrite.ca/e/cory-doctorow-is-coming-to-victoria-tickets-1982091125914
- Barcelona: Enshittification with Simona Levi/Xnet (Llibreria Finestres), Mar 20 https://www.llibreriafinestres.com/evento/cory-doctorow/
- Berkeley: Bioneers keynote, Mar 27 https://conference.bioneers.org/
- Montreal: Bronfman Lecture (McGill), Apr 10 https://www.eventbrite.ca/e/artificial-intelligence-the-ultimate-disrupter-tickets-1982706623885
- Berlin: Re:publica, May 18-20 https://re-publica.com/de/news/rp26-sprecher-cory-doctorow
- Berlin: Enshittification at Otherland Books, May 19 https://www.otherland-berlin.de/de/event-details/cory-doctorow.html
- Hay-on-Wye: HowTheLightGetsIn, May 22-25 https://howthelightgetsin.org/festivals/hay/big-ideas-2
Recent appearances (permalink)
- Tanner Humanities Lecture (U Utah) https://www.youtube.com/watch?v=i6Yf1nSyekI
- The Lost Cause https://streets.mn/2026/03/02/book-club-the-lost-cause/
- Should Democrats Make A Nuremberg Caucus? (Make It Make Sense) https://www.youtube.com/watch?v=MWxKrnNfrlo
- Making The Internet Suck Less (Thinking With Mitch Joel) https://www.sixpixels.com/podcast/archives/making-the-internet-suck-less-with-cory-doctorow-twmj-1024/
- Panopticon :3 (Trashfuture) https://www.patreon.com/posts/panopticon-3-150395435
Latest books (permalink)
- "Canny Valley": a limited-edition collection of the collages I create for Pluralistic, self-published, September 2025 https://pluralistic.net/2025/09/04/illustrious/#chairman-bruce
- "Enshittification: Why Everything Suddenly Got Worse and What to Do About It," Farrar, Straus and Giroux, October 7, 2025 https://us.macmillan.com/books/9780374619329/enshittification/
- "Picks and Shovels": a sequel to "Red Team Blues," about the heroic era of the PC, Tor Books (US), Head of Zeus (UK), February 2025 (https://us.macmillan.com/books/9781250865908/picksandshovels)
- "The Bezzle": a sequel to "Red Team Blues," about prison-tech and other grifts, Tor Books (US), Head of Zeus (UK), February 2024 (thebezzle.org)
- "The Lost Cause": a solarpunk novel of hope in the climate emergency, Tor Books (US), Head of Zeus (UK), November 2023 (http://lost-cause.org)
- "The Internet Con": a nonfiction book about interoperability and Big Tech, Verso, September 2023 (http://seizethemeansofcomputation.org). Signed copies at Book Soup (https://www.booksoup.com/book/9781804291245)
- "Red Team Blues": "A grabby, compulsive thriller that will leave you knowing more about how the world works than you did before." Tor Books (http://redteamblues.com)
- "Chokepoint Capitalism: How to Beat Big Tech, Tame Big Content, and Get Artists Paid," with Rebecca Giblin, on how to unrig the markets for creative labor, Beacon Press/Scribe, 2022 https://chokepointcapitalism.com
Upcoming books (permalink)
- "The Reverse-Centaur's Guide to AI," a short book about being a better AI critic, Farrar, Straus and Giroux, June 2026
- "Enshittification: Why Everything Suddenly Got Worse and What to Do About It" (the graphic novel), First Second, 2026
- "The Post-American Internet," a geopolitical sequel of sorts to Enshittification, Farrar, Straus and Giroux, 2027
- "Unauthorized Bread": a middle-grades graphic novel adapted from my novella about refugees, toasters and DRM, First Second, 2027
- "The Memex Method," Farrar, Straus and Giroux, 2027
Colophon (permalink)
Today's top sources:
Currently writing: "The Post-American Internet," a sequel to "Enshittification," about the better world the rest of us get to have now that Trump has torched America (1020 words today, 41284 total)
- "The Reverse Centaur's Guide to AI," a short book for Farrar, Straus and Giroux about being an effective AI critic. LEGAL REVIEW AND COPYEDIT COMPLETE.
- "The Post-American Internet," a short book about internet policy in the age of Trumpism. PLANNING.
- A Little Brother short story about DIY insulin. PLANNING.

This work – excluding any serialized fiction – is licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to pluralistic.net.
https://creativecommons.org/licenses/by/4.0/
Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.
How to get Pluralistic:
Blog (no ads, tracking, or data-collection):
Newsletter (no ads, tracking, or data-collection):
https://pluralistic.net/plura-list
Mastodon (no ads, tracking, or data-collection):
Medium (no ads, paywalled):
Twitter (mass-scale, unrestricted, third-party surveillance and advertising):
Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):
https://mostlysignssomeportents.tumblr.com/tagged/pluralistic
"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla
READ CAREFULLY: By reading this, you agree, on behalf of your employer, to release me from all obligations and waivers arising from any and all NON-NEGOTIATED agreements, licenses, terms-of-service, shrinkwrap, clickwrap, browsewrap, confidentiality, non-disclosure, non-compete and acceptable use policies ("BOGUS AGREEMENTS") that I have entered into with your employer, its partners, licensors, agents and assigns, in perpetuity, without prejudice to my ongoing rights and privileges. You further represent that you have the authority to release me from any BOGUS AGREEMENTS on behalf of your employer.
ISSN: 3066-764X
