Everything Made By an AI Is In the Public Domain
The US Copyright Office offers creative workers a powerful labor protection.

(Image: Cryteria/CC BY 3.0, modified)

Last week, a US federal judge handed America's creative workers a huge labor win: Judge Beryl A Howell of the US District Court for the District of Columbia upheld a US Copyright Office ruling that works created by "AIs" are not eligible for copyright protection. This is huge.

Some background: under US law — and under a mountain of international treaties, from the Berne Convention to TRIPS — copyright is automatically granted to creative works of human authorship "at the moment of fixation in some tangible medium." That is: as soon as a human being makes something creative and records it in some medium (a hard drive, magnetic tape, paper, film, canvas, etc), that creative thing is immediately copyrighted (the duration of that copyright varies, both by territory and by whether the creator was working on their own or for a corporation).

That means that for a work to be eligible for copyright in the USA, it must satisfy three criteria:

1. It must be creative. Copyright does not apply to non-creative works (say, a phone book listing everyone in a town in alphabetical order), even if the work required a lot of labor. Copyright does not protect effort, it protects creativity. You can spend your whole life making a phone book and get no copyright, but the haiku you toss off in ten seconds while drunk gets copyright's full protection.

2. It must be tangible. Copyright only applies to creative works that are "fixed in a tangible medium." A dance isn't copyrightable, but a video of someone dancing is, as is a written description of the dance in choreographers' notation. A singer can't copyright the act of singing, but they can copyright the recording of the song.

3. It must be of human authorship. Only humans are eligible for copyright. A beehive's combs may be beautiful, but they can't be copyrighted. An elephant's paintings may be creative, but they can't be copyrighted.
A monkey's selfie may be iconic, but it can't be copyrighted. The works an algorithm generates — be they still images, audio recordings, text, or videos — cannot be copyrighted.

For creative workers, this is huge. Our bosses, like all bosses, relish the thought of firing us all and making us homeless. You will never love anything as much as your boss hates paying you. That's why the most rampant form of theft in America is wage theft.

Just the thought of firing workers and replacing them with chatbots is enough to invoke dangerous, persistent priapism in the boardrooms of corporate America. They really want to fire us and replace us with plausible sentence generators and the world's most derivative imagery.

The fact that these apps are objectively terrible at doing our jobs doesn't enter into their calculations. Bosses believe they have a captive audience, thanks to massive market concentration, and that we will continue to pay for substandard work delivered in worse ways because they're the only game in town.

The most perfect form of American business is the airport snack-bar. The federal government has turned airport security into a multi-hour, multi-stage ordeal, such that every flier needs to arrive at the airport with hours to spare, and then trap themselves on the "safe" side of the TSA checkpoint. Any business on that side of the checkpoint has a license to print money: you can charge $12 for a 300ml bottle of water, or $18 for a saran-wrapped cheese sandwich, and people will pay for it. What choice do they have?

Add the TSA to the airport snack-bar and you have the platonic ideal of a public-private partnership, a combination of state intervention and feverish capitalism that no flier can escape.

Enshittification requires a captive audience. For executives running entertainment businesses, the temptation to lower quality and raise prices is irresistible.
If you're one of the five publishers, or four studios, or three labels, or two mobile app stores, or the single, solitary company that controls all the ebooks and audiobooks, you can sell garbage at inflated prices and people will buy it…for a while.

Our bosses will fire our asses and replace us with software, even if the software sucks at doing our jobs. They do this all the time. Remember when a human being who worked at the business you were calling answered the phone? First, they replaced those workers with contractors on the other side of the world who weren't allowed to solve your problem, only read from a script. Then they fired those humans and replaced them with "interactive voice response systems" that don't even pretend to solve our problems, only make us repeat ourselves over and over again: "Seventeen. No. Seventeen. No. Seven. Teen. No. One. Seven. Operator. Representative. Operator. Operator. Operator. Fuuuuuuuuck <click>."

As Lily Tomlin put it, "We don't have to care: we're the phone company."

They sincerely believe that we'll pay to see shitty movies written by a chatbot and performed by video-game sprites. In their ideal world, all creative workers are fired and homeless, and an exec "makes a movie" by typing some prompts into a webform, toddles off to the Smoke House for a tomahawk steak and three gin martinis, then wanders back to their office to retrieve a new feature film from their hard drive, untouched by human hands.

Studio execs love plausible sentence generators because they have a workflow that looks exactly like a writer-exec dynamic, only without any eye-rolling at the stupid "notes" the exec gives the writer. All an exec wants is to bark out, "Hey, nerd, make me another E.T., except make the hero a dog, and set it on Mars." After the writer faithfully produces this script, the exec can say, "OK, put a love interest in the second act, and give me a big gunfight at the climax," and the writer dutifully makes the changes.
This is exactly how prompting an LLM works.

A writer and a studio exec are lost in the desert, dying of thirst. Just as they are about to perish, they come upon an oasis, with a cool sparkling pool of water. The writer drops to their knees and thanks the fates for saving their lives. But then, the studio exec unzips his pants, pulls out his cock and starts pissing in the water. "What the fuck are you doing?" the writer demands. "Don't worry," the exec says, "I'm making it better."

There is only one thing our bosses love more than the thought of not paying us: getting copyright on their products. The relentless, 40-year campaign to extend copyright's duration, scope, and penalties has vastly increased the profitability and revenues of the entertainment sector (even as creative workers' wages have fallen). Studio execs and other creative industry bosses would rather drink a gallon of warm spit before breakfast, every day for the rest of their lives, than give up a single molecule of copyright.

That's where the US Copyright Office's rule that AI-created works can't be copyrighted — just affirmed by a federal judge — comes in. When faced with the choice between never paying us again and having no copyright on the things they sell, our bosses will pay us all day long.

This is amazing news. For one thing, it means that our brave, tireless creative comrades who've been picketing all summer have just gained a gigantic piece of leverage. Until last week, they were negotiating over the use of AI in movies and shows with execs who believed that they might be able to fire us, replace us with software, and still get a copyright over their products. Now, they're bargaining against execs who know that replacing creative workers with software comes at a very high price: the loss of copyright protection, which means that anyone can take that work, copy it, sell it, give it away, create sequels to it, and more.
This is amazing news because it means that our collective bargaining with our bosses has tilted in our favor, and it's only when we collectively bargain that we win.

Remember: copyright has only expanded for four decades, and the profitability of the companies that sit between us and our audiences has likewise expanded — and we've only gotten poorer. When you're a novelist bargaining with five publishing houses, or a filmmaker bargaining with four studios, or a musician bargaining with three labels, or a games author bargaining with two mobile app stores, or an audiobook creator bargaining with one audiobook store, then getting more copyright doesn't make you any better off.

Giving more copyright to a creative worker under those circumstances is like giving your bullied schoolkid extra lunch money. It doesn't matter how much lunch money you give your kid — the bullies are just going to take it. In those circumstances, giving your kid extra lunch money is just an indirect way of giving the bullies more money. Give the bullies enough money and they'll be able to afford an international ad campaign: Think of the hungry children! Give them more lunch money!

The idea of protecting creators with individual, bargainable rights reframes us not as workers but as businesses: LLCs with MFAs. Our negotiations with our bosses are B2B: just two artificial persons, each with its own EIN, facing each other down across a boardroom table. But the individual creative worker who bargains with Disney-ABC-Muppets-Pixar-Marvel-Lucasfilm-Fox is not in a situation comparable to, say, Coca-Cola renewing its sponsorship deal for Disneyland.
For an individual worker, the bargain goes like this: "We'll take everything we can, and give you as little as we can get away with, and maybe we won't even pay you that."

Removing copyright from AI-generated works means that the right to create an AI-generated work from our creative labor isn't something we can be forced to give up, because we don't own it. That right doesn't even exist.

It really doesn't exist, you know. Within the ranks of copyright scholars, there's very little controversy over the question of training an AI and copyright. It's just not a copyright infringement to make a temporary copy of a work for the purposes of subjecting it to mathematical analysis. It's not a copyright infringement to perform that analysis. It's not a copyright infringement to make a model out of that analysis.

It can be a copyright infringement to use that model. It's pretty clear that models can be used to create infringing works, but not everything a model generates is infringing, and liability for infringement lies with the user of the model, not its maker.

And yet, there are lawyers suing AI companies for hundreds of millions of dollars, on behalf of creative workers, claiming that the companies' products infringe. Do they know something the rest of copyright academia doesn't?

Here's what I think: I think these lawyers are betting that the AI companies did incredibly sleazy — and possibly illegal — things to gather, sort and filter the training data for their models. I think they're betting that AI companies — with billions in investment capital from worker-hating investors who want to put us all on the breadline — will pay gigantic sums rather than go through discovery and have their dirty secrets aired.

And you know what? Fine. I have no problem with plaintiff-side lawyers sucking hundreds of millions of dollars out of sleazy AI companies on behalf of artists.
But that doesn't mean these lawyers aren't doing something dangerous by lying to creators about how copyright works. At best, they're peddling hopium; at worst, disinformation.

Because while copyright doesn't currently let creators prevent temporary copies, mathematical analysis and model creation, we could always change copyright to do so. Every expansion of copyright over the past forty years — the expansions that made entertainment giants richer as artists got poorer — was enacted in the name of "protecting artists."

The five publishers, four studios and three record labels know that they are unsympathetic figures when they lobby Congress for more exclusive rights (doubly so right now, after their mustache-twirling denunciations of creative workers picketing outside their gates). The only way they could successfully lobby for another expansion of copyright, an exclusive right to train a model, is by claiming they're doing it for us — for creative workers.

But they hate us. They don't want to pay us, ever. The only reason they'd lobby for that new AI training right is because they believe — correctly — that they can force us to sign it over to them. The bullies want your kid to get as much lunch money as possible.

AI has lots of problems: it is an environmental disaster. It inflicts cruelty on clickworkers in the global south who vet its training data. It turbocharges monopoly. It cloaks terrible bias in a false empiricism. It is wildly unreliable. It is a stock bubble and an enshittification accelerant, marketed as a means of replacing skilled workers with broken automation whose substandard goods we will have no choice but to consume.

Creative workers should be furious with AI companies, and with our bosses who want to replace us with their products. We should be building guillotines on their lawns — and picketing in front of their gates. But we shouldn't fall into the trap of imagining ourselves to be LLCs with MFAs.
We are not corporations bargaining as equals with other parts of our supply chain. We are workers, whose power comes from collective struggle, not individual rights that can — and will — be bargained away at the first opportunity.