When the UK High Court dropped its November 2025 judgment in Getty Images v. Stability AI, the tech world briefly stopped doomscrolling to witness something rare: a legal system trying to understand how machine learning actually works. That alone deserves a medal, or at least a commemorative mug.
At the heart of the case was a deceptively simple question: If an AI model looks at your images during training, is the model itself a copy of those images?
Getty screamed “YES!” Stability AI countered “Absolutely not, and also, have you seen how neural networks work?”
And the judge, Mrs Justice Joanna Smith, essentially replied: “Calm down, everyone. Let’s talk about statistics.”

The Legal Plot: When Is a Copy… Not a Copy?
Getty’s original claim was expansive—so expansive it could have doubled as a new Marvel multiverse. They argued that because Stable Diffusion was trained on Getty images, the resulting model must be an “infringing copy” of those images under UK copyright law.
That argument required the court to accept a fascinating premise: that a giant matrix of roughly 860 million floating-point numbers (the model weights) is functionally equivalent to a copyrighted photograph. It’s a bit like claiming that someone who read your book now carries a pirated copy of it inside their brain. A poetic thought, and terrifying if true.
The judge, mercifully, took a more grounded view.
She ruled that unless an AI model contains copyrighted content, it cannot be an infringing copy. And Stable Diffusion, for all its quirks, does not store or reproduce Getty’s images. It stores patterns, statistical relationships, and the faint digital equivalent of vibes.
That is not a copy.
It’s a souvenir.
The Technical Core: Neural Networks Don’t Hoard JPEGs
One delightful stretch of the ruling was the court carefully working through how machine learning training actually functions. Because unlike human memory, a diffusion model doesn’t remember what it saw, only how things tend to look.
Training a model is basically a repetitive ritual (sketched in toy code after the list):
- Show the model many images.
- Let it make terrible predictions.
- Punish the model until it becomes better at guessing.
- Repeat approximately a gazillion times.
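For the code-inclined, that ritual looks roughly like the sketch below. This is a deliberately toy version in PyTorch: the tiny denoiser, the random stand-in batch, and every size in it are illustrative assumptions, not Stability AI’s actual pipeline.

```python
# Toy diffusion-style training step, for illustration only.
# The tiny "denoiser" below is a stand-in; real Stable Diffusion
# uses a U-Net with roughly 860 million parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

denoiser = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 64))
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-4)

def training_step(images: torch.Tensor) -> float:
    """One round of the ritual: corrupt, guess, punish."""
    noise = torch.randn_like(images)      # random noise to mix in
    t = torch.rand(images.shape[0], 1)    # a random noise level per image
    noisy = (1 - t) * images + t * noise  # corrupted version of the batch
    predicted = denoiser(noisy)           # the model's (initially terrible) guess
    loss = F.mse_loss(predicted, noise)   # how wrong was the guess?
    optimizer.zero_grad()
    loss.backward()                       # "punish": nudge the weights a little
    optimizer.step()
    return loss.item()

# "Repeat approximately a gazillion times", each time on a fresh batch:
for _ in range(1000):
    batch = torch.randn(8, 64)  # stand-in for 8 flattened training images
    training_step(batch)
```

Notice what the loop keeps and what it discards: only the weights are updated, and each batch of images is thrown away at the end of every step.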
After this extended bootcamp, the original images vanish like digital mayflies. What remains is a huge network of tuned parameters—patterns, not pixels; abstractions, not JPEGs.
If you look inside the model, Getty’s photos are nowhere to be found. The model can’t “fetch” them because it never stored them. The court agreed: training involves temporary copying, but the model itself is not a copy of anything.
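You can check the “nowhere to be found” claim yourself. Here is a minimal sketch, again assuming PyTorch and a stand-in model (a real multi-gigabyte checkpoint is just more of the same):

```python
# What "looking inside" a model actually yields: named arrays of floats.
# The model here is a stand-in; a real checkpoint is simply bigger.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 64))

for name, tensor in model.state_dict().items():
    print(f"{name}: shape={tuple(tensor.shape)}, dtype={tensor.dtype}")

# Prints something like:
#   0.weight: shape=(256, 64), dtype=torch.float32
#   0.bias: shape=(256,), dtype=torch.float32
#   ...
# Parameters, not pixels: no image files, no JPEG headers, nothing to "fetch".
```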
This distinction—learning vs. reproducing—turned out to be the fulcrum of the entire case.
The One Place Getty Scored: The Watermark Fiasco
In a twist worthy of British comedy, Getty did win one point: yes, some early versions of Stable Diffusion occasionally generated images with something resembling the Getty Images watermark.
This wasn’t copyright infringement—but it was trademark infringement. A watermark is a brand identifier, not creative expression, and AI models are emphatically not allowed to cosplay corporate logos.
Stability patched the problem, the court noted the win, and the internet got a new meme: the accidental watermark tattoo.
The Bigger Legal Question (Still Unanswered): Is AI Training Legal?
Here’s the delightful paradox:
The court ruled that the model is not an infringing copy, but did not rule on whether the training process is legal under UK copyright law.
Why? Because the allegedly naughty training data was processed outside the UK.
This means the central question of our era—is training on copyrighted data without permission legal?—remains suspended in a kind of Schrödinger’s Copyright Box.
Depending on who you ask, the answer is:
• a fundamental right for technological progress,
• a parasitic extraction of creative labour, or
• something for the government to put off until at least one general election later.
Why This Ruling Matters
The High Court effectively declared:
AI models are not secret containers of pirated content. They are abstract statistical machines.
This is a substantial relief for AI developers, who can now breathe without inhaling legal dust every time they load a model.
But for creators, it’s alarming:
their work can feed an AI’s “learning” without any legal requirement for transparency, compensation, or even basic decency.
Both sides agree on one thing: the law is wildly out of date. The UK government knows this and is already exploring reforms around data mining, licensing, and transparency (for example, under the Data (Use and Access) Act 2025).
This is the real story:
A misshapen legal framework built for photocopiers is being asked to referee machine intelligence.
It’s like asking a Victorian chimney sweep to regulate drone traffic.
Conclusion: A Landmark Ruling in a Legal Era Still Under Construction
The Getty v. Stability AI ruling clarifies one crucial point:
Training an AI is not the same as embedding copyrighted works into a model.
But it leaves open the hardest question of all:
Who owns the right to use the data that trains our machines?
This case won’t be the last.
The next battles—transparency mandates, opt-out registries, collective licensing, data provenance—are already gathering momentum.
For now, AI developers celebrate, creators worry, and lawmakers sharpen their pencils. A new chapter in tech law is unfolding, one footnote at a time.
The machines may not copy, but the legal system certainly will—adapting, borrowing, and remixing as it writes the rules for our algorithmic future.
Sources:
- https://www.wiggin.co.uk/insight/getty-images-v-stability-ai-high-court-delivers-landmark-judgment/
- https://www.traverssmith.com/knowledge/knowledge-container/a-limited-win-for-the-ai-industry-no-secondary-infringement-in-landmark-uk-copyright-case/
- https://www.osborneclarke.com/insights/getty-v-stability-ai-stability-ai-generates-big-win-english-courts-landmark-first-judgment
- https://www.theguardian.com/media/2025/nov/04/stabilty-ai-high-court-getty-images-copyright


