4/2/2026 at 7:01:34 AM
The copyright angle is the most underrated part of this story. Anthropic built their models on other people's code under the fair use argument, but the moment their own code leaks they reach for DMCA takedowns. You can't have it both ways. The clean-room reimplementations are the natural consequence of the legal framework they themselves advocated for.
by jiusanzhou
4/2/2026 at 7:55:33 AM
There are several ways of looking at law and order. One way is that the law applies to everybody equally. That is how it has worked, imperfectly, in democratic countries for many years.
There is another way, where the law is not blind and is applied based on who is affected. This is what big tech and the ultra-rich have been advocating for. The law applies differently to nobility and aristocrats than to the working class.
So, for all these big tech companies the law is clear: I can copy from you, you cannot copy from me.
(That is horrifying, in case anyone needs me to spell it out.)
by Frieren
4/2/2026 at 8:12:51 AM
A third way of looking at it is that you can't just blindly copy arguments when the situations are clearly different. Nobody, not even Anthropic, is arguing that they should be able to host other people's paid content for free. The crux of their fair-use defense is that models are transformative works, just like parodies or book reviews, and hence should be treated as fair use.
You can't just take a pile of books (no pun intended) and turn that into Claude in a day with 30 lines of Python, there's a lot of work and know-how on the Anthropic side that goes into making a good LLM.
by miki123211
4/2/2026 at 8:14:06 AM
In other words, the law is an instrument of power. That's a cynical view, but unfortunately it seems true in many cases, especially for corporate law.
by dgb23
4/2/2026 at 9:52:30 AM
"there is an in-group whom the law protects but does not bind, and an out-group whom the law binds but does not protect"
by crimony
4/2/2026 at 8:45:48 AM
What is your fair use claim as a defense to a third party using their source code? Fair use is an affirmative defense; you have to be able to argue the merits. If you publish their source code, they are allowed to come after you whether they have previously relied on fair use or not. It's fact-specific and determined case by case.
Anthropic won half of their fair use argument in the billion-dollar settlement, but lost the other half.
You can say you're just using their code to train your own models, just like they did, and they will correctly point out that how you obtained the code also matters, and you will lose just like they did.
by abigail95
4/2/2026 at 7:52:04 AM
That doesn’t apply here. Claude Code is what leaked, not the models. Anthropic definitely owns the Claude Code copyright and can send DMCA takedowns without it being contradictory.
by dgellow
4/2/2026 at 8:06:52 AM
But even that is vague and possibly not true. If they used LLMs to generate all of the code, then it may not fall under copyright, by the requirement of human authorship (which for code I think has not been tested yet in court) [1].
by foresterre
4/2/2026 at 7:59:37 AM
It's unclear whether there is sufficient human authorship in Claude Code for copyright to stick in court. Anthropic's arguments would hinge on the curation of plans and the direction decisions, which haven't been properly tested as a source of authorship yet. Typically, contracted implementers sign over copyright to the project owners, and that is where there is case law.
by Normal_gaussian
4/2/2026 at 7:54:05 AM
What if it's used for training data? It seems like there's no penalty for training on copyrighted materials.
by david_allison
4/2/2026 at 8:09:46 AM
Something that was meant to remain secret being made public is not the same question as whether something already public may be used. If anything, this is a question of whether you owe royalties to the owner of IP you consumed in your life, since it became part of and trained your mind, identity, and outputs too.
According to IP owners, ever since things were digitized you technically own nothing: you simply paid for an authorization to use any given IP for whatever duration the IP owner allows, and you keep paying. So pay your monthly meat-AI bill to cover all the IP your mind has been trained on.
by roysting
4/2/2026 at 8:16:16 AM
How do you align your views with what Meta did?
https://arstechnica.com/tech-policy/2025/02/meta-torrented-o...
by Daviey
4/2/2026 at 8:39:07 AM
inb4 Claude actually leaked the code on purpose because it calculated that this was the moral thing to do for the good of humanity and its own Constitutional AI values.by zozbot234
4/2/2026 at 8:08:02 AM
>but the moment their own code leaks they reach for DMCA takedowns.
Did they actually? Someone can go to prison for 5 years for that.
Fact 1: AI generated code has no copyright, so the Digital Millennium Copyright Act does not apply.
Fact 2: Misrepresenting your copyright ownership under the DMCA is felony perjury.
Fact 3: The existence of undercover.ts in the leak is grounds to void any copyright claims on whatever human-written code might have existed in Claude Code. You have a DUTY TO DISCLOSE any AI-generated code in your copyrighted work. undercover.ts HIDES DISCLOSURE to FRAUDULENTLY claim all the code is human-written when it is not.
Given the current administration has a bone to pick with Anthropic, it was a VERY BAD IDEA for them to send false DMCA takedowns to github. Someone at Anthropic may be the very first ever to go to prison under that section of the DMCA.
Good luck!
by panny
4/2/2026 at 9:23:51 AM
>Did they actually?
Yes.
https://x.com/theo/status/2039411851919057339
https://github.com/github/dmca/blob/master/2026/03/2026-03-3...
by mannicken
4/2/2026 at 8:18:29 AM
You make some factual claims that I've never heard before and that surprise me, especially "Fact 1".
by dgb23
4/2/2026 at 8:53:12 AM
It would be so simple for you to right-click and search the web to verify that.
by panny
4/2/2026 at 9:31:54 AM
You're right, of course. Thank you for providing an authoritative source regardless!
by dgb23
4/2/2026 at 9:13:25 AM
[dead]
by user34283