Posted by dahlia 12 hours ago

Is legal the same as legitimate: AI reimplementation and the erosion of copyleft(writings.hongminhee.org)
355 points | 380 comments
delichon 10 hours ago|
Imagine if the author has his way: when we have AI write software, it legally falls under the license of some other, sufficiently similar piece of software, which may or may not be proprietary. "I see you have generated a todo app very similar to Todoist. So they now own it." That does not seem like a good path either for open source software or for opening up the benefits of AI-generated software.
throwaway2027 10 hours ago||
Perhaps software patents may play an even bigger role in the future.
intrasight 10 hours ago|
Or, hopefully, even less of a role.
mbgerring 7 hours ago||
See also "A Declaration of the Independence of Cyberspace" (https://www.eff.org/cyberspace-independence), and what a goofy, naive, misguided disaster that early internet optimism turned into.

No, AI does not mean the end of either copyright or copyleft, it means that the laws need to catch up. And they should, and they will.

jongjong 4 hours ago||
There is a definite issue in terms of legitimacy and I also think there are some issues in the wording of certain open source licenses like MIT which give rights to 'Any person obtaining a copy of this software'.

Firstly, an AI agent is not a person. Secondly, the MIT license doesn't grant any rights to the code itself; it grants rights to a 'copy of the software' - that's what people receive. It says nothing about the code, and for the software it still requires attribution: attribution is required for use and distribution of the software (or parts of it), regardless of the copyright aspect. AI agents are redistributing the software, not the code.

The MIT license makes a clear distinction between code and software. It doesn't cede any rights to the code.

And then there is the spirit of copyright: it was designed to protect the financial interests of authors. The 'fair use' carve-out was meant for cases that do not have an adverse market impact on the author, and AI reimplementation clearly does have such an impact, at least in the cases highlighted in this article.

ajross 4 hours ago||
This take, which I've seen in a few different places now, seems 100% bonkers. A world where anyone can cheaply reimplement anyone else's software and use it on hardware of their own choosing in their own designs and for their own purposes is a free software utopia.

This isn't a problem, this is the goal. GNU was born when RMS couldn't use a printer the way he wanted because of an unmodifiable proprietary driver. That kind of thing just won't happen in the vibe coded future.

api 4 hours ago||
It also erodes copyright. A decent amount of commercial software can be AI cloned with no copyright violation.

A lot of SaaS too, especially if AI can run a simple deploy.

We might be approaching a huge deflationary catastrophe in the cost of a lot of software. It’s not a catastrophe for the consumer but it is for the industry.

panny 4 hours ago||
Both sides are wrong on this actually. Computer generated code has no copyright protection.

>The U.S. Copyright Office (USCO) and federal courts have consistently ruled that AI-generated works—where the expressive elements are determined by the machine, even in response to a human prompt—lack the necessary human creative input and therefore cannot be copyrighted.

All this code is public domain. Your employees can publish "your" AI generated code freely and it won't matter how many tokens you spent generating it. It is not covered by copyright.

martin-t 5 hours ago||
1) Legality and morality are obviously different and unrelated concepts. More people should understand that.

2) Copyright was the wrong mechanism to use for code from the start, LLMs just exposed the issue. The thing to protect shouldn't be creativity, it should be human work - any kind of work.

The hard part of programming isn't creativity, it's making correct decisions. It's getting the information you need to make them. Figuring out and understanding the problem you're trying to solve, whether it's a complex mathematical problem or a customer's need. And then evaluating solutions until you find the right one. (One constraint being how much time you can spend on it.)

All that work is incredibly valuable, but once the solution exists, it's much easier to copy without replicating or even understanding the thought process which led to it. But that thought process took time and effort.

The person who did the work deserves credit and compensation.

And he deserves it transitively, if his work is used to build other works - proportional to his contribution. The hard part is quantifying it, of course. But a lot of people these days benefit from throwing their hands up and saying we can't quantify it exactly so let's make it finders keepers. That's exploitation.

3) Both LLM training and LLM output are derivative works by any reasonable meaning of those words. If LLMs are not derivative works of the training data, then why is so much training data needed? Why don't they just build AI from scratch? Because they can't. They just claim they found a legal loophole to exploit other people's work without consent.

I am still hoping the legal people take the time to understand how LLMs work (and how other transformation algorithms, such as synonym replacement or c2rust, work), decide that calling it "AI" doesn't magically remove copyright, and force the huge AI companies to destroy their existing models and train new ones which respect the licenses.
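To illustrate the analogy with a toy sketch (the synonym table here is made up, not from any real tool): a trivial synonym-replacement "rewriter" can produce output that shares no literal words with its input, yet the output is still mechanically derived from that input.

```python
# Toy synonym-replacement "rewriter". The output below shares no words
# with the input, yet it is purely a mechanical transformation of it --
# it could not exist without the original text.
# The synonym table is illustrative only.
SYNONYMS = {
    "fast": "quick",
    "begin": "start",
    "error": "fault",
}

def rewrite(text: str) -> str:
    """Replace each word with a synonym when one is known."""
    return " ".join(SYNONYMS.get(word, word) for word in text.split())

print(rewrite("begin fast"))  # -> "start quick"
```

Nobody would argue that `rewrite()` "creates" anything, or that its output escapes the copyright of the input; the commenter's point is that scale and opacity alone shouldn't change that analysis.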

mfabbri77 10 hours ago||
What if someone doesn't declare that the software has been reimplemented using an LLM? Isn't it enough to simply declare that you reimplemented it without one? Good luck proving otherwise in court...

One thing is certain, however: copyleft licenses will disappear. If I can't control the redistribution of my code (through the GPL or a similar license), I will choose to develop it as closed source.

bigyabai 10 hours ago|
Arguably, the GPL has always been the wrong choice if you want to authoritatively control redistribution.
animitronix 6 hours ago|
LGPL is dead, long live the AI rewrites of your barely open source code