
Posted by zoobab 10/28/2025

The human only public license (vanderessen.com)
149 points | 123 comments
tptacek 10/28/2025|
Two questions:

1. Does an AI "reading" source code that has been otherwise lawfully obtained infringe copyright? Is this even enforceable?

2. Why write a new license rather than just adding a rider to the AGPL? This license is missing the language the AGPL uses to cover usage (rather than just copying) of software.

tpmoney 10/28/2025||
> Does an AI "reading" source code that has been otherwise lawfully obtained infringe copyright?

To the extent that this has been decided under US law, no. AI training on legally acquired material has been deemed fair use.

giancarlostoro 10/28/2025|||
At first I was going to comment on how much I personally avoid the AGPL, but now you've got me thinking: technically, any LLM training on AGPL code, or even GPL or similarly licensed code, is very likely violating those licenses regardless of how it is worded. The GPL already makes it so you cannot translate the code to another programming language to circumvent the license, if I remember correctly. The AGPL should have a similar clause.
LordDragonfang 10/28/2025|||
> The GPL already makes it so you cannot translate to another programming language to circumvent the license

The operative words are the last four there. GPL, and all other software licenses (copyleft or not), can only bind you as strongly as the underlying copyright law. They are providing a copyright license that grants the licensee favorable terms, but it's still fundamentally the same framework. Anything which is fair use under copyright is also going to be fair use under the GPL (and LLMs are probably transformative enough to be fair use, though that remains to be seen.)

tpmoney 10/28/2025||
> and LLMs are probably transformative enough to be fair use, though that remains to be seen.

Arguably, at least in the US, it has been seen. Unless someone comes up with a novel argument not already advanced in the Anthropic case about why training an AI on otherwise legally acquired material is not transformative enough to be fair use, I don't see how you could read the ruling any other way.

1gn15 10/28/2025||
I think people are holding on to hope that it gets appealed. Though you're right, the gavel has already fallen; training is fair use.
Zambyte 10/28/2025|||
If LLM training violates AGPL, it violates MIT. People focus too much on the copyleft terms of the *GPL licenses. MIT, and most permissive licenses, require attribution.

Honestly with how much focus there tends to be on *GPL in these discussions, I get the feeling that MIT style licenses tend to be the most frequently violated, because people treat it as public domain.

giancarlostoro 10/28/2025||
This is a good callout. What would it fundamentally change, though? MIT is a few hairs away from just publishing something into the public domain, is it not? There's the whole "no warranty or liability if this code blows up your potato" bit of the MIT license, but good luck reverse engineering from the LLM which project was responsible for your vibe-coded exploding potato.
fwip 10/28/2025|||
To point one: Normally, no. However, this license does not ask that question, and says that if you let an AI read it, your license to use the software is now void.
tptacek 10/28/2025||
Can you actually do that in US law?
fwip 10/28/2025||
I definitely don't know enough to say either way. On the one hand, general contract law seems to say that the terms of a contract can be pretty much anything as long as it's not ambiguous or grossly unfair. On the other, some real lawyers even have doubts about the enforceability of some widely used software licenses. So I could see it going either way.
maxrmk 10/28/2025||
Do you think there _should_ be a legal mechanism for enforcing the kind of rules they're trying to create here? I have mixed feelings about it.
zkmon 10/28/2025||
The challenge would be with detecting violations and enforcing the rules.
warpspin 10/29/2025||
Seems somebody volunteered for the AI's torment nexus!
rgreekguy 10/28/2025||
But my definition of "human" might differ from yours!
ferguess_k 10/28/2025||
Man, you are thinking about using the law as your weapon. I don't want to disappoint you, but those companies/people control the lawmakers. You can't fight armies of lawyers in court.
TechSquidTV 10/28/2025||
This was my favorite fallacy of Web3. "But look I have proof the government stole from me!", man you think they care?
lopsidedmarble 10/28/2025||
Ah yes, there's that craven willingness to abandon your own best interests to your oppressor that HN fosters and loves so much.

Demand better protections. Demand better pay.

Demand your rights. Demand accountability for oppressors.

pessimizer 10/28/2025||
The goofy thing is to think that you're the first person to have made a "demand" and that anyone cares about your "demand." The reason people are oppressed is not because they have failed to make a request not to be.

Real "let me speak to your manager" activism. You have to have been sheltered in a really extreme way not only to say things like this, but to listen to it without laughing.

Here's some unrequested advice: the way to make simple people follow you is to make them feel like leaders among people they feel superior to, and to make them feel like rebels among people they feel inferior to. Keep this in mind and introspect when you find yourself mindlessly sloganeering.

lopsidedmarble 10/28/2025|||
> The goofy thing is to think that you're the first person to have made a "demand" and that anyone cares about your "demand."

Unsure who you are addressing, but clearly it's someone other than me.

Did you see where the OP implied that any activism is useless? Got any harsh words for that philosophy?

bigfishrunning 10/28/2025|||
You know what, pessimizer, you're right. We should all bow down and submit to our lord Sam Altman right away. We should shove everything we produce into his meat-grinder because there's nothing we can do about it.

The LLMs are harmful to the business of creating software. Full stop. Either we can do something about it (like expose the futility of licensing in general), or we can just die.

While I think this licensing effort is likely to be ignored, I applaud it and hope more things like this continue to be created. The silicon valley VC hose is truly evil.

Imustaskforhelp 10/28/2025||
>The idea is that any software published under this license would be forbidden to be used by AI. The scope of the AI ban is maximal. It is forbidden for AI to analyze the source code, but also to use the software. Even indirect use of the software is forbidden. If, for example, a backend system were to include such software, it would be forbidden for AI to make requests to such a system.

This is interesting, but at the same time IANAL, and I have a question about the backend system.

Suppose I have AGPL software, say a photo editing web app, and a customer takes a photo and reshapes it to get a new photo. Saying that the new photo somehow becomes part of the AGPL would be weird.

But the same thing is happening here: if a backend service uses it, what happens if someone creates a local proxy to that backend service and the AI scrapes the local proxy, or if someone copies the output and pastes it into an AI? I don't understand it, since I feel like there isn't even a proper definition of AI. Could it theoretically cover everything automated? What if it isn't the AI that directly accesses it?

Another thing is that the backend service could accept user input; think of a backend service like Codeberg / Forgejo / Gitea, etc.

If I host a git server using software licensed under HOPL, wouldn't that also inherently impose terms and conditions on the code hosted in it?

This seems like a genuinely nice idea, and I have a few interesting takes on it.

Firstly, what if I take FreeBSD, which is under the permissive BSD license IIRC, add a HOPL license to it (or its future equivalent?), and then build an operating system?

Now, technically wouldn't everything become part of this new Human Only BSD (HOB), lol? I am not sure, but this idea sounds damn fascinating: imagine a cloud where I can just change the operating system, proudly mention that it runs HOB, and it would try to enforce limits on AI.

What I am more interested in is text: can I theoretically write this comment under the Human Only Public License?

What if I create a service like Mataroa, but where a user who wants to write a blog specifies that the text itself becomes HOPL? That could ease some of their frustration about AI, knowing they are trying to combat it.

Also, I am not sure if, legally speaking, this could be done. It just seems like a way for people to legally enforce robots.txt, if it works, though I have the questions I shared above, and even more.
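For context, robots.txt today is only a voluntary convention: a site can ask crawlers to stay away, but nothing legally compels them to comply, which is exactly the gap this license tries to close. A typical AI opt-out looks something like this (GPTBot and CCBot are real crawler user-agents; whether any given crawler honors the file is entirely at its discretion):

```text
# robots.txt -- a request, not an enforceable rule

# Ask OpenAI's training crawler to stay out
User-agent: GPTBot
Disallow: /

# Ask Common Crawl's crawler to stay out
User-agent: CCBot
Disallow: /

# Everyone else may crawl normally
User-agent: *
Allow: /
```
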

It would be funny if I wrote things with AI and then released them under a HOPL license.

Something like HOPL + https://brainmade.org/ could go absolutely bonkers for making a human-interacts-with-human sort of thing, or at least trying to achieve that. It would be a fun social experiment to build a social media site around this, but as I said, I doubt it would work as more than just sending a message right now. But I may be wrong; I usually am.

ukprogrammer 10/28/2025||
nice, another stupid license for my ai dataset scrapers to ignore, thanks!
slipperybeluga 10/28/2025|
[dead]