Posted by todsacerdoti 6/25/2025

Define policy forbidding use of AI code generators (github.com)
551 points | 413 comments
flerchin 6/26/2025
I suppose the practical effect will be that contributors who use AI will have to defend their code as if they did not. To me, this implies more ownership of the code and deep understanding of it. This exchange happens fairly often in PRs I'm involved with:

"Why did you do this insane thing?"

"IDK, claude suggested it and it works."

saurik 6/27/2025
As someone who once worked on a product that had to carefully walk the line of legality, I haven't seen any mention in this discussion of what I imagine is a key problem for qemu that doesn't face other projects: as an emulator, it is already under a lot of legal scrutiny, and so it is going to need to be far more conservative than some random project about increasing its legal risk.
maerF0x0 6/27/2025
I was literally musing on a parallel subject yesterday morning: how engineers kind of did this to themselves with open source. LLM code generators would probably not be possible without the large corpus of freely readable, mostly valid code hosted in places like github.com.

It strikes me that open source was inevitable, as companies and engineers found it economical (a good choice) to amortize the cost of building something across many of them, with less legality/negotiation/constraint than if they had done so collectively in a closed-source way. It was kind of a goodwill community thing. For companies, by not competing on things like a JavaScript framework and instead competing on the product features themselves. For engineers, by allowing themselves access to that same code across many employers.

Now that gambit is approaching the point where most projects that could be assembled entirely from FOSS (and there's a lot of it) are becoming easier and easier to generate. It would have taken a 20(?) person team of expensive nerds weeks or months to build a website in 1995, but now a normal individual can simply ask an LLM to make one.

Across my ~30 years in software (eep), it seems the table stakes for minimum viable software just keep growing and growing. It used to be a push to have an API; now it's the minimum viable. It used to be a push to get "live" notifications (via polling!); now it's the minimum viable to push via websockets. Etc. etc. for the surviving set of features.

abhisek 6/26/2025
> It's best to start strict and safe, then relax.

Makes total sense.

I am just wondering how we differentiate between AI-generated code and human-written code that is influenced by, or copied from, some unknown source. The same licensing problem can happen with human code as well, especially in OSS, where anyone can contribute.

Given the current usage, I am not sure if AI generated code has an identity of its own. It’s really a tool in the hand of a human.

catlifeonmars 6/26/2025
> Given the current usage, I am not sure if AI generated code has an identity of its own. It’s really a tool in the hand of a human.

It’s a power saw. A really powerful tool that can be dangerous if used improperly. In that sense the code generator can have more or less of a mind of its own depending on the wielder.

Ok I think I’ve stretched the analogy to the breaking point…

zoobab 6/26/2025
Big Tech now controls Qemu?

"Signed-off-by: Daniel P. Berrangé <berrange@redhat.com>
Reviewed-by: Kevin Wolf <kwolf@redhat.com>
Reviewed-by: Stefan Hajnoczi <stefanha@redhat.com>
Reviewed-by: Alex Bennée <alex.bennee@linaro.org>
Signed-off-by: Markus Armbruster <armbru@redhat.com>
Signed-off-by: Stefan Hajnoczi <stefanha@redhat.com>"

tqwhite 6/26/2025
I don't blame them for worrying about it. The policy should not be to forbid it but to make sure you don't leave artifacts, because I guarantee people are going to use a bot to write their code. Hell, in six months I doubt you will be able to get a code editor that doesn't use AI for code completion, at least.

Also, AI-coded programs will be copyrightable just like in the old days. You think the big corps are going to both not use bot coding and give up ownership of their code? Fat chance.

Remember the Mickey Mouse copyright extension? If the courts aren't sensible, we will have one of those the next day.

The old days ended very abruptly this time.

incomingpain 6/26/2025
Using AI code generators, I have been able to get a code base large enough that the AI was starting to make nonsense changes.

However, my overall experience has me thinking about how this is going to be a massive boon to open source. So many patches, so many new tools will be created to streamline getting new packages into repos. Everything can be tested.

Open source is going to be epically boosted now.

QEMU deciding to sit out this acceleration is crazy to me, but it's probably what will give Xen/Docker/Podman the lead.

caleblloyd 6/26/2025
Signed off mostly by people at Red Hat, which is owned by IBM, which makes Watson, which beat humans at Jeopardy in 2011.

> These are early days of AI-assisted software development.

Are they? Or is this just IBM slowly destroying another acquisition?

Meanwhile the Dotnet Runtime is fully embracing AI. People on the outside may laugh at that, but you have extremely talented engineers like Stephen Toub and David Fowler advocating for it.

So, enterprises: next time an IBM rep tries to sell you AI services, do yourself a favor and go to any number of other companies out there who are actually serious about helping you build for the future.

And since I am a North Carolina native, here’s to hoping IBM and RedHat get their stuff together.
