Posted by amarsahinovic 1 day ago

AI is a business model stress test (dri.es)
327 points | 318 comments
dev1ycan 10 hours ago||
2-5 years from now, after the AI bubble bursts and they are trying to rent us $300 PCs because every component costs 5x as much, we will look back at all the damage, and at the copyright law that was bypassed and ignored whenever it was convenient, after all those years of claiming evil China "stole" from companies (only to then pass laws letting them virtually steal anything they want, even using private repositories on GitHub, acquired by buying the site, while completely ignoring the licenses)...

Or how Meta downloaded 70TB+ of books and then got law enforcement to nuke libgen and z-lib to create a "moat". When all our tools start dying and disappearing because the developers were laid off once an AI "search engine" could just regurgitate their work, THEN and only then will most people understand what a mistake this was.

Let's not even begin with what Grok just recently did to women on X; completely unacceptable. I really, really wish the EU would grow a spine and take a stand. It is clear that China is just as predatory as America, and both are willing to burn it all in order to get a nonexistent lead in nonexistent "technology" that snake-oil salesmen have convinced 80-year-olds in government is the next "revolution".

keeda 1 day ago||
One of the biggest shortcomings of Open Source was that it implicitly defaulted to a volunteer model and so financing the work was always left as an exercise for the reader.

Hence (as TFA points out) open source code from commercial entities was just a marketing channel and a source of free labor... err, community contributions... for auxiliary offerings that actually made money. This basic economic drive is totally natural, but it has created dynamics that led to suboptimal behavior and controversy time and again.

For instance, a favorite business model is charging for support. Another was charging for convenient packaging or hosting of an “open core” project. In either case, the incentives just didn’t align with making the software bug-free and easily usable, because that would actively hamper monetization. This led to instances of pathological behavior, like Red Hat futzing with its patches or pay-walling its source code to hamper other Linux vendors.

Then there were cases where the "open source" branding was used to gain market share while licenses restricted usage in lucrative applications, like Sun with Java. Worse, often a bigger fish swooped in to take the code, as it was legally allowed to, and repackaged it in its own products, undercutting the original owners. E.g. Google worked around Sun's licensing restrictions to use Java completely for free in Android. And then, ironically, Android itself was marketed as "open source" while its licensing came with its own extremely onerous restrictions to prevent true competition.

Or all those cases when hyperscalers undercut the original owners’ offerings by providing open source projects as proprietary Software as a Service.

All this in turn led to all sorts of controversies, like lawsuits, or companies rug-pulling their communities with a license change.

And aside from all that, the same pressures regularly led to the “enshittification” of software.

Open Source is largely a socialist (or even communist) movement, but businesses exist in a fundamentally capitalistic society. The tensions between those philosophies were inevitable. Socialists gonna socialize, but capitalists gonna capitalize.

With AI, current OSS business models may soon be dead. And personally, to the extent they were based on misaligned incentives or unhealthy dynamics, I say good riddance!

Open Source itself will not go away, but it will enter a new era. The cost of code has dropped so much that monetizing it will be hard. But by the same token, having invested far fewer resources in creating it, people will be encouraged to release their code for free. A lot of it will be slop, but the quantity will be overwhelming.

It’s not clear how this era will pan out, but interesting times ahead.

antirez 1 day ago||
> You can't prompt 99.95% uptime on Black Friday. Neither can you prompt your way to keeping a site secure, updated, and running.

This is completely wrong. Agents will not just be able to write code, like they do now, but will also be able to handle operations and security, tirelessly checking and improving the systems.
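
Purely as a hypothetical sketch of what "handling operations" could look like (the endpoint, the ask_agent stub, and the loop are all invented for illustration, not any real product):

    import time
    import urllib.request

    def check_health(url):
        # True if the service answers HTTP 200 within 5 seconds.
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.status == 200
        except OSError:
            return False

    def ask_agent(incident):
        # Stand-in for an LLM call that proposes a remediation
        # (restart a service, roll back a deploy, page a human...).
        raise NotImplementedError

    while True:
        if not check_health("https://shop.example/healthz"):
            plan = ask_agent("healthcheck failed for shop.example")
            # A real system would gate the plan behind policy checks
            # before executing anything.
        time.sleep(60)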

somebehemoth 1 day ago||
And someday we will have truly autonomous cars, we will cure cancer, and humans will visit Mars.

You can't prompt this today. Are you suggesting this might come literally tomorrow? In 10 years? 30? Will your comment only become relevant at that unknown point?

chazhaz 13 hours ago||
The quoted comment argues that devops will never be promptable. Putting aside the discussion about whether or not that's true today, the argument here is that it's not likely to _never_ be possible.
gck1 1 day ago|||
I'm working on a project right now where what you're saying is already true. I have agents that handle things other than code.

But these are MY agents. They are given access to MY domain knowledge, in the way that I configured it. They follow rules defined by ME over the course of weeks of research and decision-making. And the interactions between my agents are also defined and enforced by me. For concreteness, such a setup might be wired up roughly like the sketch below.
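
(The Agent structure, paths, and handoff rule here are all invented for illustration, not any real framework:)

    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        name: str
        # Curated domain knowledge this agent is allowed to read.
        knowledge_paths: list[str] = field(default_factory=list)
        # Hard rules, enforced outside the model itself.
        rules: list[str] = field(default_factory=list)

    ops = Agent(
        name="ops",
        knowledge_paths=["docs/runbooks/", "docs/architecture.md"],
        rules=[
            "never touch production without an approved ticket",
            "escalate to a human after two failed attempts",
        ],
    )
    deploy = Agent(name="deploy")

    # Inter-agent interaction is also owner-defined and enforced:
    # here, only ops may hand work to deploy.
    def handoff(src: Agent, dst: Agent, task: str) -> None:
        assert (src.name, dst.name) == ("ops", "deploy"), "not allowed"
        print(f"{src.name} -> {dst.name}: {task}")

    handoff(ops, deploy, "roll out v2 behind a feature flag")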

Can someone come up with a god-agent that will do all of this? Probably. Is it going to work in practice? Highly unlikely.

bopbopbop7 1 day ago|||
So you think a statement about the current state of things is wrong because you believe that sometime in the future agents are going to magically do everything? Great argument!
Culonavirus 1 day ago|||
Being able to do this requires perfect domain knowledge AND environment knowledge AND the ability to think deeply about logical dominoes (event propagation through the system; you know, the small stuff that crashes Cloudflare for the entire planet, for example).

Please wake me up when Shopify lets a bunch of agentic LLMs run their backends without human control and constant supervision.

handfuloflight 1 day ago||
The extreme position here is thinking that machines will do everything. The reality is likely far closer to fewer humans being needed.
kachapopopow 1 day ago||
I know for a fact that all SOTA models have Linux source code in them, intentionally or not, which means they should follow the GPL license terms and open-source the parts of the models that are derivative works of it.

Yes, I am indirectly suggesting that during training the GPL-tainted code touches every single floating-point value in a model, making the whole thing a derivative work - even the tokenizer isn't immune to this.
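
The mechanical side of that claim is easy to see: in gradient descent, a single training example typically produces a nonzero gradient for every parameter. A toy illustration with invented numbers, not a real training run:

    # One gradient-descent step on a 2-parameter model, y = w*x + b,
    # with squared loss. The single example moves BOTH parameters;
    # in a real network the same holds for (nearly) every weight.
    w, b, lr = 0.5, 0.0, 0.1
    x, y = 2.0, 3.0            # stand-in for one training example

    err = (w * x + b) - y      # prediction error
    w -= lr * err * x          # dLoss/dw = err * x
    b -= lr * err              # dLoss/db = err
    print(w, b)                # 0.9 0.2 -- every parameter changed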

ronsor 1 day ago||
> the tokenizer isn't immune to this

A tokenizer's set of tokens isn't copyrightable in the first place, so it can't really be a derivative work of anything.

kachapopopow 1 day ago||
The GPL, however, does put restrictions on it, even the tokenizer. It was specifically crafted so that even if you do not have any GPL-licensed source code in your project, if it was built on top of GPL code you are still bound by the GPL's restrictions.

The only reason usermode is not affected is that there is an explicit exclusion for it, and only via the defined communication protocol; if you go around it or attempt to put a workaround in the kernel, guess what: it still violates the license. Point is: it is very restrictive.
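
For illustration (assuming a Unix machine; os.getpid and os.uname are standard Python wrappers over syscalls), this is the kind of ordinary userspace program that exclusion covers:

    import os

    # An ordinary userspace program: it uses the GPL-licensed kernel
    # only through the defined syscall interface, which is exactly
    # the carve-out described above.
    print(os.getpid())
    print(os.uname().sysname)  # Unix-only; prints e.g. 'Linux'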

ronsor 1 day ago||
> The GPL, however, does put restrictions on it, even the tokenizer. It was specifically crafted so that even if you do not have any GPL-licensed source code in your project, if it was built on top of GPL code you are still bound by the GPL's restrictions.

This is not how copyright law works. The GPL is a copyright license, as stated by the FSF. Something which is not subject to copyright cannot be subject to a copyright license.

kachapopopow 1 day ago||
The GPL is not only a copyright license; it also covers multiple types of intellectual property rights. Especially GPLv3, which has explicit IP protections where GPLv2's are only implicit. So yeah, you're partially right for GPLv2 and wrong for GPLv3.
ronsor 1 day ago||
It's true that GPLv3 covers patents, but it is still primarily a copyright license.

The tokenizer's tokens aren't patented, for sure. They can't be trademarked (they don't identify a product or service). They aren't a trade secret (the data is public). They aren't copyrighted (not a creative work). And the GPL explicitly preserves fair use rights, so there are no contractual restrictions either.

A tokenizer is effectively a list of the top-n most common byte sequences. There's simply no basis in law for it to be subject to copyright or any other IP law in the average situation.
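
As a toy illustration of that "list of frequent byte sequences" view (a naive one-pass counting sketch, not any real tokenizer's algorithm):

    from collections import Counter

    def top_n_pairs(corpus: str, n: int) -> list[str]:
        # Count adjacent character pairs and keep the n most common
        # as candidate "tokens". Real byte-pair-encoding tokenizers
        # iterate this counting-and-merging process over huge corpora.
        pairs = Counter(corpus[i:i + 2] for i in range(len(corpus) - 1))
        return [p for p, _ in pairs.most_common(n)]

    print(top_n_pairs("the cat sat on the mat", 3))  # ['at', 'th', 'he']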

kachapopopow 1 day ago||
I mean, okay, sure, there is no legal framework for tokenizers, but what about the rest of the model? I think there is a much stronger argument there. And you could realistically extend the logic: if the model is GPL-2.0 licensed, you have to provide all the tools needed to replicate it, which would include the tokenizer.
chaos_emergent 1 day ago||
When you say “in” them, are you referring to their training data, or their model weights, or the infrastructure required to run them?
kachapopopow 5 hours ago||
The GPL can be considered like a virus: something based on GPL-licensed code (unless explicitly excluded by the license) is now GPL licensed, so the 'injected' training data becomes GPL licensed, which means that the model weights created from it should, in theory, also become GPL licensed.
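
That "virus" framing is essentially a transitive-closure claim over a dependency graph; a toy sketch of the logic, with a made-up graph and license labels:

    def gpl_tainted(node: str, deps: dict[str, list[str]],
                    license_of: dict[str, str]) -> bool:
        # A node is "tainted" if it, or anything it builds on,
        # is GPL licensed (ignoring any exclusions).
        if license_of.get(node) == "GPL":
            return True
        return any(gpl_tainted(d, deps, license_of)
                   for d in deps.get(node, []))

    deps = {"weights": ["training-data"], "training-data": ["linux-src"]}
    license_of = {"linux-src": "GPL"}
    print(gpl_tainted("weights", deps, license_of))  # True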
MangoCoffee 1 day ago|
>Open Source was never the commercial product. It's the conduit to something else.

This is correct. If you open source your software, then why be mad when companies like AWS, OpenAI, etc. make tons of money off it?

Open source software is always a bridge that leads to something else to commercialize. If you want to sell software, then follow Microsoft's model and sell your software as closed source. If you're going to get mad and cry about not making money to sustain your open source project, then pick the right license for your business.

jeroenhd 1 day ago|
> then pick the right license for your business

That's one of the issues with AI, though; strongly copylefted software suddenly finds itself unable to enforce its license because "AI" gets a free pass on copyright for some reason.

Dual-licensing open source with business-unfriendly licensing used to be a pretty good way to sell software, but thanks to the absurd legal position AI models have managed to squeeze themselves into, that stopped in an instant.

zephen 22 hours ago||
Open source software helped to dramatically reduce the cost of paid software, because there is now a minimum bar of functionality you have to clear in order to sell software.

And, in many cases, you had to produce that value yourself. GPL licensing lawsuits ensured this.

AI extracting value from software in such a way that creators can no longer take even the small scraps they were willing to live on seems likely to change this dynamic.

I expect no-source-available software (including shareware) to proliferate again, to the detriment of open source.