Posted by amarsahinovic 1/10/2026

AI is a business model stress test (dri.es)
341 points | 339 comments
renjimen 1/10/2026|
> You can't prompt 99.95% uptime on Black Friday. Neither can you prompt your way to keeping a site secure, updated, and running.

Uh, yeah you can. There's a whole DevOps ecosystem of software and cloud services (accessible via infrastructure-as-code) that your agents can use to do this. I don't think businesses that specialize in ops are safe from downsizing.
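
To make the infrastructure-as-code point concrete: this is a toy sketch (all resource names invented, no real provider's API) of the declare/diff/apply loop that tools like Terraform are built around - a structured, machine-readable interface that an agent can drive just as easily as a human can:

```python
# Toy illustration of infrastructure as code: desired state is declared as
# data, and a reconciler converges actual state toward it. Real tools follow
# the same declare/diff/apply loop; resource names here are made up.

desired = {
    "web-1": {"type": "vm", "size": "small"},
    "web-2": {"type": "vm", "size": "small"},
    "lb-1": {"type": "load_balancer", "backends": ["web-1", "web-2"]},
}

def plan(actual, desired):
    """Diff actual vs. desired state into create/update/delete actions."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

def apply(actual, actions):
    """Apply the planned actions, returning the new actual state."""
    actual = dict(actual)
    for op, name, spec in actions:
        if op == "delete":
            del actual[name]
        else:
            actual[name] = spec
    return actual

# One undersized VM exists; the plan resizes it and creates the rest.
actual = {"web-1": {"type": "vm", "size": "tiny"}}
actions = plan(actual, desired)
actual = apply(actual, actions)
assert actual == desired
```

The whole point of the declarative style is that the "what should exist" part is plain data, which is exactly the kind of artifact an LLM is good at emitting and editing.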

porkloin 1/11/2026|
Yep - exactly. Ops isn't immune to LLMs stealing your customers. Given that most of the "open source product with premium hosting" models are just reselling hyperscaler compute at a huge markup, the customers are going to realize pretty quickly that they can use an LLM to set up some basic devops and get the same uptime. Most of these companies are offering a middleman service that becomes a bad deal the moment the customer has access to expertise they previously lacked.

I also think he's glossing over the fact that one of the reasons companies choose to pay for "ops" to run their software is that it's built by amateurs or amateurs-playing-professional and runs like shit. I happen to know this firsthand from years of working at a company selling hosting and ops for the exact same CMS that Dries' business hosts (Drupal, a PHP-based CMS), and the absolute garbage that some people are able to put together in frameworks like WordPress and Drupal is truly astounding. I'm not even talking about the janky local businesses where their nephew who was handy with computers made them a WordPress site - big multinational companies have sites in these frameworks that can barely handle 1x their normal traffic and more or less explode at 1.5x.

The business of hosting these customers' poorly optimized garbage remains a big business. But we're entering an era where the people who produce poorly optimized software have a path other than throwing it at a SaaS platform that can, through sheer force of will, make their lead-weight airplane fly. They can spend orders of magnitude less money to have an LLM make the software actually not run like shit in the first place. Throwing scaling at the problem of 99.95% uptime is a blunt instrument that only works if the person paying doesn't have the time, money, or knowledge to do it themselves.

Companies like these (including the one I work for currently) are absolutely going to get squeezed from both directions. The ceiling is coming down as more realize they can do their own devops, and the floor is rising as customer code quality gets better. Eventually you have to try your best to be 3 ft tall instead of 6.

leosanchez 1/11/2026||
I wonder how much impact shadcn had on their business.
keeda 1/10/2026||
One of the biggest shortcomings of Open Source was that it implicitly defaulted to a volunteer model and so financing the work was always left as an exercise for the reader.

Hence (as TFA points out) open source code from commercial entities was just a marketing channel and a source of free labor... err, community contributions... for auxiliary offerings that actually made money. This basic economic drive is totally natural, but it has repeatedly created dynamics that lead to suboptimal behavior and controversy.

For instance, a favorite business model is charging for support. Another was charging for convenient packaging or hosting of an “open core” project. In either case, the incentives just didn’t align with making the software bug-free and easily usable, because that would actively hamper monetization. This led to instances of pathological behavior, like Red Hat futzing with its patches or pay-walling its source code to hamper other Linux vendors.

Then there were cases where the "open source" branding was used to gain market share, but licenses restricted usage in lucrative applications, like Sun with Java. Worse, often a bigger fish swooped in to take the code, as they were legally allowed to, and repackage it in their own products, undercutting the original owners. E.g. Google worked around Sun's licensing restrictions to use Java completely for free in Android. And then, ironically, Android itself was marketed as "open source" while its licensing came with its own extremely onerous restrictions to prevent true competition.

Or all those cases when hyperscalers undercut the original owners’ offerings by providing open source projects as proprietary Software as a Service.

All this in turn led to all sorts of controversies, like lawsuits or companies rug-pulling their communities with a license change.

And aside from all that, the same pressures regularly led to the “enshittification” of software.

Open Source is largely a socialist (or even communist) movement, but businesses exist in a fundamentally capitalistic society. The tensions between those philosophies were inevitable. Socialists gonna socialize, but capitalists gonna capitalize.

With AI, current OSS business models may soon be dead. And personally, to the extent they were based on misaligned incentives or unhealthy dynamics, I say: good riddance!

Open Source itself will not go away, but it will enter a new era. The cost of code has dropped so much that monetizing it will be hard. But by the same token, having invested far fewer resources in creating it, people will be more inclined to release their code for free. A lot of it will be slop, but the quantity will be overwhelming.

It’s not clear how this era will pan out, but interesting times ahead.

kachapopopow 1/10/2026||
I know for a fact that all SOTA models have Linux source code in them, intentionally or not, which means they should follow the GPL license terms and open-source the parts of the models that are derivative works of it.

Yes, this is indirectly hinting that during training the GPL-tainted code touches every single floating point value in a model, making it a derivative work - even the tokenizer isn't immune to this.

ronsor 1/10/2026||
> the tokenizer isn't immune to this

A tokenizer's set of tokens isn't copyrightable in the first place, so it can't really be a derivative work of anything.

kachapopopow 1/10/2026||
The GPL, however, does put restrictions on it, even the tokenizer. It was specifically crafted so that even if you do not have any GPL-licensed source code in your project, if it was built on top of GPL code you are still bound by the GPL's limitations.

The only reason usermode is not affected is because there's an exclusion for it, and only via a defined communication protocol; if you go around it or attempt to put a workaround in the kernel, guess what: it still violates the license. Point is: it is very restrictive.

ronsor 1/10/2026||
> The GPL, however, does put restrictions on it, even the tokenizer. It was specifically crafted so that even if you do not have any GPL-licensed source code in your project, if it was built on top of GPL code you are still bound by the GPL's limitations.

This is not how copyright law works. The GPL is a copyright license, as stated by the FSF. Something which is not subject to copyright cannot be subject to a copyright license.

kachapopopow 1/10/2026||
The GPL is not only a copyright license; it also covers multiple types of intellectual property rights. Especially when you consider GPLv3, which has explicit IP protection while GPLv2's is implicit - so yeah, you're partially right for GPLv2 and wrong for GPLv3.
ronsor 1/10/2026||
It's true that GPLv3 covers patents, but it is still primarily a copyright license.

The tokenizer's tokens aren't patented, for sure. They can't be trademarked (they don't identify a product or service). They aren't a trade secret (the data is public). They aren't copyrighted (not a creative work). And the GPL explicitly preserves fair use rights, so there are no contractual restrictions either.

A tokenizer is effectively a list of the top-n most common byte sequences. There's simply no basis in law for it to be subject to copyright or any other IP law in the average situation.
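
As a rough illustration of that claim (a sketch, not any particular tokenizer's actual training code), the core statistic behind a byte-pair-encoding token list is just a frequency count over the corpus:

```python
from collections import Counter

def most_common_pairs(corpus: bytes, n: int):
    """Count adjacent byte pairs and return the n most frequent ones -
    the merge-selection step at the heart of BPE tokenizer training."""
    pairs = Counter(zip(corpus, corpus[1:]))
    return [pair for pair, _ in pairs.most_common(n)]

# "lo" and "ow" dominate this tiny corpus, so BPE would merge them first.
pairs = most_common_pairs(b"low lower lowest", 2)
```

The resulting table is determined by the corpus statistics, not by any creative choice of the tokenizer's author - which is the basis for the argument that it isn't copyrightable expression.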

kachapopopow 1/10/2026||
I mean, okay, sure, there is no legal framework for tokenizers, but what about the rest of the model? I think there is a much stronger argument there. And you could realistically extend the logic that if the model is GPL-2.0 licensed, you have to provide all the tools to replicate it, which would include the tokenizer.
chaos_emergent 1/11/2026||
When you say “in” them, are you referring to their training data, or their model weights, or the infrastructure required to run them?
kachapopopow 1/11/2026||
GPL can be considered like a virus: something based on GPL-licensed code (unless explicitly excluded by the license) is itself GPL-licensed, so the 'injected' training data becomes GPL-licensed, which means the model weights created from it should, in theory, also become GPL-licensed.
MangoCoffee 1/10/2026|
>Open Source was never the commercial product. It's the conduit to something else.

This is correct. If you open source your software, then why are you mad when companies like AWS, OpenAI, etc. make tons of money?

Open Source software is always a bridge that leads to something else to commercialize. If you want to sell software, then pick Microsoft's model and sell it as closed source. If you're upset about not making enough money to sustain your open source project, then pick the right license for your business.

jeroenhd 1/10/2026|
> then pick the right license for your business

That's one of the issues with AI, though; strongly copylefted software suddenly finds itself unable to enforce its license because "AI" gets a free pass on copyright for some reason.

Dual-licensing open source with business-unfriendly licensing used to be a pretty good way to sell software, but thanks to the absurd legal position AI models have managed to squeeze themselves into, that stopped in an instant.

zephen 1/11/2026||
Open source software helped to dramatically reduce the cost of paid software, because there is now a minimum bar of functionality you have to clear in order to sell software.

And, in many cases, you had to produce that value yourself. GPL licensing lawsuits ensured this.

AI extracting value from software in such a way that the creators can no longer take even the small scraps they were willing to live on seems likely to change this dynamic.

I expect no-source-available software (including shareware) to proliferate again, to the detriment of open source.