
Posted by amarsahinovic 1/10/2026

AI is a business model stress test (dri.es)
341 points | 339 comments
m4rtink 1/10/2026|
So AI is attempting to replace SAP as the traditional way of testing if your company is strong enough?
geoffbp 1/10/2026||
> Value is shifting to operations: deployment, testing, rollbacks, observability. You can't prompt 99.95% uptime on Black Friday. Neither can you prompt your way to keeping a site secure, updated, and running.

I agree somewhat but eventually these can be automated with AI as well.

tetha 1/10/2026||
Unless you replace the entire workforce, you'd be surprised how much organizational work and how many soft skills are involved in running infrastructure at scale.

Like sure, there is a bunch of stuff like monitoring and alerting that tells us a database is filling up its disk. This is already automated. It could also have automated remediation with tech from the 2000s and some simple rule-based systems (so you can understand why those misbehaved, instead of entirely opaque systems that just do whatever).
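For illustration, a minimal sketch of that kind of 2000s-style rule in Python; the threshold, paths, and cleanup command are purely hypothetical, not anything from the comment:

    # Simple rule: if the data disk crosses a threshold, run one predefined,
    # auditable cleanup action and tell a human what happened.
    import shutil
    import subprocess

    THRESHOLD = 0.90                    # act at 90% full (illustrative)
    DATA_PATH = "/var/lib/postgresql"   # hypothetical database volume

    def disk_usage_fraction(path: str) -> float:
        usage = shutil.disk_usage(path)
        return usage.used / usage.total

    def remediate() -> None:
        # One fixed, well-understood action: delete WAL archives older than 7 days.
        subprocess.run(
            ["find", "/var/backups/wal", "-mtime", "+7", "-delete"], check=True
        )

    if __name__ == "__main__":
        if disk_usage_fraction(DATA_PATH) > THRESHOLD:
            remediate()
            print("disk above threshold: ran cleanup rule, notify on-call")

The point is that the rule is transparent: when it misbehaves, you can read it and see why.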

The thing is, though, very often the problem isn't the disk filling up, or fixing that.

The problem is rather figuring out what silly misbehavior the devs introduced, if a PM had a strange idea they did not validate, if this is backed by a business case and warrants more storage, if your upstream software has a bug, or whatever else. And then more stuff happens and you need to open support cases with your cloud provider because they just broke their API to resize disks, ...

And don't even get me started on trying to organize access management with a minimally organized project consulting team. Some ADFS config resulting from that is the trivial part.

Culonavirus 1/10/2026|||
If "99.95% uptime on Black Friday", and "keeping a site secure, updated, and running" can ever be automated (by which I mean not a toy site and not relying on sheer luck), not only 99.99% of people in IT are out of a job, but humans as intelligent beings are done. This is such a doomsday scenario that there's not even a point in discussing it.
bowmessage 1/10/2026|||
How? I am tired of these unfounded claims. Humans can’t even keep many sites secure.
g947o 1/10/2026||
Care to provide a prompt that leads to a coding agent achieving 99.95% uptime on Black Friday, as an example?
mrgoldenbrown 1/10/2026||
Calling it a stress test seems a bit off. Would we say the invention of lightbulbs was a "stress test" for candle-related business models? Or would we just say that business models had to change in response to current events?
Twixes 1/10/2026|
Drop the "test". Just "stress" – it's cleaner.
dev1ycan 1/11/2026||
2-5 years from now, after the AI bubble bursts and they are trying to rent us $300 PCs because every component is 5x the price, we will look back at all the damage and at the copyright law that was completely bypassed and ignored when it was convenient, after all those years of claiming evil China "stole" from companies (only to then pass laws where they can virtually steal anything they want, even utilizing private repositories on GitHub that they acquired by buying the site, completely ignoring the licenses)...

Or how Meta downloaded 70TB+ of books and then got law enforcement to nuke libgen and z-lib to create a "moat". When all our tools start dying or disappearing because the developers are laid off while an AI "search engine" just regurgitates their work, THEN and only then will most people understand the mistake this was.

Let's not even begin with what Grok just recently did to women on X; it was completely unacceptable. I really, really wish the EU would grow some and take a stand. It is clear that China is just as predatory as America, and both are willing to burn it all in order to get a non-existent lead in a non-existent "technology" that snake oil salesmen have convinced 80-year-olds in government is the next "revolution".

dnw 1/10/2026||
I'd note a couple of things:

Not to nitpick, but if we are going to discuss the impact of AI, then I'd argue "AI commoditizes anything you can specify" is not broad enough. My intuition is "AI commoditizes anything you can _evaluate/assess_." For software automation we need reasonably accurate specifications as input, and we can more or less predict the output. We spend a lot of time managing the ambiguity on the input side. With AI, that is flipped.

In AI engineering you can move the ambiguity from the input to the output. For problems where there is a clear and cheaper way of evaluating the output, the trade-off of moving the ambiguity is worth it. Sometimes we have to reframe the problem as an optimization problem to make it work, but it's the same trade-off.
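A toy sketch of that trade-off in Python: the generator's output is ambiguous, but as long as evaluation is cheap and well-defined you can score candidates and keep the best one. The generate() and evaluate() callables here are stand-ins, not any real API:

    # Ambiguity moves to the output: produce several candidates, then let a
    # cheap, deterministic evaluator pick. Only the evaluator has to be exact.
    from typing import Callable, List

    def best_candidate(
        generate: Callable[[str], str],    # stand-in for an LLM call
        evaluate: Callable[[str], float],  # cheap scorer, e.g. test pass rate
        prompt: str,
        attempts: int = 5,
    ) -> str:
        candidates: List[str] = [generate(prompt) for _ in range(attempts)]
        return max(candidates, key=evaluate)

The evaluator plays the role the specification used to play; that is the reframing as an optimization problem.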

On the business model front: [I am not talking specifically about Tailwind here.] AI is simply amplifying systemic problems most businesses just didn't acknowledge for a while. SEO died the day Google decided to show answer snippets a decade ago. Google as a reliable channel died the day Google started Local Services Advertisement. Businesses that relied on those channels were already bleeding slowly; AI just made it sudden.

On the efficiency front, most enterprises could have been so much more efficient if they could actually build internal products to manage their own organizational complexity. They just could not, because money was cheap so the ROI wasn't quite there, and even when the ROI was there most of them didn't know how to build a product for themselves. Just saying "AI first" is making the ROI work, for now, so everyone is talking about AI efficiency. My litmus test is fairly naive: if you are growing and you found AI efficiency then that's great (e.g. FB), but if you're not growing and the only thing AI can do for you is "efficiency" then there is a fundamental problem no AI can fix.

andrekandre 1/11/2026|

  > if you are growing and you found AI efficiency then that's great (e.g. FB), but if you're not growing and the only thing AI can do for you is "efficiency" then there is a fundamental problem no AI can fix
exactly: "efficiency" is nice to say in a vacuum, but what you really need is quality (all-round) and an understanding of your customer/market
terribleidea 1/10/2026||
Maybe they just over-hired for their business model.
browningstreet 1/10/2026||
Business & time are business model stress tests.
Schnitz 1/11/2026||
The root of the issue is that Tailwind was selling something that people can now recreate a bespoke version of in mere minutes using a coding agent. The other day I vibe coded a bespoke dependabot/renovate replacement in an hour. That was way easier than learning any of these tools and fighting their idiosyncrasies that don’t work for me. We no longer need Framer because you can prompt a corporate website faster than you can learn Framer. It is, fortunately or unfortunately, what it is and we all have to adapt.
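For what it's worth, the core of such a bespoke tool can be tiny. A rough Python sketch assuming an npm project; the update policy, test run, and PR automation are left out, and none of this is the commenter's actual code:

    # Minimal dependency-update check: list outdated npm packages.
    import json
    import subprocess

    def outdated_packages() -> dict:
        # `npm outdated --json` exits non-zero whenever something is outdated,
        # so parse stdout instead of relying on the return code.
        result = subprocess.run(
            ["npm", "outdated", "--json"], capture_output=True, text=True
        )
        return json.loads(result.stdout or "{}")

    if __name__ == "__main__":
        for name, info in outdated_packages().items():
            print(f"{name}: {info.get('current')} -> {info.get('latest')}")
            # A real tool would bump the version, run the test suite, and only
            # then open a pull request.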

I want to be clear: it sucks for Tailwind, for sure, and the LLM providers essentially found a new loophole (training) where you can smash-and-grab public goods and capture the value without giving anything back. A lot of capitalists would say it's a genius move.

antirez 1/10/2026||
> You can't prompt 99.95% uptime on Black Friday. Neither can you prompt your way to keeping a site secure, updated, and running.

This is completely wrong. Agents will not just be able to write code, like they do now, but will also be able to handle operations and security, continuously checking and improving the systems, tirelessly.

somebehemoth 1/10/2026||
And someday we will have truly autonomous cars, we will cure cancer, and humans will visit Mars.

You can't prompt this today; are you suggesting this might come literally tomorrow? In 10 years? 30? At that unknown time, will your comment become relevant?

chazhaz 1/11/2026||
The quoted comment is arguing that devops will never be promptable. Putting aside the discussion about whether or not that's true today, the argument here is that it's not likely to _never_ be possible.
gck1 1/10/2026|||
I'm working on a project now and what you're saying is already true. I have agents that are able to handle other things apart from code.

But these are MY agents. They are given access to MY domain knowledge in the way that I configured. They have rules as defined by ME over the course of multi-week research and decision making. And the interaction between my agents is also defined and enforced by me.

Can someone come up with a god-agent that will do all of this? Probably. Is it going to work in practice? Highly unlikely.

bopbopbop7 1/10/2026|||
So you think a statement about the current state of things is wrong because you believe that sometime in the future agents are going to magically do everything? Great argument!
Culonavirus 1/10/2026|||
To be able to do this requires perfect domain knowledge AND environment knowledge AND the ability to think deeply about logical dominoes (event propagation through the system; you know, the small stuff that crashes Cloudflare for the entire planet, for example).

Please wake me up when Shopify lets a bunch of agentic LLMs run their backends without human control and constant supervision.

handfuloflight 1/10/2026||
The extreme here is thinking machines will do everything. The reality is likely far closer to fewer humans being needed.
tschellenbach 1/10/2026|
They could build something like Lovable but with better design/frontend defaults.