Posted by ripe 9/1/2025
1) High-quality training data is effectively exhausted. The next 10× scale model would need 10× more tokens than exist.
2) The Chinchilla rule. Hardware gets 2× cheaper every 18 months, but compute budgets rise 4× in that span, so every flagship LLM costs roughly 2× more than the last, while knock-off models appear years later for pennies. Benchmark gains shrink and regulation piles on. Net result: each new dollar spent on the next big LLM buys far less payoff, and the "wait-and-copy" option gets cheaper every day.
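The arithmetic behind that claim can be sketched in a few lines. The $100M base cost below is an illustrative placeholder, not a reported figure; the growth rates are the ones the comment assumes:

```python
# Sketch of the cost claim above: compute budgets grow 4x per
# 18-month generation while hardware cost-per-FLOP halves, so the
# dollar cost of each flagship training run roughly doubles.

def flagship_cost(generation, base_cost=100e6, budget_growth=4.0, hw_cheapening=2.0):
    """Dollar cost of the Nth flagship training run (generation 0 = base_cost)."""
    return base_cost * (budget_growth / hw_cheapening) ** generation

for g in range(4):
    print(f"gen {g}: ${flagship_cost(g) / 1e6:,.0f}M")
# gen 0: $100M, gen 1: $200M, gen 2: $400M, gen 3: $800M
```

The fast-follower side of the argument is the same formula run in reverse: a copier starting two generations later inherits the hardware cheapening without the budget escalation.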
But I agree with the following statement Matt Garman made recently:
Amazon Web Services CEO Matt Garman said that using AI tools in place of junior employees was "one of the dumbest things I've ever heard" because these employees are "the least expensive" and "the most leaned into your AI tools."
It's because AI usually creates slop, and without review that slop builds up. We don't have an infinite context window to solve it anyway (and even if we did, context rot has been confirmed). Also, on average, Indian non-tech employees who manage thousands of spreadsheets or manually monitor your in-store cameras are much cheaper than the "tokens" and NVIDIA GPUs you can throw at the problem, at least for now and the foreseeable future.
I don't think his point was that we should hire junior engineers because they're cheap, lean into AI, and AI produces slop. His position is not that he wants to cheaply create slop.
He wants to hire people who are cheap and love using AI because he sees that as a better long-term strategy than making senior engineers embrace AI late in their careers.
https://www.cnbc.com/2023/11/17/amazon-cuts-several-hundred-...
Of course not being able to monetise Alexa has always been a problem, but these and the article's issues are all to do with poor planning and top tier business direction.
Meanwhile, the models are getting larger and more complex, with more users, putting the support infrastructure well beyond what individuals and even small companies can afford to buy outright. You can easily spend well over a million dollars on even basic infrastructure to support some of the newer models and make them available to a few end users.
As a point of strategy for individuals and small entities, it really is cheaper in this case to spin up some AWS instances for a bit to do some LLM work and then spin them down when not in use.
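The rent-vs-buy tradeoff in the two comments above reduces to a break-even calculation. The $1M purchase price echoes the figure mentioned above, but the $50/hr instance rate is a made-up placeholder, not a real AWS quote:

```python
# Hypothetical break-even between buying GPU infrastructure outright
# and renting cloud instances on demand. All prices are illustrative.

def breakeven_hours(purchase_cost, hourly_rate):
    """Hours of rented use at which renting has cost as much as buying."""
    return purchase_cost / hourly_rate

# e.g. ~$1M of servers vs an assumed $50/hr multi-GPU instance:
hours = breakeven_hours(1_000_000, 50)
print(f"break-even at {hours:,.0f} rented hours (~{hours / 8760:.1f} years of non-stop use)")
```

For intermittent LLM work the utilization never gets close to that break-even point, which is the whole argument for spinning instances up and down.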
So if you were AWS do you mine for gold? Or do you sell shovels?
AWS, Azure, GCP weren’t just renting servers. They built whole platforms - databases, ML stacks, dev tools, security. Way more than shovels.
The moat was owning the stack. MS used Azure to power Office and now Copilot. Google used infra to juice Search, YouTube, Ads. Even Amazon used it for retail + Alexa. They were mining gold and selling shovels.
And raw compute was never where the money was. Renting VMs was the cheap layer. The profits came from all the higher level services built on top.
Now with AI it’s even more obvious:
Models drive the workloads. OpenAI/Anthropic/DeepMind aren’t just customers, they’re shaping the infra itself. Whoever owns the models sets the rules.
No models = no moat. If AWS isn’t building frontier models, it’s just reselling Nvidia GPUs while MS + Google wrap their clouds around first party models + SDKs. That pulls customers deeper into their stacks, not Amazon’s.
Falling behind compounds. Training/deploying models forces infra breakthroughs (chips, compilers, scaling). If AWS isn't in that game, they'll eventually struggle to even run other people's models as well as rivals do.
So if Amazon “sits this one out,” it’s not just losing bragging rights. It’s giving up control of the future of compute.
But I think you are making it sound like Amazon's moat is that it came up with its own technology behind its services.
A lot of the time AWS was just grabbing popular open source software off the shelf and hosting it (e.g., RDS, EKS, etc.). Yes, there is some R&D work, but almost none of what Amazon offers is rooted in its own inventions.
The value they give you is the hosting, maintenance, and compliance of all these services. If you're paying AWS extra to host your database on RDS or your Kubernetes cluster in EKS, you're generally not paying AWS to come up with a better database than anyone else, you're just paying them to help you manage permissions, backups, replication, and other maintenance/compliance/management issues that a company needs for its internal services.
In other words, Amazon's AI customers don't need Amazon to build models. They just need Amazon to use someone else's models, host them on private enterprise compute that easily ties in to existing infrastructure, RBAC, etc, and make everything compliant and easy to maintain. A whole lot of the value is being able to answer audits with "AWS handles our database backups/data security/etc" rather than saying "we have a great ops team and here's all our proof that we handle our database backups/data security/etc properly."
I think it's actually explicitly Amazon's job to sit this one out, especially since they never successfully built a business or consumer ecosystem the way a smartphone platform or PC operating system does.
I’m not 100% convinced this is true. Additionally, I’m not convinced that a waiting pattern right now sets Amazon up for a point of no return. It seems plausible for Amazon to pull an Apple here, to wait until technology is more mature and use their unique position to provide a quality offering.
Not a whole lot in their portfolio actually has a lot of Amazon technology behind it. They've got some mild forks here and there, and they've got some stuff like Fargate that has AWS R&D work behind it but piggybacks concepts/tech stacks that definitely didn't originate from Amazon.
A lot of their value has really nothing to do with developing the underlying technology.
We saw this with crypto mining, where truckloads of expensive GPUs were dumped in the trash once proof-of-work difficulty rose so high that it was no longer worth the electricity cost to keep that generation of card running.
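The retirement point the comment describes is just revenue dropping below electricity cost. Every number below (hashrate, revenue per hash, wattage, power price) is an illustrative assumption, not a real card's figures:

```python
# Sketch of when a mining GPU gets retired: the card is trash the day
# its expected revenue falls below its daily electricity cost.

def daily_profit(hashrate, revenue_per_hash, watts, price_per_kwh):
    """Net daily profit for one card; negative means retire it."""
    revenue = hashrate * revenue_per_hash * 86_400       # hashes/sec * $/hash * sec/day
    electricity = (watts / 1000) * 24 * price_per_kwh    # kWh per day * $/kWh
    return revenue - electricity

# Early on (generous payout per hash) the card is profitable;
# after difficulty rises, the payout per hash collapses and it isn't:
print(daily_profit(hashrate=100e6, revenue_per_hash=1e-12, watts=300, price_per_kwh=0.12))
print(daily_profit(hashrate=100e6, revenue_per_hash=1e-14, watts=300, price_per_kwh=0.12))
```

Rising proof-of-work difficulty shrinks `revenue_per_hash` while the wattage stays fixed, so an entire hardware generation crosses zero at roughly the same time.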
The physical server itself would be the wooden handle, I guess.
See, that’s the problem with what Amazon has done to you. It’s always about money with you guys. Good research is about the opposite of money. The people who don’t know what that means, who can’t fathom what “the opposite of money” means without turning everything into a contrived story about money: they can’t do good R&D. Every single great R&D director will tell you this, and a bunch of people will downvote this comment who have never been in a meaningful R&D role.
A good research culture is capable of listening to broad, generalized, completely accurate criticism in public and not downvote. Downvoting is your problem guys!
OpenAI has a million little haters out there and do you know how much time their people spend downvoting comments online? Zero. And honestly they’re paid way better than the poor souls who have wound up at Amazon, so it’s really, truly the case that none of this money money money culture really adds up to much for the little guy.
If there’s any one person to point the finger at - like why does Amazon, with its vast resources and tremendous talent, produce basically zero meaningful publicly influential research - it’s Jeff Bezos. You’re talking about strategy? The guy in charge is a colossal piece of shit, with a piece of shit girlfriend and a piece of shit world view, at least as bad as Larry Ellison, whose only redeeming factor is that MacKenzie Scott is a much smarter person than he ever was.
Why wouldn't consumer AI be a natural home for Apple?
Apple is constantly under fire for being slow to AI, but if you look at the current state of AI, it feels like something Apple would never release -- the quality just isn't there. I don't necessarily think Apple only dipping their toes into AI is that poor a decision right now. They still have the ability to blow the roof off the market with agents and device integration whenever the tech is far enough along to be trustworthy to the average consumer.
So unless Apple thinks it can outcompete its BigTech competitors in something it historically hasn't done much of, best leave it to them.
This sounds like you’re either unfamiliar with what software they make or underestimate the complexity of things like a modern operating system. For example, most people would consider Swift hard, or the various Core frameworks, or things like designing a new modern file system and doing in-place migrations on a billion devices, etc.