Posted by chilipepperhott 14 hours ago
We once ran for two weeks with a "poisoned" document that was crashing any server it was uploaded to. The user kept the tab open and kept working as if nothing was happening. We eventually found the bug, but in theory it could have kept the entire cluster restarting the whole time. Apart from the electricity consumption, that would hardly have changed anything for users: load and syncing times would have been worse, but not by much.
With local-first, everything keeps working even without the server.
Here is the 2012 engine, by the way:
[0] https://apps.apple.com/us/app/reflect-track-anything/id64638...
1. No network latency: you do not have to send anything across the Atlantic.
2. You get privacy.
3. It's free: you do not need to pay any SaaS business.
An additional benefit is that scale is built in: every person runs their own setup, so no central service has to take care of everyone.
Then sure.
Some people have been running Office 2003 on everything from Windows XP to Windows 10. Assuming they bought the license 22 years ago, that's pretty cheap, probably $15 per year. As a bonus, they've never had their workflow disrupted.
While the modern world of mobile devices and near-permanent fast and reliable connectivity has brought some real advantages, it has also brought the ability for software developers to ruthlessly exploit their users in ways no-one would have dreamt of 20 or 30 years ago. Often these are pitched as if they are for the user’s benefit: a UI “enhancement” here, an “improved” feature there, a bit of casual spying “to help us improve our software and share only with carefully selected partners”, a subscription model that “avoids the big up-front cost everyone used to pay” (or some questionable logic about “CAPEX vs OPEX” for business software), or the reassurance that our startup has been bought by a competitor, but all the customers who chose our product specifically to avoid that competitor’s inferior alternative have nothing to worry about, because the new owner has no ulterior motive and will continue developing it just the way we have so far.
The truth we all know but don’t want to talk about is that many of the modern trends in software have been widely adopted because they make things easier and/or more profitable for software developers at the direct expense of the user’s experience and/or bank account.
And relatedly: if there is no service, then the service cannot fail to secure your data.
No possibility of a billing or login error where you lose access to your stuff because they think you're not current or valid when you are.
No possibility of losing access because the internet is down or your current location is geo-blocked, etc.
No possibility of your account being killed because they scanned the data and decided it contained child porn or pirated movies or software or CAD files to make guns or hate speech, etc.
Those overlap with free and privacy, but the separate point is not the money or the privacy; it's the fact that someone else can kill your stuff at any time without warning or recourse.
And someone else can lose your stuff, either directly by having their own servers broken into, or indirectly, by your login credentials getting leaked on your end.
You, on the other hand, are much more likely to do one or both of these things to yourself.
The fact that a hard drive can break and you can fail to have a backup is not remotely in the same class of problem as living at the whim of a service provider.
1. A lot of good models require an amount of VRAM that is only present in data center GPUs.
2. For models which can run locally (Flux, etc.), you get dramatically different performance between top-of-the-line cards and older GPUs. Then you have to serve different models with different sampling techniques to different hardware classes (see the sketch after this list).
3. GPU hardware is expensive and most consumers don't have GPUs. You'll severely limit your TAM if you require a GPU.
4. Mac support is horrible, which alienates half of your potential customers.
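For what it's worth, point 2 in practice ends up looking something like the sketch below: probe the available VRAM and fall back to a smaller model. This is only an illustration; the tier names and thresholds are made up, and only the torch.cuda calls are real.

    # Rough sketch: pick a model tier based on detected VRAM.
    # Tier names and thresholds are hypothetical.
    import torch

    def pick_model_tier() -> str:
        if not torch.cuda.is_available():
            return "tiny-cpu-model"      # most consumers land here
        vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
        if vram_gb >= 24:
            return "full-model"          # top-of-the-line cards
        if vram_gb >= 12:
            return "quantized-model"     # mid-range cards, different sampling settings
        return "tiny-gpu-model"          # older GPUs

Every branch here is a different model you now have to test, package, and support, which is the point.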
It's best to follow the Cursor model where the data center is a necessary evil and the local software is an adapter and visualizer of the local file system.
These are two entirely separate paradigms. In many instances it is quite literally impossible to depend on models reachable over RF, for example in an ultra-low-power forest mesh scenario.
But for consumer software that can be internet connected, data center GPU is dominating local edge compute. That's simply because the models are being designed to utilize a lot of VRAM.
Your TV likely has a good enough CPU to run a decent model for home automation. And a game console most definitely does.
I'd love to see a protocol that would allow devices to upload a model to a computer and then let it sleep until a command is received. Current AI models are really self-contained; they don't need complicated infrastructure to run them.
All the server has to do then is serve binaries, all the business logic is in the client.
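As far as I know no such protocol exists, but a minimal version would not need much: one message to park the model on the host, one message to ask for an inference when the device wakes up. Everything below (message names, framing, the thermostat example) is invented just to illustrate the shape of it.

    # Hypothetical sketch of a "park a model on a nearby computer" protocol.
    # Length-prefixed JSON headers, raw model bytes after the upload header.
    import json, socket, struct

    def send_msg(sock, msg: dict) -> None:
        data = json.dumps(msg).encode()
        sock.sendall(struct.pack(">I", len(data)) + data)

    def upload_model(host: str, port: int, model_bytes: bytes) -> None:
        with socket.create_connection((host, port)) as sock:
            send_msg(sock, {"type": "upload_model",
                            "name": "thermostat-v1",
                            "size": len(model_bytes)})
            sock.sendall(model_bytes)      # raw blob follows the header

    def infer(host: str, port: int, features: list) -> None:
        with socket.create_connection((host, port)) as sock:
            send_msg(sock, {"type": "infer",
                            "model": "thermostat-v1",
                            "input": features})
            # Response handling omitted; the device can sleep and poll later.

The host side would just be a small server that keeps the uploaded models warm and runs whichever one the request names.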
A multi-user app (and if we're talking about companies, it's multiple users by the very definition) where users are not trusted almost always needs either a central service with all the access controls, or a distributed equivalent of it (which is, indeed, very hard to implement). “Local-first” in those cases becomes less relevant; it’s more of an “on-premises/self-host” situation.
But I think while end-user non-business software can be small compared to enterprise stuff, it is still a fairly big market with lots of opportunities.
Anything that requires sharing information with other users is also a pain in the neck, as you basically need to treat your internal logic like a proprietary, potentially hostile, file format.
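Concretely, "treat it like a hostile file format" means every update that arrives from another user gets validated like a file downloaded off the internet, not trusted like an internal function call. A rough sketch, with a made-up schema:

    # Sketch: validate a remote user's update before touching local state.
    # The update schema ("set_title", "title") is hypothetical.
    def apply_remote_update(doc: dict, update: dict) -> dict:
        if not isinstance(update, dict):
            raise ValueError("update must be an object")
        if update.get("type") != "set_title":
            raise ValueError("unknown update type")
        title = update.get("title")
        if not isinstance(title, str) or len(title) > 200:
            raise ValueError("invalid title")
        new_doc = dict(doc)
        new_doc["title"] = title           # only now modify the local copy
        return new_doc

Multiply that by every operation your app supports and every old version of the data still in the wild, and the pain-in-the-neck part becomes clear.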
There are situations where it's relevant, but I don't think it's as many as you say.
Pirates are the cost of doing business. Just ignore them.
No need to make this too complicated.
Source: me, I do it this way.
There are a lot of cracking groups that circumvent license servers on day one for software that has them.
I'm sure this is the reason Adobe went to the cloud: Adobe couldn't just ignore them the way you can with other 'boxed software'.
I think this is too broad a stroke to paint with. There's local-first software that still connects to the cloud for additional features. Local-first can enable you to continue to work when offline, but the software can still be more useful when online.
But over time, and after multiple hard-to-recover incidents, they switched to the cloud.
Now that the cloud hype has died down, I don't see why subscription-based wouldn't be viable just because your product runs locally (assuming that all your competitors are already subscription-based). ZBrush started selling local-first subscriptions, so I guess we'll see soon enough whether that works out for them.
My understanding was that they switched to being centralized because phones couldn't run the decentralized version.
Did I misunderstand this part? A lot of local software is sold as one-time-purchase downloads.
How does the reason you provide support the idea you provide it in support of? There are an infinite number of things that are sold as single purchases that you buy by just navigating to a website where you make the purchase.
There are an infinite number of things that are sold as single purchases on CDs that you buy by just navigating to a website where you make the purchase.
So much truth to this post.
Paid hosted software is easier to scale financially.
Without that financial scalability, it's very hard to come up with the money to pay people to build, to support the market, etc.
Take an application like Slack and consider how to scale it local-first for a team with 1000 people.
And then consider how to coordinate deployment of new features and schema migrations…
Compare that with running a centralized server, which is going to be much easier.
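To make the schema-migration point concrete: in a local-first deployment you can never assume every client is on the latest version, so every document carries a version number and every client has to carry the entire migration chain forever. A minimal sketch, with invented versions and fields:

    # Sketch: client-side schema migrations for a local-first document.
    # Versions and field names are hypothetical.
    def _v1_to_v2(doc: dict) -> dict:
        out = {k: v for k, v in doc.items() if k != "rooms"}
        out["channels"] = doc.get("rooms", [])   # "rooms" renamed to "channels"
        out["version"] = 2
        return out

    def _v2_to_v3(doc: dict) -> dict:
        return {**doc, "threads": [], "version": 3}

    MIGRATIONS = {1: _v1_to_v2, 2: _v2_to_v3}
    CURRENT_VERSION = 3

    def migrate(doc: dict) -> dict:
        # Any client may sync a document written by a much older client,
        # so it has to be able to run every step, not just the latest one.
        while doc.get("version", 1) < CURRENT_VERSION:
            doc = MIGRATIONS[doc.get("version", 1)](doc)
        return doc

On a centralized server you run the migration once during deployment; local-first, every one of those 1000 clients has to do it correctly, on its own schedule.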
Bars.