Posted by Amorymeltzer 1 day ago
I'm uncertain whether any M-series Mac will be performant enough (the M1/M2 Mac minis specifically), or whether there are features in the M3/M4/M5 architecture that make it worth my while to buy new.
Do these incremental updates actually bring massive gains in model performance and latency, or are the improvements just as small, or smaller?
I have an M4 and it is plenty fast enough. But honestly the local models are just not anywhere near the hosted models in quality, due to the lower parameter count, so I haven’t had much success yet.
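For context, this is roughly the kind of local setup I mean; a minimal sketch using the llama-cpp-python bindings, where the model file path and settings are placeholders rather than a recommendation. The point is just that what runs comfortably on this hardware is a small quantized model, which is where the quality gap versus hosted models comes from.

    # Minimal sketch: run a small quantized model locally on an M-series Mac.
    # Assumes the llama-cpp-python package is installed with Metal support and
    # that a GGUF model file has already been downloaded (path is hypothetical).
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/some-7b-model.Q4_K_M.gguf",  # hypothetical local file
        n_ctx=4096,        # context window
        n_gpu_layers=-1,   # offload all layers to the GPU (Metal on Apple Silicon)
    )

    out = llm("Summarise why local models lag behind hosted ones:", max_tokens=128)
    print(out["choices"][0]["text"])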
The new pretty stuff feels a lot less magical when it lags or the UI glitches out. Apple sells fluidity and a seamless user experience. They need those bug fixes and an obsessive attention to detail to deliver on what is expected of their products.
- Some developer buys a new laptop
- Developer writes software (a browser)
- When the software works "fast enough" on their new laptop, they ship it
- The software was designed to work on the dev's new laptop, not my old laptop
- Soon the software is too bloated to work on my old laptop
- So I have to buy a new laptop to run the software
Before, I'd buy a laptop because it had cool new features. But now the only reason I buy a new one is that the new software crashes from too little RAM, or runs too slowly. My old laptops work just fine. All the old apps they came with work just fine. Even new native apps work just fine. But they can't run a recent browser. And you can't do anything without a recent browser.
If our computers never got faster, we would still be able to do everything we can do today. But we wouldn't have to put down a grand every couple of years to replace a perfectly good machine.
If our computers never got faster, we would never get faster computers (obviously...) to run efficient code even faster. 3D rendering and physics simulation come to mind.
I have noticed what you mention over longer timescales (e.g. a decade). But it's mostly "flashy" software - games, trendy things... which sadly also includes many websites: the minimum RAM usage for a mainstream website tab these days seems to be around 200MB.
Anecdata: My 12 year old desktop still runs Ubuntu+latest Firefox fine (granted, it probably wouldn't be happy with Windows, and laptops are generally weaker). Counter-anecdata: A friend's Mac Pro from many years ago can't run latest Safari and many other apps, so is quite useless.
I am so fed up with hearing this. I would love to optimise my code, but management will always prioritise features over optimisation, because that is what drives sales. This happens at almost every company I've worked at.
Also, more often than not, I have a huge problem even getting things working, and I have to wrangle co-workers who cannot do basic jobs, do not write tests, and in some cases, I've found, don't even run the code before submitting PRs. That code then gets merged because "it looks good", even when there are obvious problems that I can sometimes spot from literally the other side of the room.
The solution to that is a few decades old: plug in a 3D rendering card. (Of course there's the whole system bus issue, but that's largely solved by a bigger bus rather than a faster CPU and more system memory. 3D programs requiring more CPU/memory is largely software bloat.)
A few decades ago there was a lot of research into system-level parallel processing. The idea was to just add more machines to scale up processing power (if needed). But because machines got faster, there was less need for it, so the research was mostly abandoned. We would all be using distributed OSes today if it weren't for faster machines.
Name one piece of software that won't run comfortably on my M1 MacBook Air, now 5 years old.