As a developer, I built multiple iOS apps and went through two startup acquisitions with my M1 MBA as my primary computer. And the Neo is better than the M1 MBA. I edited my 30-45 minute 4K race videos in FCP on that Air just fine.
Before I was a professional software developer, I used a scrawny second-hand laptop with a Norwegian keyboard (I'm not Norwegian) because that was what I could afford: https://i.imgur.com/1NRIZrg.jpeg
This was the computer I developed PHP backends and jQuery frontends on, and where I published a bunch of projects that eventually led to my first software development job, at a startup, and to discovering HN pretty much on my first day on the job :)
The actual hardware you use seems to me to matter the least when it comes to actually being able to do things.
(Maybe the fans sometimes sound like they're a jet engine taking off…)
Finally just put an order in for a new 16" MBP M5 Max with 48GB memory only because it looks like they're going to stop supporting the Intel stuff this year and no more software updates. It'll probably be obsolete in six months with the rate things are going, but I've been averaging seven years between upgrades so it should be good!
So, the M5 with 48GB of RAM will be amazing.
Obviously the LLM inference is super heavy, but the actual work / task at hand is being executed on the device.
Those apps don’t need every single byte of memory you see in Activity Monitor to be active in RAM all of the time. The OS swaps out unused parts to the very fast SSD. If you push it so far that active pages are constantly being swapped out as apps compete then you start to notice, but the threshold for that is a lot higher than HN comments seem to think.
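That distinction between memory an app has claimed and memory that is actually resident is easy to see for yourself. A minimal sketch in plain Python on any Unix (the 256 MiB size is arbitrary, and this is illustrative, not how Activity Monitor computes its numbers): mapping anonymous memory reserves address space, but physical pages are only committed when touched.

```python
import mmap
import resource
import sys

def peak_rss_kib() -> int:
    # ru_maxrss is reported in KiB on Linux and in bytes on macOS.
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return rss // 1024 if sys.platform == "darwin" else rss

before = peak_rss_kib()

# Reserve 256 MiB of anonymous memory. This counts toward the process's
# virtual size, but no physical pages are committed until they are touched.
SIZE = 1 << 28
buf = mmap.mmap(-1, SIZE)
after_reserve = peak_rss_kib()

# Touch a single page; only now does that page actually become resident.
buf[:4096] = b"x" * 4096

print(after_reserve - before)  # far less than 262144 KiB (256 MiB)
```

The gap between the 256 MiB "used" and the near-zero change in resident size is exactly the slack the OS exploits when it swaps cold pages to the SSD.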
Can we please just move on? Maybe get your hardware checked if you’re legitimately still having these issues.
I am jealous of my wife’s 13” M5 iPad Pro though, that oled screen is gorgeous, a wonder of modern engineering.
Well, the MacBook Air was also a lot more expensive than the Steam Deck?
Could you please describe your dev process?
I just got an M5 Max with 128GB of RAM specifically to run local LLMs.
But... you can do the same exercise with a $350 windows thing. Everyone knows you can do "real dev work" on it, because "real dev work" isn't a performance case anymore, hasn't been for like a decade now, and anyone who says otherwise is just a snob wanting an excuse to expense a $4k designer fashion accessory.
IMHO the important questions to answer are business side: will this displace sales of $350 windows machines or not, and (critically) will it displace sales of $1.3k Airs?
HN always wants to talk about the technical stuff, but the technical stuff here isn't really interesting. The MacBook Neo is indeed the best laptop you can get for $600-700.
But that's a weird price point in the market right now, as it underperforms the $1k "business laptops" (to avoid cannibalizing Air sales) and sits well above the "value laptop" price range.
And the whole shittiness of the experience will distract you even when attempting real work: the horrible touchpad, the bad screen, the forced Windows updates when you're trying to start the machine to do something urgent, ads in Windows, the lack of proper programmability of Windows (unless you use WSL)... Add the fact that the toy is likely to break in a year or two. These issues exist on far more expensive Windows machines, let alone a $350 one.
Leaving Windows machines and OS behind for more than a decade has been a continuing breath of fresh air. I have several issues with the Apple devices and macOS (as I have with Linux too), but on the whole they are far better than Windows. The only good thing about Windows that I miss on Macs is the file explorer and window management, not sure why Apple stubbornly refuses to copy those.
If Windows/Linux/x86 is non-negotiable and that’s your budget, I would never in a million years recommend anything brand new. This is when you go pick up a $350 used midrange ThinkPad on eBay. It won’t outperform a Neo in terms of CPU and battery life but I guarantee it’ll be a better experience than the garbage routinely sold at this price point.
Sigh. I mean, even absent the obvious answers[1], that's just wrong anyway. You're being a snob. Want to run WSL? Run WSL. Want to run vscode natively? Ditto. Put it on a cheap TV and run your graphical layout and 3D modelling work. I mean, obviously it does all that stuff. OBVIOUSLY, because that stuff is all cheap and easy.
All the complaining you're doing is about preference, not capability. You're being a snob. Which is hardly weird, we're all snobs about something.
But snobs aren't going to buy the Neo either. Again, the business question here is whether the $350 junk users can be convinced to be snobs for $600.
[1] "Put Linux on it", "All of your stuff is in the cloud anyway", "It's still a thousand times faster than the machine on which I did my best work", etc...
Besides, it's almost starting to feel like something LLMs can't replicate as easily: it's a bit harder to coax commercial LLMs into being "mean" towards people, so if someone is strongly opinionated, the comment almost feels more human. But I digress.
Then why is it OK to call other people, Mac Neo skeptics in this case, idiots?
Props for identifying the issue immediately, but armed with that knowledge, why not redo the benchmark on a different instance type that has local storage? E.g. why not try a `c8id.2xlarge` or `c8id.4xlarge` (which bracket the `c6a.4xlarge`'s cost)?
> Here's the thing: if you are running Big Data workloads on your laptop every day, you probably shouldn't get the MacBook Neo.
> All that said, if you run DuckDB in the cloud and primarily use your laptop as a client, this is a great device
It’s staggering. Jaw dropping. Bandwidth is even worse, like 10000X markup.
Yet cloud is how we do things. There’s a generation or maybe two now of developers who know nothing but cloud SaaS.
I watched everyone fall for it in real time.
If your application won't ever require more resources than a single server or two, then you are better off looking at other alternatives.
The tooling — K8S with all its YAML, Terraform, Docker, cloud CLI tools, etc. — is pretty hideously ugly and complicated. I watch people struggle to beat it into shape just like they did with sysadmin automation tools like Puppet and Chef a decade or more ago. We have not removed complexity, only moved it.
The auto-scaling thing is a half-truth. It can do this if you deploy correctly, but the zero-downtime promise only holds maybe half the time. It also does this at greatly inflated cost.
Today you can scale with bare metal. Nobody except huge companies physically racks anymore. Companies like Hetzner and DataPacket have APIs to bring boxes up. There’s a delay, but you solve that with a bit of over-provisioning. Very, very few companies have workloads so bursty and irregular that they need full, limitless up-and-down scaling. That’s one of those niche problems everyone thinks they have.
The uptime promise is false in my experience. Cloud goes down for cluster upgrades and any myriad other reasons just as often as self managed stuff. I’ve seen serious unplanned outages with cloud too. I don’t have hard numbers but I would definitely wager that if cloud is better for uptime at all it’s not enough of an improvement to justify that gigantic markup.
For what cloud charges I should, as the deploying user, receive five nines without having to think about it ever. It does not deliver that, and it makes me think about it a lot with all the complexity.
The only technical promise it makes good on, and it does do this well, is not losing data. They’ve clearly put more thought into that than any other aspect of the internal architecture. But there’s other ways to not lose data that don’t require you to pay a 10X markup on compute and a 10000X markup on transfer.
I think the real selling point of cloud is blame.
When cloud goes down, it’s not your fault. You can blame the cloud provider.
IT people like it, and it’s usually not their money anyway. Companies like it. They’re paying through the nose for the ability to tell the customer that the outage is Amazon’s fault.
Cloud took over during the ZIRP era anyway, when money was infinite. If you have growth, raise more. COGS doesn’t matter.
Maybe cloud is ZIRPslop.
I totally understand if you need to compile for iPhones. We need to make apps for the lower- and middle-class people who think a $40/mo cellphone is a status symbol. I get it.
But if you are not... why? I hate Windows, but we have Fedora... and you get an Nvidia. Is it just a status symbol? And I have a hard time believing people who tell me stories about low power consumption, because no one cared about that until Apple pretended people did.
That’s because battery life was pretty mediocre across the board, with Apple occasionally squeaking out a bit of an upper hand on the Air. Most laptops were in the same boat, aside from gaming and workstation laptops but battery life has never been the point of those.
That changed dramatically with the M-series Macs. People didn’t start caring because Apple did, but because it meant no longer being tethered to a wall, being able to do a lot of outings without a brick or charger cable at all, and on extended trips being able to get by with a little phone charger instead of the usual huge ungainly brick.
One of the primary objectives of a laptop is portability, and long battery life is an objective upgrade in that category. Not everybody needs it but for those who do it’s difficult to give up once you’ve had it.
EDIT: Another advantage of that higher efficiency is that MacBooks can run at full performance unplugged without obliterating battery life. x86 laptops universally throttle when untethered, and while this can be disabled, they then burn through their batteries much more quickly.
I think most people who are so wowed by Macs just bought a garbage Windows machine before (e.g. almost anything from Asus or Acer) and then splurged on a nice one, so obviously it's so much better in comparison.
The smaller 13” Air also gets similar numbers despite its smaller battery, which is a big deal for people who don’t want to lug around a 15/16” laptop.
I haven't packed a charger for the day for 3 years. I can work in coffee shops or on the couch for over 6 hours without even thinking about charging. I'm sorry but if you haven't tried the M* macbooks you don't know what you're criticising.
On the other hand, every Mac I've used over the past 15 years has been bulletproof. It turns on, it works, it runs *nix. It's an invisible interface to getting work done.
Use Fedora. It's up to date.
Note that Fedora is NOT Arch.
And MacBooks also have better displays and build quality. Like, the touchpad is still hit or miss on any non-Apple device.
I’m not surprised that the people in your social circles don’t care about being able to use a portable computer outside of their bedrooms. I assure you that in the real world battery life is front of mind for a great many people.
I’d also suggest not leaning so heavily on “but you get Nvidia!” when the crux of your gripe seems to be about brand hype. You are chasing a brand. You clearly do not know the first thing about the GPU/AI capabilities of the computers that you seem to hate so much.
I'm currently in the market for a new laptop, because I need way more GPU power than my T470 provides, and to be honest the MacBook Pros are quite competitively priced compared to the P-series ThinkPads with Nvidia cards (both around €3000). They also finally offer a matte screen option.
The only thing holding me back right now is the soldered SSD and RAM (and the shitty Linux support).
It was quite nice being able to upgrade the RAM and SSD and replace the battery on it. Otherwise it wouldn't have lasted for 9 years.
Having said that, DuckDB is awesome. I recently ported a 20-year-old Python app to modern Python. I made the backend swappable: Polars or DuckDB. Got a 40-80x speed improvement. Took 2 days.
Did a PoC on an AWS Lambda for data that was gzipped in an S3 bucket.
It was able to replace about 400 C# LoC with about 10 lines.
Amazing little bit of kit.
That couldn't be more accurate