
Posted by bcye 3 hours ago

Big Data on the Cheapest MacBook (duckdb.org)
193 points | 161 comments
Robdel12 2 hours ago|
I’ve been tempted to buy one and do “real dev work” on it just to show people it’s not this handicapped little machine.

I built multiple iOS apps and went through two start up acquisitions with my M1 MBA as my primary computer, as a developer. And the neo is better than the M1 MBA. I edited my 30-45 min long 4k race videos in FCP on that air just fine.

embedding-shape 33 minutes ago||
> I built multiple iOS apps and went through two start up acquisitions with my M1 MBA as my primary computer, as a developer. And the neo is better than the M1 MBA. I edited my 30-45 min long 4k race videos in FCP on that air just fine.

Before I was a professional software developer, I used a scrawny second-hand laptop with a Norwegian keyboard (I'm not Norwegian) because that was what I could afford: https://i.imgur.com/1NRIZrg.jpeg

This was the computer I was developing PHP backends on + jQuery frontends, and where I published a bunch of projects that eventually led to me getting my first software development job, in a startup, and discovering HN pretty much my first day on the job :)

The actual hardware you use seems to me like it matters the least, when it comes to actually being able to do things.

rob 1 hour ago|||
It's starting to show its age, but I've been using a 2019 MacBook Pro with the Intel chip and 16GB of memory. Still handles multiple terminal sessions with Claude Code and Codex simultaneously, building in Xcode, running Docker in the background, etc.

(Maybe the fans sometimes sound like they're a jet engine taking off…)

Finally just put an order in for a new 16" MBP M5 Max with 48GB memory only because it looks like they're going to stop supporting the Intel stuff this year and no more software updates. It'll probably be obsolete in six months with the rate things are going, but I've been averaging seven years between upgrades so it should be good!

Robdel12 1 hour ago|||
Oh my. All I have to say is cherish the first week of your M* experience. :D When I got rid of my intel MBP (it was an i7) for my MBA it was astonishing how fast and smooth it was.

So, the m5 with 48gb of ram will be amazing.

ghaff 20 minutes ago||||
I use a 2015 MacBook Pro all the time--like right now. It does have 16GB of memory. It's what sits on my dining room table where I do most of my writing/browsing and which I take for travel. I do have an Apple Silicon MacBook Pro in my office but my downstairs "office" is a lot lighter and airier.
conception 12 minutes ago||||
Try running Teams on it though!
eru 50 minutes ago|||
Well, Claude Code and Codex should be doing most of their heavy lifting in the cloud?
amonith 38 minutes ago|||
Sort of. They have no "hands": LLMs can only respond that they want to execute a tool/command, and the harness runs it locally. So they do that a lot: reading files, searching for things, compiling projects, running tests, running other arbitrary commands, fetching stuff from the internet, etc.

Obviously the LLM inference is super heavy, but the actual work / task at hand is being executed on the device.
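The split described above can be sketched as a minimal harness loop: the model only ever emits a tool request, and local code executes it. (A hypothetical sketch with made-up tool names and dispatch; this is not Claude Code's actual protocol.)

```python
import subprocess

# Hypothetical agent harness: the model never touches the machine
# directly -- it emits tool requests, and this function runs them.
def run_tool(request: dict) -> str:
    """Execute a single tool request locally and return its output."""
    if request["tool"] == "read_file":
        with open(request["path"]) as f:
            return f.read()
    if request["tool"] == "run_command":
        result = subprocess.run(
            request["command"], shell=True,
            capture_output=True, text=True,
        )
        return result.stdout + result.stderr
    raise ValueError(f"unknown tool: {request['tool']}")

# The "model" asks to run a command; the harness is its hands.
output = run_tool({"tool": "run_command", "command": "echo hello"})
```

The inference that decides *which* tool to call happens remotely; everything `run_tool` does is local CPU and disk, which is why agent sessions still load the laptop.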

coldtea 27 minutes ago||||
The AI part yes. But they also use quite inefficient rendering on the cli.
exe34 37 minutes ago|||
yeah I run Claude Code on a 2013 MacBook Air that refuses to die, I don't think it's very compute heavy.
tjoff 1 hour ago|||
It will do real work fine. But slack and a browser will bring it to its knees.
Aurornis 1 hour ago|||
I have an older 8GB MacBook Air. This is false. I routinely have Slack, Chrome, iTerm, Visual Studio Code, and more open on it. It’s fine.

Those apps don’t need every single byte of memory you see in Activity Monitor to be active in RAM all of the time. The OS swaps out unused parts to the very fast SSD. If you push it so far that active pages are constantly being swapped out as apps compete then you start to notice, but the threshold for that is a lot higher than HN comments seem to think.

UqWBcuFx6NV4r 38 minutes ago||||
I’m sick to death of this. It’s so divorced from reality in 2026 that I see it as a lowest common denominator populist political catchphrase more than any legitimate contributor to any conversation. My min spec MacBook Pro from 6 years ago doesn’t flinch at this, and it barely flinches at a whole lot more.

Can we please just move on? Maybe get your hardware checked if you’re legitimately still having these issues.

NetMageSCW 26 minutes ago||||
You don’t have an 8GB Apple Silicon MacBook, do you? So why did you post?
boutell 1 hour ago||||
Only if you insist on running the standalone slack app for some reason. Why run one instance of Chrome when you can pay for two?
mikepurvis 1 hour ago||
I've been finding it hard to wean myself off the standalone app but another major reason to do so is opening threads in separate tabs. I find as soon as I'm involved in two or more conversations on there it's super easy to start losing track of things.
skybrian 1 hour ago|||
Maybe if you have 100 browser tabs or something silly like that?
alpaca128 57 minutes ago|||
A couple YouTube tabs are enough if you leave them running for long enough. Just one YT browser process will easily take up 1-4GB sooner or later.
NetMageSCW 25 minutes ago||
Or it won’t because Chrome and MacOS will know how much RAM is available and manage it effectively.
yohannparis 57 minutes ago||||
While I agree with your statement, I don't think judging one's way of working and using their computer was necessary.
eru 49 minutes ago|||
I could have two browser windows open in the late 1990s. I have about a thousand times as much RAM now. So even with 10x more bloat in the pages, I should be able to open 200 tabs just fine.
antonyh 12 minutes ago|||
It would have been a better fit for me than the M4 Air, I literally use it only for typing and browsing, plus a couple of Mac-only tools. Brilliant machine but complete overkill for me. It's almost tempting to switch just to get rid of the display notch.
ramgine 1 hour ago|||
I just retired my m1 air to being a server this month. They’re very capable laptops. If the neo is even comparable in spec it’s excellent for the price
wincy 49 minutes ago|||
My m1 air with 1TB ssd and 16GB of ram is a little champion, I use it during travel to play indie games like Hades II or Slay the Spire, and it works really well, better than my Steam Deck which broke. The only issue it really has is when I try to plug it into my docking station it struggles mightily with 2 2K screens and a 4K screen, so I just use my desktop in that case.

I am jealous of my wife’s 13” M5 iPad Pro though, that oled screen is gorgeous, a wonder of modern engineering.

eru 47 minutes ago||
> [...] and it works really well, better than my Steam Deck which broke.

Well, the MacBook Air was also a lot more expensive than the Steam Deck?

Rohansi 15 minutes ago||
The M1 also consumes more power.
mettamage 1 hour ago||||
I just bought a second hand M1 64GB as my main work laptop, haha. They definitely are capable laptops
Robdel12 1 hour ago|||
Yeah! My M1 air is now my iOS build server since GH actions bill macOS mins at 10x the price.
gozzoo 17 minutes ago|||
How do you use the M1 Air as an iOS build server? Is 8GB sufficient for only doing iOS builds? Do you connect to it remotely?

Could you please describe your dev process.

bryanrasmussen 1 hour ago|||
why does GH actions bill macOS minis 10X?
jzebedee 1 hour ago|||
Mins here being short for minutes, not minis.
mikepurvis 58 minutes ago||
And, presumably for a combination of the Mac build (and hardware) being of niche interest and sitting outside the standard Linux workflows so it's annoying to administer. And serving a money-making audience (iOS app devs) who have a revenue stream and see the extra CI cost as worth it.
UqWBcuFx6NV4r 36 minutes ago|||
What is a macOS mini…
NetMageSCW 24 minutes ago||
Not “mini”, “mins” -> minutes.
clouedoc 1 hour ago|||
I know it's not really related, but how did you manage to build two startups worth getting acquired in such a short period of time?
asow92 35 minutes ago|||
I'm still doing iOS dev on my 2020 M1 MBP, and it's fine! I expect that if I change out its battery and apply new thermal paste it would run for another 6 years.
raegis 2 hours ago|||
Can you say a little more about what you mean by "better"? How much faster is editing?
swiftcoder 1 hour ago||
Better in terms of raw specs. The original M1 Air also came with 8GB of RAM, and the A18 Pro in the Neo is faster than the version of the M1 that shipped in the base model Air
anthonySs 1 hour ago|||
most dev workflows from pre 2021 can probably run just fine on a NEO - i think once you get into conductor / 8 terminals with claude code territory that’s where things start to slow down

i just got an m5 max with 128gb of ram specifically to run local llms

eru 46 minutes ago||
Does Claude Code take up that many local resources? I thought the heavy lifting was in the cloud?
ajross 1 hour ago|||
> I’ve been tempted to buy one and do “real dev work” on it just to show people it’s not this handicapped little machine.

But... you can do the same exercise with a $350 windows thing. Everyone knows you can do "real dev work" on it, because "real dev work" isn't a performance case anymore, hasn't been for like a decade now, and anyone who says otherwise is just a snob wanting an excuse to expense a $4k designer fashion accessory.

IMHO the important questions to answer are business side: will this displace sales of $350 windows machines or not, and (critically) will it displace sales of $1.3k Airs?

HN always wants to talk about the technical stuff, but the technical stuff here isn't really interesting. The MacBook Neo is indeed the best laptop you can get for $6-700.

But that's a weird price point in the market right now, as it underperforms the $1k "business laptops" (to avoid cannibalizing Air sales) and sits well above the "value laptop" price range.

prmph 1 hour ago||
No, you can't do real work on a $350 windows machine. No way such a setup is suitable for anything beyond browsing a tab or two and connecting to servers using SSH.

And the whole shittiness of the experience will distract you from attempting real work: the horrible touchpad, the bad screen, the forced Windows updates when you're trying to start the machine to do something urgent, ads in Windows, the lack of proper programmability of Windows (unless you use WSL)... Add the fact that the toy is likely to break in a year or two. These issues exist on far more expensive Windows machines, let alone a $350 one.

Leaving Windows machines and OS behind for more than a decade has been a continuing breath of fresh air. I have several issues with the Apple devices and macOS (as I have with Linux too), but on the whole they are far better than Windows. The only good thing about Windows that I miss on Macs is the file explorer and window management, not sure why Apple stubbornly refuses to copy those.

cosmic_cheese 48 minutes ago|||
A lot of $350-ish Windows machines also don’t have SSDs but instead eMMC storage, which is dog slow and will make modern SSD-mandatory Windows feel even more awful to use.

If Windows/Linux/x86 is non-negotiable and that’s your budget, I would never in a million years recommend anything brand new. This is when you go pick up a $350 used midrange ThinkPad on eBay. It won’t outperform a Neo in terms of CPU and battery life but I guarantee it’ll be a better experience than the garbage routinely sold at this price point.

ajross 35 minutes ago|||
> No, you can't do real work on a $350 windows machine.

Sigh. I mean, even absent the obvious answers[1], that's just wrong anyway. You're being a snob. Want to run WSL? Run WSL. Want to run vscode natively? Ditto. Put it on a cheap TV and run your graphical layout and 3D modelling work. I mean, obviously it does all that stuff. OBVIOUSLY, because that stuff is all cheap and easy.

All the complaining you're doing is about preference, not capability. You're being a snob. Which is hardly weird, we're all snobs about something.

But snobs aren't going to buy the Neo either. Again, the business question here is whether the $350 junk users can be convinced to be snobs for $600.

[1] "Put Linux on it", "All of your stuff is in the cloud anyway", "It's still a thousand times faster than the machine on which I did my best work", etc...

NetMageSCW 14 minutes ago||
You mean that machine from 30 years ago that was running 30 year old software that has nothing in common with today’s development? And how well does Linux run on 4GB?
coldtea 38 minutes ago|||
[flagged]
joe_mamba 34 minutes ago||
delete pls
embedding-shape 29 minutes ago|||
I dunno, I'm more afraid of the syndrome where people seemingly stop reading after a word or two, instead of reading/listening to the full context and then understanding that maybe they'll explain what they mean by that later on.

Besides, almost starting to feel like something LLMs cannot replicate as easily, being strongly worded, it's a bit harder to coax commercial LLMs to be "mean" towards people, so if someone is "strongly opinionated", it almost makes the comment feel more human-like. But I digress.

NetMageSCW 27 minutes ago|||
Sometimes the truth is succinct.
joe_mamba 22 minutes ago||
If someone would call Apple customers idiots, would they get banned for breaking HN rules? After all, they were the ones who kept buying generations of overpriced overheating Intel garbage with a keyboard that would break from a speck of dust.

Then why is it OK to call other people, Mac Neo skeptics in this case, idiots?

MikeNotThePope 1 hour ago||
It’s fine if you don’t have any memory hogging apps. But as soon as you fire up a couple of demanding Docker containers you’ll feel the pain. 8GB isn’t much RAM for some applications.
NetMageSCW 13 minutes ago||
Why do you think people buying the cheapest MacBook available will be running Docker? Do you commonly run Docker containers on the cheapest Windows laptop available? Why not?
scottlamb 13 minutes ago||
> The cloud instances have network-attached disks

Props for identifying the issue immediately, but armed with that knowledge, why not redo the benchmark on a different instance type that has local storage? E.g. why not try a `c8id.2xlarge` or `c8id.4xlarge` (which bracket the `c6a.4xlarge`'s cost)?

devnotes77 59 minutes ago||
The DuckDB team benchmarked with an r7i.16xlarge which uses EBS - that's the expected bottleneck. A fairer comparison would be an i4i or c8gd with local NVMe, where you'd likely see the laptop and cloud instance much closer in practice.
montroser 2 hours ago||
This is as much an indictment of AWS compute as it is anything else.
ipython 1 hour ago||
Kinda comparing apples to oranges. AWS was using EBS and not local instance storage. So you’re easily looking at another order of magnitude latency when transmitting data over the network versus a local pcie bus. That’s gonna be a huge factor in what I assume is a heavy random seek load.
raincole 2 hours ago|||
The article is literally saying the opposite. Quote:

> Here's the thing: if you are running Big Data workloads on your laptop every day, you probably shouldn't get the MacBook Neo.

> All that said, if you run DuckDB in the cloud and primarily use your laptop as a client, this is a great device

vasco 2 hours ago|||
But AWS beat the laptop? And there's no cost to performance analysis? Yes AWS is overpriced but how do you make that conclusion from this specific article? Because network disks were slower than SSDs? AWS also has SSD instances with local storage.
api 2 hours ago||
Yeah, this is really about how ludicrously overpriced big cloud is. I’ve got a first gen M1 Max and it destroys all but the largest cloud instances (that cost its entire current market value per month!), at least in compute. It’s a laptop! A decent bare metal server in a rack will destroy any laptop.

It’s staggering. Jaw dropping. Bandwidth is even worse, like 10000X markup.

Yet cloud is how we do things. There’s a generation or maybe two now of developers who know nothing but cloud SaaS.

I watched everyone fall for it in real time.

icedchai 53 minutes ago|||
With cloud, what you're really paying for is flexibility and scalability. You might not need either for your applications. At some startups, we needed it. We sized clusters wrong, needed to scale up in hours. This is something we wouldn't ever be able to do with our own hardware without tons of lead time.

If your application won't ever require more resources than a single server or two, then you are better off looking at other alternatives.

arh5451 1 hour ago||||
I agree and disagree. The benefit with cloud is that you "don't need to manage it": automatic scaling, redundancy, automatic backups, etc. I do think you are right; in the future there will be more infrastructure as code as cost pressures become more obvious.
api 1 hour ago||
Those benefits are at least partly lies though.

The tooling — K8S with all its YAML, Terraform, Docker, cloud CLI tools, etc. — is pretty hideously ugly and complicated. I watch people struggle to beat it into shape just like they did with sysadmin automation tools like Puppet and Chef a decade or more ago. We have not removed complexity, only moved it.

The auto scaling thing is a half truth. It can do this if you deploy correctly but the zero downtime promise is only true maybe half the time. It also does this at greatly inflated cost.

Today you can scale with bare metal. Nobody except huge companies physically racks anymore. Companies like Hetzner and DataPacket have APIs to bring boxes up. There’s a delay, but you solve that by a bit of over provisioning. Very very few companies have work loads that are so bursty and irregular that they need full limitless up and down scaling. That’s one of those niche problems everyone thinks they have.

The uptime promise is false in my experience. Cloud goes down for cluster upgrades and any myriad other reasons just as often as self managed stuff. I’ve seen serious unplanned outages with cloud too. I don’t have hard numbers but I would definitely wager that if cloud is better for uptime at all it’s not enough of an improvement to justify that gigantic markup.

For what cloud charges I should, as the deploying user, receive five nines without having to think about it ever. It does not deliver that, and it makes me think about it a lot with all the complexity.

The only technical promise it makes good on, and it does do this well, is not losing data. They’ve clearly put more thought into that than any other aspect of the internal architecture. But there’s other ways to not lose data that don’t require you to pay a 10X markup on compute and a 10000X markup on transfer.

I think the real selling point of cloud is blame.

When cloud goes down, it’s not your fault. You can blame the cloud provider.

IT people like it, and it’s usually not their money anyway. Companies like it. They’re paying through the nose for the ability to tell the customer that the outage is Amazon’s fault.

Cloud took over during the ZIRP era anyway when money was infinite. If you have growth raise more. COGS doesn’t matter.

Maybe cloud is ZIRPslop.

cestith 1 hour ago||
Not all IaC is Kubernetes.
fridder 8 minutes ago|||
Honestly I think the best path is hybrid with the cloud as DR and sudden load scaling.
butILoveLife 49 minutes ago||
You could get a laptop with an Nvidia GPU, 16GB RAM, 512GB SSD... or a 'cheap' Macbook.

I totally understand if you need to compile for iphones. We need to make apps for the lower and middle class people that think a $40/mo cellphone is a status symbol. I get it.

But if you are not... why? I hate windows, but we have Fedora... and you get an Nvidia. Is it just a status symbol? And I have a hard time believing people who tell me stories about low power consumption, because no one had cared about that until Apple pretended people cared about it.

cosmic_cheese 42 minutes ago||
> And I have a hard time believing people who tell me stories about low power consumption, because no one had cared about that until Apple pretended people cared about it.

That’s because battery life was pretty mediocre across the board, with Apple occasionally squeaking out a bit of an upper hand on the Air. Most laptops were in the same boat, aside from gaming and workstation laptops but battery life has never been the point of those.

That changed dramatically with the M-series Macs. People didn’t start caring because Apple did, but because it meant no longer being tethered to a wall, being able to do a lot of outings without a brick or charger cable at all, and on extended trips being able to get by with a little phone charger instead of the usual huge ungainly brick.

One of the primary objectives of a laptop is portability, and long battery life is an objective upgrade in that category. Not everybody needs it but for those who do it’s difficult to give up once you’ve had it.

EDIT: Another advantage of that higher efficiency is that MacBooks can run at full performance without being plugged in without it obliterating battery life. x86 laptops universally throttle when untethered and while this can be disabled, they burn through their batteries much more quickly.

hermanzegerman 39 minutes ago||
To be honest I never had issues with battery life on my ThinkPad. I've gotten 10-12 hours out of the 96Wh battery and never felt the need for much more.

I think most people who are so wowed by Macs bought just a garbage Windows machine (e.g. almost everything from Asus and Acer) before and then splurged the money on a nice one, so obviously it's so much better in comparison.

cosmic_cheese 35 minutes ago|||
Is that under Windows or Linux? It’s not awful, but depending on usage a MacBook can do 16h+ with the same size battery, especially if put into low power mode, which is substantially better.

The smaller 13” Air also gets similar numbers despite its smaller battery, which is a big deal for people who don’t want to lug around a 15/16” laptop.

moduspol 29 minutes ago|||
How heavy is that ThinkPad?
dwedge 27 minutes ago|||
> And I have a hard time believing people who tell me stories about low power consumption, because no one had cared about that until Apple pretended people cared about it.

I haven't packed a charger for the day for 3 years. I can work in coffee shops or on the couch for over 6 hours without even thinking about charging. I'm sorry but if you haven't tried the M* macbooks you don't know what you're criticising.

kingnothing 36 minutes ago|||
Maybe it's an excellent experience these days, but every time I've tried Linux on desktop over the past 25 years I get burned. Maybe it works for a while, then your NIC driver gets borked and you spend 2 days trying to get it working again. Or some update goes sideways and you lose the GUI, launching only into a terminal. It's always something. And laptops have even less common hardware than desktops.

On the other hand, every Mac I've used over the past 15 years has been bulletproof. It turns on, it works, it runs *nix. It's an invisible interface to getting work done.

butILoveLife 28 minutes ago|||
I'd be willing to gamble that you used Debian-family. Debian is outdated linux. It is literally designed to be 2 years outdated upon release.

Use Fedora. Its up to date.

Note that Fedora is NOT Arch.

franktankbank 30 minutes ago|||
FWIW I have had no issues on a ThinkPad for the past 10 years running standard Linux distros. I think it may come down to the combo of OS and particular motherboard/laptop, for which it's pretty easy to find recommendations.
nicbou 44 minutes ago|||
You get a long-lasting device that's usually pleasant to use. User experience is harder to measure than specs, but at least for me, Macbooks are consistently better laptops than everything else I've used.
eru 45 minutes ago|||
People care about how long you can run in between charging. Low power consumption helps with that, even if you don't care about it directly.
f6v 40 minutes ago|||
Yeah, but then the MacBook is going to run smoother and faster than the Windows one (and I don’t want to spend even one extra minute dealing with drivers on Linux). There are objective benchmarks for that.

And MacBooks also have a better display and build quality. Like, the touchpad is still hit or miss on any non-Apple device.

UqWBcuFx6NV4r 32 minutes ago|||
Have you just woken up from a coma or have you literally just refused to evolve your “haha iPhones suck!” talking points from 2009? Move on. Nobody is going to engage with this.

I’m not surprised that the people in your social circles don’t care about being able to use a portable computer outside of their bedrooms. I assure you that in the real world battery life is front of mind for a great many people.

I’d also suggest not leaning so heavily on “but you get Nvidia!” when the crux of your gripe seems to be about brand hype. You are chasing a brand. You clearly do not know the first thing about the GPU/AI capabilities of the computers that you seem to hate so much.

plagiarist 19 minutes ago|||
Which laptop is this, with good Linux support?
hermanzegerman 46 minutes ago||
Where do I get a laptop with an Nvidia GPU for $600?

I'm right now in the market for a new laptop, because I need way more GPU power than my T470 provides, and to be honest the MacBook Pros are quite competitively priced compared to the P-series ThinkPads with Nvidia cards (both around 3000€). They also finally offer a matte screen option.

The only thing holding me back right now is the soldered SSD, RAM (and shitty Linux support).

It was quite nice being able to upgrade RAM, SSD and replace the Battery on it. Otherwise it wouldn't have lasted for 9 years

__mharrison__ 1 hour ago||
When I teach, I use "big data" for data that won't fit in a single machine. "Small data" fits on a single machine in memory and medium data on disk.

Having said that duckDB is awesome. I recently ported a 20 year old Python app to modern Python. I made the backend swappable, polars or duckdb. Got a 40-80x speed improvement. Took 2 days.

ladberg 1 hour ago|
I'm curious - what were you doing that polars was leaving a 40-80x speedup on the table? I've been happy with its speed when held correctly, but it's certainly easy to hold it incorrectly and kill your perf if you're not careful
devnotes77 57 minutes ago|||
Polars is fastest when you avoid eager eval mid-pipeline. If you see a 40x gap it's often from calling .collect() inside a loop or applying Python UDFs row-wise.
clamlady 1 hour ago||
as a broke ecologist, this little computer can do everything I need in R and Word and is a phenomenal build for the price. I'm really enjoying it thus far.
pbronez 1 hour ago|
How did you get one already? I thought they were just up for pre-order
cluckindan 1 hour ago||||
Shipping started yesterday, meaning preorders would already have arrived then
clamlady 27 minutes ago|||
yea, preordered.
1a527dd5 1 hour ago||
I adore DuckDB.

Did a PoC on an AWS Lambda for data that was GZ'ed in an S3 bucket.

It was able to replace about 400 C# LoC with about 10 lines.

Amazing little bit of kit.

refactor_master 1 hour ago||
I think it’s relevant to first read [1] to see why they’re doing this. It’s basically done as a meme.

[1] https://motherduck.com/blog/big-data-is-dead/

mazzma 1 hour ago|
> An alternate definition of Big Data is “when the cost of keeping data around is less than the cost of figuring out what to throw away.”

That couldn't be more accurate

ody4242 2 hours ago|
I would have benchmarked with an instance that has local nvme, like c8gd.4xlarge.
devnotes77 1 hour ago||
Worth noting the c8gd local NVMe is ephemeral so you'd need to pre-stage the data each run, but for a benchmark like this that's actually ideal since you avoid EBS cold-read artifacts entirely.
namibj 2 hours ago||
Do they make any promises yet about the persistence of local NVMe after something like a full-region power outage? If a single-region cluster can't do a durable commit (becoming only temporarily unavailable, without losing committed data, if something like that happened), it's not quite there, unless you also stream a WAL to storage that they do promise will survive a full blackout of all zones that store (part of) the data.
LunaSea 1 hour ago||
You already lose your data after instance restart so I think that full region outage is already out of question.
ody4242 25 minutes ago||
Idk how an AWS region would respond to a power outage, but I have tested this in AWS Outposts, and there, if you power down a rack, then power it back on again, the bare-metal instances will not be recreated. (I was surprised, as I was expecting the EC2 health check to terminate them, but it does not work like that.) My understanding is that if you stop/start an instance, your local storage is gone (as the instance might even end up on a different host), but if you just reboot the instance, it should keep the local storage.