Posted by bcye 7 hours ago
The “isn’t just” part is a dead giveaway almost always.
It would be a surprise if more than 0.1% of MacBook Neo users have even heard of DuckDB.
Which means that this article is probably just riding the hype.
People buy a MacBook Neo because they "just need a laptop" or are budget-conscious.
I imagine a student would get their feet wet with Postgres before looking at DuckDB or similar.
It would be a surprise if they ran heavy workloads with DuckDB. In that case it's definitely worth investing in a more powerful computer.
Also, there are countless reports of M1 8GB MacBook Airs bricked because the SSD used up its write cycles
https://www.macrumors.com/2021/02/23/m1-mac-users-report-exc...
If Apple built their laptops to be serviceable like ThinkPads, I would buy one today.
They’ve slowly been moving towards making it easier to repair individual broken parts. I’m very happy to see that a new keyboard doesn’t require replacing the entire top case. That was just crazy.
I just thought it was neat. It’s a phone chip; we’ve never been able to do stuff like this on an Apple phone chip before. No one was porting this to the iPhone to run there.
In my mind this is purely a curiosity article, and I like that.
There is always a trade-off of cost/convenience/power, and some folks are going to end up at the Neo end of the spectrum.
I guess they’re using a different definition?
very much so…
You have phones that are faster than cloud VMs of the past. You can use bare metal servers with up to 344 cores and 16TB of RAM.
I used to share your definition too, but I now say that if it doesn’t open in Microsoft Excel, it’s big data.
As you say, single machines can scale up incredibly far. That just means 16 TB datasets no longer demand big data solutions.
Many people like to think they have big data, and you kinda have to agree with them if you want their money. At least in consulting.
Also you could go well beyond a 16TB dataset on a single machine. You assume that the whole uncompressed dataset has to fit in memory, but many workloads don’t need that.
How many people in the world have such big datasets to analyse within a reasonable time?
Some people say extreme data.
Google has big data. You are not google.
Or one could define it as too big to fit on a single SSD/HDD, maybe >30TB. Still within the reach of a hobbyist, but too large to process in memory and requiring special tools to work with. It doesn't have to be petabyte scale to need 'big data' tooling.
No.
>Do I reject a world where all of the above is necessary to realize value from an entry-level MacBook?
In theory, yes.