Posted by pvtmert 22 hours ago
AFP and Time Capsules add attack vectors to the OS, which can be targeted even when few users are actively using them. One dev could keep both basically functional, but to what end? User counts are already small, and people who aren't using them are still exposed by their mere existence.
Shrinking or removing code, in my experience, is one of the biggest single wins you can have in software development. Less to test, less to update, less to secure.
Developer time is a finite resource; the argument here is to allocate it to hard-to-secure, outmoded, replaced technology instead of anything future-relevant. It doesn't make sense.
I have no doubt the bean counters have drawn up every kind of spreadsheet they can imagine trying to quantify it as not worth it. But I don't think these kinds of quality-of-life things can be easily quantified, because each small thing maintained might only impact a small number of users. Collectively, though, all of these small things add up to either a system with sharp corners that constantly papercuts the user (current Apple software), or one so seamless that it engenders customer loyalty for decades (old Apple software). This kind of shortsighted penny-pinching is how companies become a shell of their former selves, suffering a slow death-by-MBA.
This hypothetical employee would:
- update the Time Capsule firmware from AFP to a brand-new SMBv3 implementation, including both porting it and making it "fit" within the constraints of 2013 hardware.
- design and implement a migration system for both the Time Capsule and the Mac to move to the new implementation.
- handle all security analysis, QA, and documentation for the firmware and the migration system.
They also need to get it done by the first macOS version that has AFP removed, which will land in developer preview in six weeks and needs to be feature-complete in about 17 weeks.
If Apple hires a new developer capable of doing that, I don't want them to relegate them to supporting 13 year old hardware. I want them improving things that the majority of users actually need.
And that is the core problem with this sort of argument. Even with infinite money or the infinite possibilities of open source contributions, the availability of talent is still _always_ finite.
If Apple is known for anything, it's that they keep moving ahead with the operating system, even if it means leaving some users behind… and that goes back to the late '80s/early '90s, when apps had to be "32-bit clean" [1] to run on System 7 and newer Motorola 68000-family processors like the 68020, 68030, etc.
Some beloved apps didn't make the transition, and that has happened with every technology transition: 68000 to PowerPC, then to Intel, and then to ARM. And of course, from Classic Mac OS to Mac OS X, then OS X, then macOS.
I've been active in user groups since the Apple II days; there's a cohort who mostly won't upgrade their hardware but complain bitterly that they lack certain features. Or they attempt these fragile and unreliable hacks to keep their old hardware and software running.
Usually, they're doing themselves more harm than good, especially if they're not technical.
Also, it's pretty unlikely recent college graduates would be able to tackle old C++ or Objective-C code written, in some cases, before they were born just to keep something like AFP alive. Regardless of Apple's financial success, it's not a good use of resources to keep alive a bespoke network protocol that originated in 1985 when less than 1% of the installed base is actively using it.
[1]: https://en.wikipedia.org/wiki/Classic_Mac_OS_memory_manageme...
cf Linux removing old network drivers this week for the same reason (without the hand-wringing that this Apple announcement is getting!)
Apple's source is not public, but the protocol is still fully documented if someone wanted to create a new client and server. https://developer.apple.com/library/archive/documentation/Ne...
However, they'd be better off just creating a driver and server around the open source Netatalk implementation.
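For the curious, serving a Time Machine-capable AFP volume with Netatalk 3.x comes down to a few lines of afp.conf; the share name, path, and size cap below are illustrative, not prescriptive:

```ini
; Minimal Netatalk 3.x afp.conf sketch -- share name and path are illustrative
[Global]
log level = default:warn

[Time Machine Backups]
path = /srv/timemachine
time machine = yes
; optional: cap the volume size Time Machine sees (value is in MiB)
vol size limit = 500000
```

Of course, macOS versions that drop the AFP client won't be able to mount it at all, which is the crux of the article.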
I mean, it's basically just a Time Machine backup plus a few older files that I don't want to keep on my main Mac.
It seems like any NAS would take up way more space than I'd like. I suppose one alternative would be getting some kind of Beelink mini PC and setting up a proper home server, moving some of my side projects onto it, and running Plex from it. The problem is that at current RAM prices, it's a surprisingly expensive solution.
Look, my setup works for me. Just add an option to re-enable AFP [2].
You could shuck the disk and use it directly, though. Then it's just a disk, not a time capsule.
TFA:
> Apple made SMB its primary file-sharing protocol in OS X 10.9 Mavericks, over 12 years ago, and has repeatedly told us that support for its predecessor AFP will be removed in the future.
- BigCo is already a zero-sum deal; they use Xcode Cloud as a service, which runs on their servers anyway... (Google, Amazon, Azure, etc.)
- It was not a long-standing product. Introduced somewhere around 2016-ish, if I remember correctly, and it only lasted a few major releases. Easier to kill than an established one (i.e., Time Machine).
They aren’t deprecating Time Machine. The old protocol is being removed.
The old protocol hasn't worked well for a long time, at least in my experience.
Also it's honestly really weird that they don't have iCloud backups for Macs yet. It seems like a no-brainer feature. I know I would easily switch to Apple over Backblaze as Backblaze's client is just terrible.
I've been working on improving an open source menu bar app that wraps restic. Right now it is a bit rough around the edges, but my plan is to have a simple onboarding experience for various backend services like B2.
Over the weekend, I added a "Smart backups" feature that excludes all the same directories that the Backblaze menu bar app and Time Machine exclude. This was the primary missing feature for me. It even generates and backs up your Brewfile...
And then I went to Acronis True Image backing up to my Synology NAS, but that became unreliable too - oftentimes when I'd go to do a restore, the client would crash trying to read the catalog.
So, like you... CCC nightly to my Synology, with a Snapshot rotation on it - snapshot the previous night's backup at 8pm, and then kick off that night's backup at 11pm.
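That rotation is simple enough to sketch in shell. This is a hypothetical illustration, not Synology-specific: the paths are placeholders, and on btrfs you'd take a real subvolume snapshot instead of a hard-linked copy with `cp -al`:

```shell
#!/bin/sh
set -eu
# Hypothetical nightly snapshot rotation. SRC would be the share CCC backs
# up to; SNAPDIR holds date-stamped snapshots; KEEP is the retention count.
SRC=${SRC:-/tmp/snap-demo/backups}
SNAPDIR=${SNAPDIR:-/tmp/snap-demo/snapshots}
KEEP=${KEEP:-14}

mkdir -p "$SRC" "$SNAPDIR"

# 8pm step: snapshot the previous night's backup as a hard-linked copy
# (cheap in space; on btrfs use `btrfs subvolume snapshot` instead).
STAMP=$(date +%Y-%m-%d_%H%M%S)
cp -al "$SRC" "$SNAPDIR/$STAMP"

# Prune: sort snapshot names newest-first and delete everything past $KEEP.
ls -1 "$SNAPDIR" | sort -r | tail -n +$((KEEP + 1)) | while read -r old; do
  rm -rf "$SNAPDIR/${old:?}"
done
```

Scheduled at 8pm from cron (or Synology's Task Scheduler), with the CCC backup itself kicking off at 11pm, this keeps a two-week window of nightly states.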
I've loopback mounted disk images over network filesystems for many years without any recurring issues outside of macOS. It's not rocket science, particularly if you have a reliable network connection.
I'm aware there's a long tail of possible issues that can come up, but most of the complaints I've seen amount to "I have a reliable connection and Time Machine is still a tire fire", which suggests that the problem exists outside of that particular set of edge cases.
(It genuinely seems that nobody at Apple really cares about network filesystems at this point. People in this thread talking up AFP make me want to look at migrating _to_ it for my Mac's backups, because SMB on macOS randomly drops or hangs for no reason, and Time Machine has at least twice just declared the backup completely unreadable, leaving me to restore the backup filesystem from backups.
And attempting to use NFS on macOS somehow makes everything three times as buggy. It's as if they special-cased SMB shares to be skipped in some random "touch everything synchronously" calls throughout the OS but didn't do the same for NFS, so Finder will take seconds or minutes to do things that shouldn't involve that share at all, yet as soon as you remove the share, the delays stop.)
The "new computer" out-of-box account creation and first sign-in experience on both Windows 11 and macOS is clearly designed to drive end users toward perpetual monthly subscriptions (Microsoft 365 Personal, OneDrive, iCloud storage, etc.).
Imagine how difficult it would be for an ordinary non-technical person (absolutely not a stereotypical HN reader) to ever stop paying for iCloud once they have 600GB+ of family photos, videos, and other data backed up to it.
To be fair, non-technical folks get a lot of value from this scheme too. I can't imagine many of my relatives successfully juggling backups and external media in a way that would actually keep their content safe in case their phone is lost/stolen/destroyed.
Right now the monthly fees for this stuff are rather modest, but I could see a future where the dominant players lock out competitors and use their market position to raise prices significantly.
I bought a UNAS-2 (and a couple of 12 TB IronWolf Pro drives) a few months ago when the "time capsule will not be supported in a future version of macOS" warning first appeared. It has been outstanding alongside the rest of my UniFi setup, and perfectly supports Time Machine backups. The UniFi Identity macOS app means my family's computers always stay authenticated/connected and my wife & kids don't have to do anything to make Time Machine just work.
If you're a power user who loves the Apple aesthetic and you already have a UniFi setup at home, you'll feel right at home switching from Time Capsule to a UNAS.
Also, why the 12TB IronWolf drives specifically? Personally, I've always been a fan of buying true enterprise drives (the ones designed for "online" or nearline storage), but sometimes specific models and sizes of random drives do very well in Backblaze's testing.
As for IronWolf Pro drives, I chose them because they seem to have similar longevity to enterprise drives with less noise (my equipment is in a closet under the stairs).