Posted by pvtmert 22 hours ago

Networking changes coming in macOS 27 (eclecticlight.co)
231 points | 211 comments | page 2
akabul0us 14 hours ago|
When I saw the headline, I briefly allowed myself to hope that DNS settings would no longer be set universally (requiring manual intervention when switching networks if not using DHCP), but of course it's nothing useful, only "Apple is breaking stuff because they can".
celeryd 6 hours ago||
Finally, TLS 1.2 is baseline, after having been released 18 years ago.
amelius 20 hours ago||
Can't they hire an extra dev per abandoned project to not abandon it?
Someone1234 20 hours ago|
You greatly underestimate how much work it is to maintain old code, particularly to maintain it securely.

AFP and Time Capsules add attack vectors to the OS, which can be targeted even when few users are actively using them. One dev could keep both basically functional, but to what end? User counts are already small, and people who aren't using them are still exposed by their mere existence.

Shrinking or removing code, in my experience, is one of the biggest single wins you can have in software development. Less to test, less to update, less to secure.

applfanboysbgon 19 hours ago|||
Yes, writing and maintaining less code is great for a developer. We can follow this to the logical extreme and marvel at how easy it is to write and maintain a program whose only function is to print "hello, world" to the console. Never mind the users; what do they matter?
Someone1234 17 hours ago||
By the very nature of assigning development time to these antiquated features, you're assigning them away from other features, bug fixes, or requests that may have a larger user reach.

Development time is a finite resource; the argument here is to allocate it to hard-to-secure, outmoded, already-replaced technology instead of anything future-relevant. It doesn't make sense.

applfanboysbgon 16 hours ago||
The person was specifically suggesting hiring extra developers for maintenance. While I'm familiar with the concept that "nine women can't birth a baby in a month", I don't think that applies so much to maintenance of old code paths. Apple makes over $100B in net profit per year, a truly unfathomable amount of money; they can afford it, and I think it would even benefit them. Even if only 1% of your users use X, for Apple that might translate to perhaps 10 million people using X, or 1 million at 0.1%. Hiring a dev to improve the experience for that many people just makes sense at scale; software is write-once, reproduce-a-million-times-for-free.

I have no doubt the bean counters have drawn up every kind of spreadsheet they can imagine trying to quantify it as not worth it, but I don't think these kinds of quality-of-life things can be easily quantified. Each small thing maintained might only impact a small number of users, but collectively all of these small things add up to either a system with sharp corners that constantly papercuts the user (current Apple software) or one so seamless that it engenders customer loyalty for decades (old Apple software). This kind of shortsighted penny-pinching is how companies become a shell of their former selves, suffering a slow death-by-MBA.

dwaite 4 hours ago|||
My estimate is that your lower count of people who could still be using Time Capsule is off by a factor of 20, but we'll continue with the idea that Apple could justify hiring a single engineer to be assigned full-time on the TimeCapsule, starting today.

This hypothetical employee would:

- update the TimeCapsule firmware from using AFP to using a brand new SMBv3 implementation, including both porting and making it "fit" within the constraints of 2013 hardware.

- be designing and implementing a migration system for both the TimeCapsule and the Mac to move to using the new implementation

- be responsible for all security analysis, QA, and documentation for the firmware and migration system

They also need to get it done by the first macOS version that has AFP removed, which will land in developer preview in six weeks and needs to be feature complete in about 17 weeks.

If Apple hires a new developer capable of doing that, I don't want them relegated to supporting 13-year-old hardware. I want them improving things that the majority of users actually need.

And that is the core problem with this sort of argument. Even with infinite money or the infinite possibilities of open source contributions, the availability of talent is still _always_ finite.

alwillis 15 hours ago|||
> Even if only 1% of your users use X, for Apple that might translate to perhaps 10 million people using X, or at 0.1% 1 million. Hiring a dev to improve the experience for that many people just makes sense at scale; software is write-once, reproduce-a-million-times-for-free.

If Apple is known for anything, it's that they keep moving ahead with the operating system, even if it means leaving some users behind… and that goes back to the late '80s/early '90s, when apps had to be "32-bit clean" [1] to run on System 7 and newer Motorola 68000-family processors like the 68020, 68030, etc.

Some beloved apps didn't make the transition, and that happens with every technology transition: 68000 to PowerPC, then to Intel, and then to ARM. And of course, from Classic Mac OS to Mac OS X, and Mac OS X to macOS.

I've been active in user groups since the Apple II days; there's a cohort who mostly won't upgrade their hardware but complain bitterly that they lack certain features. Or they attempt these fragile and unreliable hacks to keep their old hardware and software running.

Usually, they're doing themselves more harm than good, especially if they're not technical.

Also, it's pretty unlikely recent college graduates would be able to tackle old C++ or Objective-C code written, in some cases, before they were born to keep something like AFP alive. Regardless of Apple's financial success, it's not a good use of resources to keep alive a bespoke network protocol that originated in 1985 and that less than 1% of the installed base is actively using.

[1]: https://en.wikipedia.org/wiki/Classic_Mac_OS_memory_manageme...

zimpenfish 19 hours ago|||
> You greatly underestimate how much work it is to maintain old code, particularly to maintain it securely.

cf. Linux removing old network drivers this week for the same reason (without the hand-wringing this Apple announcement is getting!)

saghm 19 hours ago||
Is the code that Apple is removing support for open source? The Linux drivers could at least plausibly be picked up and used by someone who really wants to, so it doesn't seem to be a fair comparison
dwaite 4 hours ago||
The AFP protocol was deprecated in 2013. The AFP server was removed in Big Sur, so over five years ago. This is removal of AFP client support.

Apple's source is not public, but the protocol is still fully documented if someone wanted to create a new client and server. https://developer.apple.com/library/archive/documentation/Ne...

However, they'd be better off just creating a driver and server around the open source Netatalk implementation.
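For the curious, a minimal Netatalk share is only a few lines of `afp.conf`; the paths and share names below are illustrative, not from any particular setup:

```ini
; afp.conf — minimal Netatalk configuration (illustrative paths/names)
[Global]
; defaults are fine for a small home LAN

[Shared Files]
path = /srv/afp/shared

[Time Machine]
path = /srv/afp/timemachine
; advertise this share as a Time Machine destination
time machine = yes
```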

sgt 6 hours ago||
Completely unrelated but I love the layout / blog format of eclecticlight.co
ymolodtsov 15 hours ago||
I'm still using my Time Capsule. I don't really trust the hard drive inside it, so I basically use it to connect to an SSD that I attached to it. Unfortunately Nest Wifi, which I use as a router, doesn't have any USB ports, unlike some cheaper routers. I know it will be gone after Tahoe, and I'm still not sure what I'm going to do about it.

It's basically just a Time Machine backup plus a few older files that I don't want to keep on my main Mac.

It seems like any NAS would take up more space than I'd like. I suppose one alternative would be getting some kind of Beelink mini PC and setting up a proper home server, moving some of my side projects onto it and running Plex from it. The problem is that at current RAM prices it's a surprisingly expensive solution.

ecliptik 16 hours ago||
This update broke my workflow! I use Netatalk [1] with AFP to share files between my Macintosh 512ke and MacBook via AppleTalk.

Look, my setup works for me. Just add an option to re-enable AFP [2].

1. https://github.com/Netatalk/netatalk

2. https://xkcd.com/1172/

mushufasa 21 hours ago||
Wouldn't Time Capsules still work over wired connections, just like any other hard drive, even if AFP support is dropped?
ibejoeb 20 hours ago|
No, AFP is application layer. It doesn't matter how the device is connected at layers 1 or 2.

You could shuck the disk and use it directly, though. Then it's just a disk, not a time capsule.

apparatur 21 hours ago||
Next: macOS iCloud backups and the eventual deprecation of local Time Machine backups altogether. More services revenue!
GeekyBear 21 hours ago||
Changing out the network protocol used for local network backups isn't the same thing as getting rid of local network backups.

TFA:

> Apple made SMB its primary file-sharing protocol in OS X 10.9 Mavericks, over 12 years ago, and has repeatedly told us that support for its predecessor AFP will be removed in the future.

apparatur 21 hours ago||
Hence "next". And by local I meant directly connected drives.
dwaite 4 hours ago||
If the pattern continues, they'll announce deprecation this fall and remove the feature in 2039.
angott 21 hours ago|||
I don’t think they’re going to drop support for local backups any time soon. There are lots of enterprise customers relying on Time Machine who will never switch to iCloud. TM can also be configured via MDM settings and is a really common solution for Mac IT administrators, so it would take ages to deprecate it.
apparatur 21 hours ago||
"There are a lot of enterprise customers using Xcode server". And poof, it's gone and there's now only the Xcode cloud service. It would not take ages. It would take a single release which no longer supports it. Complaints? Keep using the old one or subscribe.
plorkyeran 20 hours ago|||
I am fairly confident in saying that approximately zero enterprise customers used Xcode server. It was extremely limited and targeted at small shops which didn't see the need for a proper CI setup but had an extra machine sitting around to run builds on.
pvtmert 19 hours ago|||
I think they switched to cloud because:

- For BigCo it's already a zero-sum deal; they use Xcode Cloud as a service, which runs back on their cloud providers' servers anyway (Google, Amazon, Azure, etc.)

- It was not a long-standing product. It was introduced around 2016, if I remember correctly, and only lasted a few major releases. Easier to kill than an established one (e.g. Time Machine)

Aurornis 21 hours ago|||
They switched the default protocol from AFP to SMB a long time ago.

They aren’t deprecating Time Machine. The old protocol is being removed.

The old protocol hasn't worked well for a long time, at least in my experience.

bananamogul 20 hours ago|||
People have been asking for iCloud macOS backups since iCloud was introduced. It would be very popular. I'm not sure why Apple doesn't offer this, because it's easy revenue.
post-it 20 hours ago||
Because people will fill their iClouds. An important value proposition of iCloud is that customers pay for more space than they need. Time Machine grows to fill all available space.
pmontra 17 hours ago||
They could sell a separate service for Time Machine backups. I'm not an Apple customer, so I don't know if it makes sense, but they could charge customers X per day for the last N days of backups, plus Y per snapshot for M older snapshots.
post-it 15 hours ago||
I wouldn't pay for it, so that's one data point.
fragmede 11 hours ago||
I would, so that's a second data point.
bayindirh 21 hours ago|||
As long as you can migrate/recover your Mac from your TM backup, I guess that this scenario won't happen.
kalleboo 9 hours ago|||
I would have agreed if they hadn't put in the engineering effort to upgrade the backup disk image to APFS instead of HFS+. They wouldn't have done that if the plan was to deprecate it soon. (IIRC the next version of macOS is also dropping HFS+ support)

Also it's honestly really weird that they don't have iCloud backups for Macs yet. It seems like a no-brainer feature. I know I would easily switch to Apple over Backblaze as Backblaze's client is just terrible.

latchkey 19 hours ago|||
I like having control over my backups.

I've been working on improving an open source menu bar app that wraps restic. Right now it's a bit rough around the edges, but my plan is to have a simple onboarding experience for various backend services like B2.

Over the weekend, I added a "Smart backups" feature that uses all the same directory exclusions as the Backblaze menu bar app and Time Machine. This was the primary missing feature for me. It even generates and backs up your Brewfile...

https://github.com/lookfirst/ResticScheduler
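To give a flavor of what Time-Machine-style exclusions look like in restic terms, here's a dry-run sketch; the repository name and exclude list are illustrative, not what ResticScheduler actually uses, and the script prints the command rather than running it:

```shell
#!/bin/sh
# Dry-run sketch: build a restic backup command using a few of the
# directories that Time Machine and the Backblaze client exclude by
# default. Repository and paths are illustrative; print, don't execute.
REPO="b2:example-bucket:/macbook"   # hypothetical Backblaze B2 repository

CMD="restic -r $REPO backup $HOME"
for dir in "$HOME/Library/Caches" "$HOME/.Trash" "/private/var/vm"; do
  CMD="$CMD --exclude $dir"
done

echo "$CMD"
```

Running the printed command for real requires restic installed and the B2 credentials exported in the environment.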

AlexandrB 21 hours ago|||
The story of Time Machine is a tragedy: a revolutionary feature that made backups accessible to normal people, allowed to lie fallow for a decade or more until it's as annoying and unreliable as anything else. I now use Carbon Copy Cloner to avoid the TM headaches.
rudcodex 16 hours ago|||
Good nudge to look into using CCC. Which folders do you back up? It seems slower than TM, so I'm thinking of backing up my home folder only.
FireBeyond 21 hours ago|||
I never found it to be overly reliable. It was reliable... for a while. Then it would silently fail/stop working, or just tell you that it had stopped working and that whatever you had in it was no longer accessible.

And then I went to Acronis True Image backing up to my Synology NAS, but that became unreliable too - oftentimes when I'd go to do a restore, the client would crash trying to read the catalog.

So, like you... CCC nightly to my Synology, with a Snapshot rotation on it - snapshot the previous night's backup at 8pm, and then kick off that night's backup at 11pm.

tonyedgecombe 18 hours ago|||
It was unreliable over SMB. Not surprising when you look at what it was doing: it would create a virtual drive on the share, mount it, and back up to that. There was too much going on for it to be reliable.
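Concretely, the dance looks roughly like this. A dry-run sketch: the share, size, and volume names are assumptions, exact options vary by macOS version, and the script only prints the commands rather than executing them:

```shell
#!/bin/sh
# Dry-run sketch of Time Machine's network-destination mechanics:
# mount the SMB share, create a sparse bundle disk image on it, then
# attach the image and back up into that virtual volume.
# Names and sizes are illustrative; print the commands, don't run them.
SHARE="//user@nas.local/backups"    # hypothetical SMB share
MNT="/Volumes/backups"
BUNDLE="$MNT/MyMac.sparsebundle"

echo "mount_smbfs $SHARE $MNT"
echo "hdiutil create -size 1t -type SPARSEBUNDLE -fs APFS -volname Backups $BUNDLE"
echo "hdiutil attach $BUNDLE"
```

Every one of those steps has to survive a flaky network link, which is where the unreliability creeps in.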
rincebrain 3 hours ago||
Not really.

I've loopback mounted disk images over network filesystems for many years without any recurring issues outside of macOS. It's not rocket science, particularly if you have a reliable network connection.

I'm aware there's a long tail of possible issues that can come up, but most of the complaints I've seen amount to "I have a reliable connection and Time Machine is still a tire fire", which suggests that the problem exists outside of that particular set of edge cases.

(It seems to genuinely be that nobody at Apple really cares about network filesystems at this point - people in this thread talking up AFP makes me want to look at migrating _to_ using it for my mac's backups, because SMB on macOS randomly drops or hangs for no reason and Time Machine at least twice has just started stating the backup was completely unreadable, leading to me having to restore the backup filesystem from backups.

And attempting to use NFS on macOS somehow makes everything three times as buggy, like they special cased SMB shares to not be touched in some random "touch everything synchronously" calls throughout the OS but didn't do it with NFS shares, so Finder will now take seconds or minutes to do things that shouldn't involve that share, but as soon as you remove it, it stops doing so.)

apparatur 21 hours ago||||
For me it was a key DB file inside the Photo library which Time Machine omitted from all backups and prevented me from restoring the library. Not fun.
AlexandrB 20 hours ago|||
Yeah, you may be right. I have fond memories of it from around 2008, but those might be from the initial experience and not all the "you need to recreate your backup from scratch" errors that would crop up after a while.
semiquaver 21 hours ago|||
This is reflexive and ill-considered FUD. Be better.
gjvc 20 hours ago||
also known as "prescient"
walrus01 21 hours ago|||
> Next: macOS iCloud backups and the eventual deprecation of local Time Machine backups altogether. More services revenue!

The "new computer" out of box account creation and first sign in experience on both Windows 11 and MacOS are clearly designed to drive end users towards perpetual for life monthly recurring subscriptions for (Microsoft 365 Personal, OneDrive, iCloud storage, etc).

Imagine the difficulty for the ordinary non technical person (absolutely not a stereotypical HN reader) ever being able to stop paying for iCloud when they have 600GB+ of their family photos and videos and stuff backed up to it.

AlexandrB 20 hours ago||
> Imagine the difficulty for the ordinary non technical person (absolutely not a stereotypical HN reader) ever being able to stop paying for iCloud when they have 600GB+ of their family photos and videos and stuff backed up to it.

To be fair, non technical folks get a lot of value from this scheme too. I can't imagine many of my relatives successfully juggling backups and external media in a way that would actually keep their content safe in case their phone is lost/stolen/destroyed.

Right now the monthly fees for this stuff are rather modest, but I could see a future where the dominant players lock out competitors and use their market position to raise prices significantly.

TimTheTinker 21 hours ago|
Ubiquiti is really taking up the slack in some areas Apple has abandoned.

I bought a UNAS-2 (and a couple of 12 TB IronWolf Pro drives) a few months ago when the "time capsule will not be supported in a future version of macOS" warning first appeared. It has been outstanding alongside the rest of my UniFi setup, and perfectly supports Time Machine backups. The UniFi Identity macOS app means my family's computers always stay authenticated/connected and my wife & kids don't have to do anything to make Time Machine just work.

If you're a power user who loves the Apple aesthetic and you already have a UniFi setup at home, you'll feel right at home switching from Time Capsule to a UNAS.

jshier 19 hours ago||
What format is the destination drive? My ideal is APFS clone backups to a remote drive, but I don't know if there are any network setups that support that, even though you can do it to a local drive.
nijave 17 hours ago||
I was under the impression that's how SMB Time Machine backups work currently
Melatonic 20 hours ago||
Have you tried using it to back up files from Linux and Windows machines too? I was hoping for a good mixed backup solution, and I'm betting Ubiquiti would deliver here.

Also, why the 12TB IronWolf drives specifically? Personally I was always a fan of buying true enterprise drives (the ones designed for "online" or nearline storage), but sometimes specific models and sizes of random drives do very well in Backblaze's drive stats.

TimTheTinker 19 hours ago||
I don't have any Linux/Windows machines, but I've seen nothing that would dissuade me from using it when I eventually migrate my current laptop to Asahi Linux.

As for IronWolf Pro drives, I chose them because they seem to have similar longevity to enterprise drives with less noise (my equipment is in a closet under the stairs).