Posted by summarity 5 days ago

Apple will phase out Rosetta 2 in macOS 28 (developer.apple.com)
165 points | 162 comments
0x0 3 hours ago|
I'm guessing they don't want to maintain, build, and test x86_64 versions of all the macOS libraries like AppKit and UIKit (including large changes like Liquid Glass) when they no longer ship macOS for x86_64 Macs. Which is not entirely unreasonable, as I'm sure it takes a lot of effort to keep the whole UI library stack working properly on multiple archs.

Perhaps that's what they're hinting at with the note about a "subset of Rosetta". So maybe there is hope that the core x86_64 binary translator will stick around for things like VMs and emulation of generic (Linux? Wine?) binaries, but they don't want to maintain a whole x86_64 macOS userspace going forward.

Space savings from not shipping fat binaries for everything probably won't be insignificant either. Or it makes room for a new fat binary for a future "arm64v2" :)

swiftcoder 2 minutes ago||
> So maybe there is hope that the core x86_64 binary translator will stick around for things like VM and emulation of generic (linux? wine?) binaries

It's mostly for their game-porting toolkit. They have an active interest in Windows-centric game developers porting their games to Mac, and that generally doesn't happen without the compatibility layer.

saagarjha 2 hours ago|||
It’s basically just a recompile though.
0x0 38 minutes ago|||
I'm sure there's lots of x86_64-specific code in the macOS userland that is much more than just a recompile - things like the Safari/JavaScriptCore JIT, the Quartz/Core Animation graphics stack and video encoder/decoder libraries, as well as various Objective-C low-level pointer tagging and message-passing ABI shenanigans and so on. This is probably why 32-bit Intel Mac app support was dropped pretty hard pretty fast, as the entire runtime and userland probably required a lot of upkeep. As just one example, 32-bit Intel Objective-C had "fragile instance variables", which was a can of worms.
saagarjha 20 minutes ago||
This is <1% of the total code that Apple writes
WanderPanda 2 hours ago||||
Until it isn't
RossBencina 2 hours ago|||
Can you enable TSO for ARM executables?
saagarjha 20 minutes ago||
Yes but I don't see how that is relevant
zaphirplane 2 hours ago||
It’s not like they were doing it to make me happy; they're doing it to sell Macs and lock people into the Apple ecosystem. Maybe there is a negligible % of people using it; possibly the M1 is 6 yrs old, iirc
rs186 21 minutes ago||
Closer to 5 years old
luizfelberti 16 hours ago||
They only just released the Containerization Framework[0] and the new container[1] tool, and they are already scheduling a kneecapping of it two years down the line.

Realistically, people are still going to be deploying on x64 platforms for a long time, and given that Apple's whole shtick was to serve "professionals", it's really a shame that they're dropping the ball on developers like this. Their new containerization stuff was the best workflow improvement for me in quite a while.

[0] https://github.com/apple/containerization

[1] https://github.com/apple/container

pjmlp 2 hours ago||
Apple has always been like this; there are other options when backwards compatibility is a relevant feature.
jack_tripper 21 minutes ago||
That's why like 80%+(?) of corporate world runs Windows client side for their laptops/workstations. They don't want to have to rewrite their shit whenever the OS vendor pushes an update.

Granted, that's less of an issue now with most new SW being written in JS to run in any browser, but old institutions like banks, insurance companies, industrial automation, and retail chains still run some ancient Java/C#/C++ programs they don't want to, or can't, update for reasons; it keeps the lights on.

Which is why I find it adorable when people in this bubble think all those industries will suddenly switch to Macs.

mxey 16 hours ago||
The OP says nothing about Rosetta for Linux.
luizfelberti 16 hours ago||
It seems to talk about Rosetta 2 as a whole, which is what the containerization framework depends on to support running amd64 binaries inside Linux VMs (even though the kernel still needs to be arm)

Is there a separate part of Rosetta that is implemented for the VM stuff? I was under the impression Rosetta was some kind of XPC service that would translate executable pages for Hypervisor Framework as they were faulted in, did I just misunderstand how the thing works under the hood? Are there two Rosettas?

mxey 16 hours ago|||
I cannot tell you about implementation difference but what I mean is that this only talks about Rosetta 2 for Mac apps. Rosetta for Linux is a feature of the Virtualization framework that’s documented in a completely different place. And this message says a part of Rosetta for macOS will stick around, so I would be surprised if they removed the Linux part.

On the Linux side, Rosetta is an executable that you hook up with binfmt to run AMD64 binaries, like how you might use Wine for windows binaries
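
To make the binfmt hookup above concrete: Apple's Rosetta-for-Linux documentation sketches the registration roughly like this inside the aarch64 guest (the virtiofs tag and mount point follow Apple's example and may differ in your setup; the magic/mask is the standard x86-64 ELF header pattern):

```shell
# Mount the Rosetta share exposed to the guest by the Virtualization framework
# ("rosetta" is the share tag configured on the host side).
mkdir -p /media/rosetta
mount -t virtiofs rosetta /media/rosetta

# Register the Rosetta binary as the binfmt_misc interpreter for x86-64 ELF
# executables. The magic bytes match the ELF64 header with machine type 0x3e
# (EM_X86_64); the mask ignores fields that vary between binaries.
/usr/sbin/update-binfmts --install rosetta /media/rosetta/rosetta \
  --magic "\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x3e\x00" \
  --mask  "\xff\xff\xff\xff\xff\xfe\xfe\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff" \
  --credentials yes --preserve no --fix-binary yes
```

After this, running an AMD64 binary in the guest transparently invokes Rosetta, much like registering qemu-user or Wine for foreign binaries.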

watermelon0 1 hour ago||
The Rosetta Linux executable can be used without host hardware/software support; for example, you can run it on AWS's Graviton instances.

However, to get performance benefits, you still need to have hardware support, and have Rosetta installed on macOS [1].

TFA is quite vague about what is being deprecated.

[1] https://developer.apple.com/documentation/virtualization/run...

klausa 5 hours ago|||
The "other" part of Rosetta is having all the system frameworks also compiled for x86_64, and having that configuration be supported.
Rochus 11 minutes ago||
This is very frustrating. As if they couldn't afford to continue it. And at the same time they keep making the system more and more closed, so that you can't even run applications without Apple's permission. I don't understand why people still buy such products.
t_sawyer 17 hours ago||
Well, this kinda screws me over for running Docker on macOS. Not all images I use have an ARM version.
swiftcoder 1 minute ago||
This isn't about the virtualisation support - it's about all the Mac system frameworks being available in the Rosetta environment.
physicsguy 16 hours ago|||
https://github.com/apple/container

They released this a while ago which has hints of supporting amd64 beyond the Rosetta end date.

dur-randir 4 hours ago||
Believing in hints from Apple about software? Sweet summer child.
khalic 50 minutes ago||
Still waiting for ZFS on OS X
mxey 17 hours ago|||
It doesn’t say if that is going away. The message calls out another part as sticking around:

> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.

Since the Linux version of Rosetta requires even less from the host OS, I would expect it to stay around even longer.

wmwragg 17 hours ago|||
Yes, that was my first thought as well. And as the images aren't designed to be run on a Mac specifically, like a native app might be, there is no expectation for the developers to create a native Apple Silicon version. This is going to be a pretty major issue for a lot of developers.
TimTheTinker 5 hours ago|||
Case in point - Microsoft's SQL Server docker image, which is x86-only with no hint of ever being released as an aarch64 image.

I run that image (and a bunch of others) on my M3 dev machine in OrbStack, which I think provides the best docker and/or kubernetes container host experience on macOS.

anon7000 16 hours ago||||
I’ve worked in DevOps and companies I’ve worked for put the effort in when M1 came out, and now local images work fine. I honestly doubt it will have a huge impact. ARM instances on AWS, for example, are much cheaper, so there’s already lots of incentive to support ARM builds of images
avhception 16 hours ago||
In our small shop, I definitely made sure all of our containers supported aarch64 when the M1 hit the scene. I'm a Linux + ThinkPad guy myself, but now that I've got an x13s, even I am running the aarch64 versions!
mxey 16 hours ago||
How do you build multi-arch in CI? Do you cross-compile or do you have arm64 runners?
spockz 4 hours ago|||
It depends. Mostly it is choosing the right base image architecture. For rust and golang we can trivially cross compile and just plunk the binary in the appropriate base image. For JVM based apps it is the same because we just need to have the jars in the right place. We can do this on either architecture.

The only hold out is GraalVM which doesn’t trivially support cross compilation (yet).
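
The Go case above really is just a couple of environment variables. A minimal sketch (assuming a pure-Go module; `myapp` is a hypothetical name):

```shell
# Cross-compile static Linux binaries for both architectures from any host;
# CGO_ENABLED=0 avoids needing a per-target C toolchain.
CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -o myapp-amd64 .
CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -o myapp-arm64 .
```

The resulting binaries then just get copied into the matching base image, as described above.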

avhception 2 hours ago|||
We're mostly a PHP / JS (with a little Python on the side) shop, so for our own code it's mostly a matter of the right base image. Building our own images is done on an x86-64 machine, with the aarch64 side of things running via qemu.
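
For reference, the multi-arch CI builds discussed above typically look something like this with buildx (the image name here is hypothetical; on an x86-64 runner, the arm64 half usually runs through QEMU user emulation, as the comment above describes):

```shell
# One-time: create and select a builder that can target multiple platforms.
docker buildx create --name multiarch --use

# Build both architectures and push a single multi-arch manifest.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  --tag example.com/myorg/myapp:latest \
  --push .
```
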
mxey 17 hours ago|||
Apple Silicon is ARM64 which is supported by Linux and Docker.
watermelon0 16 hours ago|||
But Docker images don't necessarily have ARM64 support. If you are exclusively targeting x64 servers, it rarely makes sense to support both ARM64 and AMD64 platforms for development environment/tests, especially if the product/app is non-trivial.
Gigachad 3 hours ago||
I guess now it makes sense. Got 3 years to turn on ARM builds.
coldtea 3 hours ago||
No, it still doesn't make sense.

And it looks like Rosetta 2 for containers will continue to be supported past macOS 28 just fine. It's Rosetta 2 for Mac apps that's being phased out, and not even all of that (they'll keep it for games that don't need macOS frameworks to be kept around in Intel format).

coldtea 3 hours ago||||
Parent doesn't want to merely run ARM64 Linux/Docker images. They want to run Intel images. Lots of reasons for that, from upstream Docker images not being available for ARM64, to specific corporate setups you want to replicate as closely as possible, or that aren't portable to ARM64 without huge effort.
wmwragg 16 hours ago||||
I'm aware; I use ARM images all the time. I was trying to indicate that the usual refrain, that developers have had years to migrate their software to Apple Silicon, doesn't really apply to Docker images. It's only the increased use of ARM elsewhere (possibly driven by the great performance of Macs running Apple Silicon) that has driven any migration of Docker images to ARM versions.
wmf 16 hours ago||||
Yeah but many people are using x86-64 Docker images because they deploy on x86-64. Maybe ARM clouds will be more common by that time.
wmwragg 16 hours ago|||
Yep, this is another reason I've needed x86-64 images: although they should be technically the same when rebuilt for ARM, they aren't always, so using the same-architecture image that runs in production will sometimes catch edge-case bugs the ARM version doesn't. Admittedly it's not common, but I have had it happen. Obviously there is also the argument that the x86-64 image is being translated, so it isn't the same as production anyway, but I've found that to produce far fewer bugs than using a different architecture.
MangoToupe 6 hours ago||
> Obviously there is also the argument that the x86-64 image is being translated, so isn't the same as production anyway

I've never seen this make a practical difference. I'm sure you can spot differences if you look for them (particularly at the hardware interface level) but qemu has done this for decades and so has apple.

mxey 16 hours ago|||
Many container images are multi-arch, although probably not ones that are built in-house.
avhception 16 hours ago||
We built our in-house images multi-arch precisely for this reason!
BurnTheBoss 16 hours ago|||
That's not really the point though right? It means that pulling and using containers that are destined for x86 will require also building arm64 versions. Good news is buildx has the ability to build arm64 on x86, bad news is people will need to double up their build steps, or move to arm in production.
Snelius 8 hours ago||
Without Rosetta I can't build x86_64 images anymore. Today I can set up an OrbStack amd64 Linux machine and build native amd64 images on my Mac to put on my servers.
coldtea 3 hours ago||
What they talk about is Rosetta's macOS frameworks compiled for Intel being kept around (which macOS Intel apps use, like if you run some old <xxx>.app that's not available for Apple Silicon).

The low-level Rosetta as a translation layer (which is what containers use) will be kept, and they will even keep it for Intel games, as they say in the OP.

hakube 3 hours ago|||
Doesn't Orbstack or Colima solve this?
p0w3n3d 3 hours ago||
If you run x86 code without Rosetta (probably using QEMU), it will be painfully slow.
saagarjha 2 hours ago|||
That won’t be going away, none of that requires any support from the host OS.
tobyjsullivan 6 hours ago|||
How does this work currently? I was under the impression that Docker for Mac already ran containers in an x86 VM. Probably outdated info, but I’m curious when that changed.
ChocolateGod 15 hours ago|||
Surely, as it is on Linux, QEMU can take over here in running the x86 images on ARM.

Is it slow? Absolutely. But you'd be insane to run it in production anyway.

coldtea 3 hours ago||
Wanting it to be fast is not just about "running it on production".

A test suite that becomes 10x slower is already a huge issue.

That said, it doesn't seem like Rosetta for container use is going anywhere. Rosetta for legacy Mac applications (the macOS-level layer) is.

lostlogin 16 hours ago|||
Are you running this via that travesty of a desktop app?
juancn 16 hours ago||
Back to the QEMU dark ages
markus_zhang 17 hours ago||
Ah, I guess it was wise for the original developer of Rosetta 2 to quit earlier this year. One of the people that I look up to.

https://news.ycombinator.com/item?id=42483895

cwzwarich 16 hours ago|
My reasons for leaving Apple had nothing to do with this decision. I was already no longer working on Rosetta 2 in a day-to-day capacity, although I would still frequently chat with the team and give input on future directions.
casualscience 4 hours ago|||
Just went through that thread, I can't believe this wasn't a team of like 20 people.

It's crazy to me that apple would put one guy on a project this important. At my company (another faang), I would have the ceo asking me for updates and roadmaps and everything. I know that stuff slows me down, but even without that, I don't think I could ever do something like this... I feel like I do when I watch guitar youtubers, just terrible

I hope you were at least compensated like a team of 20 engineers :P

nxobject 43 minutes ago||
History doesn't repeat, but it does rhyme: the initial (re)bootstrapping of OS X for Intel was done by one person, too.

https://www.quora.com/Apple-company/How-does-Apple-keep-secr...

markus_zhang 16 hours ago||||
Thank you for the clarification!
ta9000 5 hours ago|||
Thank you for your work!
gumboshoes 4 days ago||
Seems premature. My scanner software, SnapScan, still regularly updated, requires Rosetta. ABBYY FineReader, the best Mac OCR, requires Rosetta. Although they may be related, as the SnapScan software does OCR with the FineReader engine.
caseyohara 17 hours ago||
The M1 chip and Rosetta 2 were introduced in 2020. macOS 28 will be released in 2027. 7 years seems like plenty of time for software vendors to make the necessary updates. If Apple never discontinues Rosetta support, vendors will never update their software to run natively on Apple chips.
linguae 17 hours ago|||
This is also consistent with Apple’s previous behavior with backwards compatibility, where Apple would provide a few years of support for the previous platform but will strongly nudge developers and users to move on. The Classic environment in Mac OS X that enabled classic Mac OS apps to run didn’t survive the Intel switch and was unavailable in Leopard even for PowerPC Macs, and the original Rosetta for PowerPC Mac OS X applications was not included starting with Lion, the release after Snow Leopard.
delusional 16 hours ago||
Honestly, for apple this is above and beyond. They've killed support with less fanfare and compatibility support than what we see here.
bigyabai 16 hours ago||
Bully on me for owning hardware and expecting it to behave consistently across OTA updates.
renewiltord 5 hours ago|||
I think you probably should not buy Apple hardware. It is not a guarantee they have ever offered that their software would behave consistently across updates. If this mattered to me, I would have done some research and rapidly found out that Apple has done this every few years for the last 30 years.
delusional 16 hours ago||||
The hardware isn't (as far as I'm aware) changing. Please don't move the goalposts from hardware ownership (we should just be able to do with our hardware as we please) to also include indefinite support from vendors. That just makes us look like childish crybabies.

If you were instead asking for hardware documentation, or open-sourcing of Rosetta once sunset, then we're on the same team.

bigyabai 13 hours ago||
I never asked for an infinite window of software support, though. I merely want the features that I had when I bought the laptop, for as long as the OS supports my machine. The response is always "blame the third-parties" when apps break, but oftentimes the devs already made their money and moved on. The onus is on Apple to support their OS' software if they want to have my money.

Open-sourcing is one solution, but knowing Apple it's not a likely one. Their "we know best" mindset is why I quit dailying Macs entirely - it's not sustainable outside the mobile dev business. A computer that supports 32-bit binaries, OpenGL, or x86 translation when you bought it should retain that capability into the future. Anything less is planned obsolescence, even if you want to argue there's a silver lining to introducing new tech. New tech should be competitive on merits, not because its competitor was forcibly mutilated.

xethos 13 hours ago|||
> The onus is on Apple to support their OS' software if they want to have my money

Apple has done this exact same thing for every architecture change and every API they sunset, but you gave them your money anyway. Their history of discontinuing software support and telling users to harangue third-party devs isn't exactly a secret.

rowanG077 13 hours ago|||
Sure, then don't update.
searls 16 hours ago|||
At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?

I doubt such a thing has ever happened in the history of consumer-facing computing.

Kwpolska 13 hours ago|||
Have you ever heard of Windows? Unlike Apple, they do care about backwards compatibility, and don’t randomly go removing features users depend on.
iknowstuff 5 hours ago||
and the consequences are dire
Rohansi 2 hours ago||
Are they? IMO Windows going downhill has more to do with what is being added to it than what it is preserving compatibility for.
csdreamer7 15 hours ago||||
> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?

Linux users do it all the time with WINE/Proton. :-)

Before you complain about the term 'major OEM operating system'; Ubuntu is shipped on major OEMs and listed in the supported requirements of many pieces of hardware and software.

> I doubt such a thing has ever happened in the history of consumer-facing computing.

Comments like this show how low standards have fallen. Mac OS X releases have short support lengths. The hardware is locked down; you need a massive RE effort just to get Linux to work. The last few gens of x86 Mac hardware weren't locked down as much, but they were still locked down. The M3 and M4 still do not have a working installer. None of this is funded by Apple, whether to get Linux working on it or to get Windows ARM working on it, as far as I know.

In comparison, my brother-in-law found an old 32-bit laptop that had Windows 7. It forced itself, without his approval, to update to Windows 10. It had 10 years of support from Microsoft on Windows 10 alone; counting 7, that pushed it to... hmm... 13+ years of support?

nomel 15 hours ago||
> Linux users do it all the time with WINE/Proton. :-)

And there’s a near-100% chance you’ll have to recompile (or download pre-recompiled binaries) if moving to a completely different architecture. Same here.

Dylan16807 20 minutes ago||
Not the same here. The user didn't have to get different binaries when they changed hardware, and that was a big selling point for the hardware. And now it's going to break in an arbitrary software update.
palmotea 13 hours ago||||
> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?

> I doubt such a thing has ever happened in the history of consumer-facing computing.

Come on. I've done that and still do: I use an ancient version of Adobe Acrobat that I got with a student discount more than 10 years ago to scan documents and manipulate PDFs. I'd probably switch to an open source app, if one were feature comparable, but I'm busy and honestly don't have the time to wade through it all (and I've got a working solution).

Adobe software is ridiculously overpriced, and I'm sure many, many people have done the same when they had perpetual-use licenses.

gucci-on-fleek 4 hours ago|||
> At what point in history have you owned a particular piece of hardware [...] and installed a major OEM operating system release a full 7 years after release without issue?

A few years ago, I installed Windows 10 on a cheap laptop from 2004—the laptop was running Windows XP, had 1GB of memory, a 32-bit-only processor, and a 150GB hard drive. The computer didn't support USB boot, but once I got the installer running, it never complained that the hardware was unsupported.

To be fair, the computer ran horrendously slow, but nothing ever crashed on me, and I actually think that it ran a little bit faster with Windows 10 than with Windows XP. And I used this as my daily driver for about 4 months, so this wasn't just based off of a brief impression.

out_of_protocol 17 hours ago||||
Windows 95 was released... well, in 1995. In 2025 you can run apps targeting W95 just fine (and many 16-bit apps with some effort)
slavapestov 16 hours ago|||
> In 2025 you can run apps targeting W95 just fine (and many 16-bit apps with some effort)

FWIW, Windows running on a 64-bit host no longer runs 16-bit binaries.

jack_tripper 10 minutes ago|||
>Windows running on a 64-bit host no longer runs 16-bit binaries.

Which isn't an issue since Windows 95 was not a 16-bit OS, that was MS-DOS. For 16-bit DOS apps there's virtualization things like DOSbox or even HW emulators.

out_of_protocol 16 hours ago|||
Yes. Still, there are ways to do it anyway, from DOSBox to WineVDM. Unlike macOS, where having even a 32-bit app (e.g. half of the Steam games that supported macOS to begin with) means you're fucked.
nomel 14 hours ago||
You can use dosbox and x86 virtual machines just fine in macOS (with the expected performance loss) right now, without Rosetta. macOS is still Turing complete.
out_of_protocol 13 hours ago||
Technically speaking, you can run anything on anything since the stuff is Turing complete. Practically speaking, however....

E.g. I have half of the macOS games in my Steam library as 32-bit Mac binaries. I don't know a way to launch them at any reasonable speed. The best way is to ditch the macOS version altogether and emulate the Win32 version of the game (which will run at reasonable speed via Wine forks). Somehow the Win32 API is THE most stable ABI layer for Linux & Mac.

nomel 12 hours ago||
> my steam library as a 32-bit mac binaries. I don't know a way to launch them at any reasonable speed.

To be fair, it's the emulation of x86-32 with the new ARM64 architecture that causes the speed problems. That transition is also why MacBooks are the best portables, in terms of efficiency, that you can buy right now.

All ARM chips have crippled x86-32 performance, because they're not x86-32 chips. You'll find the same (generally worse) performance issues trying to run ARM64 code with x86-64.

astrange 2 hours ago||
Rosetta 2 is pretty good at running x86-32. There's more registers on the destination, after all.
K7PJP 17 hours ago||||
This isn't a new or unique move; Apple has never prioritized backwards compatibility.

If you're a Mac user, you expect this sort of thing. If running neglected software is critical to you, you run Windows or you keep your old Macs around.

torstenvl 16 hours ago||
It's a bizarre assumption that this is about "neglected software."

A lot of software is for x64 only.

If Rosetta 2 goes away, Parallels support for x64 binaries in VMs likely goes away too. Parallels is not neglected software. The x64 software you'd want to run on Parallels is not neglected software.

This is a short-sighted move. It's also completely unprecedented; Apple has dropped support for previous architectures and runtimes before, but never when the architecture or runtime was the de facto standard.

https://docs.parallels.com/parallels-desktop-developers-guid...

galad87 16 hours ago|||
Parallels x86_64 emulation doesn't depend on Rosetta.
mxey 16 hours ago|||
> If Rosetta2 goes away, Parallels support for x64 VMs likely goes away too.

Rosetta 2 never supported emulating a full VM, only individual applications.

torstenvl 16 hours ago||
You're right. It looks like the new full VM emulation in 20.2 doesn't use Rosetta.

https://www.parallels.com/blogs/parallels-desktop-20-2-0/

Nevertheless, running x64 software including Docker containers on aarch64 VMs does use Rosetta. There's still a significant valid use case that has nothing to do with neglected software.

Edited my post above. Thanks for the correction.

mrpippy 5 hours ago||
The OP only applies to Rosetta for running x64 Mac apps, not running x64 Linux software in aarch64 Linux VMs.
Klonoar 16 hours ago||||
Just because Microsoft does one thing doesn't mean Apple has to do the same.
reddalo 16 hours ago||||
That's not a good thing for other reasons; e.g. there are a lot of inconsistencies in modern Windows, like pieces of Windows 3.1 still in Windows 11.
Kwpolska 12 hours ago||
There are leftovers from older versions of macOS and severely neglected apps in Tahoe too. Sure, they might have been given a new icon, or adopted the new system styling, but they have not been updated for ages.
delusional 16 hours ago||||
There's a lot of Win95 software that you can't run too. Microsoft puts a lot of work into their extensive backlog of working software. It's not just "good engineering" it's honest to god fresh development.
stalfosknight 17 hours ago|||
That's not necessarily a good thing.
watermelon0 16 hours ago||||
The main problem is not native software, but virtualization, since ARM64 hardware is still quite uncommon for Windows/Linux, and we need Rosetta for decent performance when running AMD64 in virtual machines.
spacechild1 6 hours ago|||
There is lots of existing software (audio plugins, games, etc.) that will never see an update. All of that software will be lost. Most new software has ARM or universal binaries. If some vendors refuse to update their software, it's their problem. Windows still supports 32-bit applications, yet almost all new software is 64-bit.
joshuat 16 hours ago|||
I think this is exactly what they're issuing this notice to address. Rosetta performs so well that vendors are pretty okay just using it as long as possible, but a two year warning gives a clear signal that it's time to migrate.
jayd16 4 hours ago||
If it's ok now then what's even the problem with letting it be?
poemxo 16 hours ago|||
I usually agree with Apple, but I don't agree with this. Rosetta 2 is basically magic; why would they take away one of their own strongest features? If they want big-name apps to compile for Apple Silicon, why can't they exert pressure through their codesigning process instead?
drob518 16 hours ago|||
The “big name apps” have already moved to Apple Silicon. Rosetta helped them with that process a few years ago. We’re down to the long tail apps now. At some point, Rosetta is only helping a couple people and it won’t make sense to support it. I just looked, and right now on my M1 Air, I have exactly one x86 app running, and I was honestly surprised to find that one (Safari plug-in). Everything else is running ARM. My workload is office, general productivity, and Java software development. I’m sure that if you allow your Mac to report back app usage to Apple, they know if you’re using Rosetta or not, and if so, which apps require it. I suspect that’s why they’re telegraphing that they are about ready to pull the plug.
prewett 16 hours ago||
How do you check if you're running any x86 apps?
brycewray 16 hours ago|||
1. From the Apple menu, click "About This Mac."

2. In the resulting window, click the "More Info..." button. This will open the System Settings window.

3. Scroll to the bottom of that window and click "System Report."

4. In the left side of the resulting window, under "Software," click "Applications." This will provide a list of installed applications. One of the columns for sorting is "Kind"; all apps that are x86 will be listed with the kind, "Intel."

timsneath 5 hours ago||||
In macOS 26, you can see every Rosetta app that has recently run on your machine by going to System Information and then Software / Rosetta Software. It includes the "Fallback Reason" (e.g. if you manually forced the app under Rosetta or if it was an Intel-only binary).
drob518 15 hours ago||||
To see what’s running,

1. Go into Activity Monitor

2. From the CPU or memory tab, look at the “Kind” column. It’ll either say “Apple” or “Intel.” If the Kind column isn’t visible, right-click on the column labels and select Kind.
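
From the command line, a couple of quick checks work too (macOS-only commands; the app path below is just an example, substitute any app on your machine):

```shell
# Prints 1 if the current process is running translated under Rosetta,
# 0 if native; the sysctl doesn't exist at all on Intel Macs.
sysctl -n sysctl.proc_translated

# List which architecture slices a given app binary ships
# (e.g. "x86_64 arm64" for a universal binary).
lipo -archs /Applications/Safari.app/Contents/MacOS/Safari
```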

thijsvandien 6 hours ago|||
There's this Silicon app that scans your disk for them: https://github.com/DigiDNA/Silicon.
nomel 14 hours ago|||
How much die area does it use that could be used for performance? How much engineering time does it use? Does it make sense to keep it around, causing ~30% more power usage/less performance?

There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.

My naive assumption is that, by the time 2027 comes around, they might have some sort of slow software emulation that is on par with, say, M1 Rosetta performance.

free_bip 14 hours ago||
Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.
nomel 14 hours ago||
Hardware acceleration [1]:

> One of the key reasons why Rosetta 2 provides such a high level of translation efficiency is the support of x86-64 memory ordering in the M1 SoC. The SoC also has dedicated instructions for computing x86 flags.

[1] https://en.wikipedia.org/wiki/Rosetta_(software)

electroly 11 hours ago||
While true, we're not talking about the chips losing TSO; Apple plans to keep Rosetta 2 for games and it has to remain fast because, well, it's video games. It also seems like they plan to keep their container tool[1]. This means they can't get rid of TSO at the silicon level and I have not heard this discussed as a possibility. We're only discussing the loss of the software support here. The answer to "How much die area does it use that could be used for performance?" is zero--they have chosen to do a partial phase-out that doesn't permit them to save the die space. They'd need to kill all remaining Rosetta 2 usage in order to cull the die space, and they seem to be going out of their way not to do this.

[1] https://github.com/apple/container -- uses Rosetta translation for x64 images.

lloeki 3 hours ago|||
> We're only discussing the loss of the software support here

Schematically "Rosetta 2" is multiple things:

- hardware support (e.g TSO)

- binary translation (AOT + JIT)

- fat binaries (dylibs, frameworks, executables)

- UI (inspector checkbox, arch(1) command, ...)

My bet is that beyond the fancy high-level "Rosetta 2" label, what will happen is that they'll simply stop shipping fat x86_64+aarch64 system binaries+frameworks[0], while the rest stays in place.

[0]: or rather, heavily cull

astrange 2 hours ago||||
So, the way to "use die area for performance" is to add more cache and branch-predictor space. Because of this, anything that costs a lot of code size effectively does consume that area, by using up the cache.
nomel 11 hours ago|||
> Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.

There is hardware acceleration in place that exists only to give it, as you just stated, acceptable performance.

It does take up die space, but they're going to keep it around; they've just decided to reduce the types of applications that Rosetta 2 (and the hardware that exists only for it) will support.

So, seems like they've decided they can't fight the fact that gaming is a Windows thing, but there's no excuse for app developers.

electroly 11 hours ago||
Sure, this seems to be a restatement of my post, which started with "While true...", rather than a disagreement. I was pointing out which one of the "many acceptable opposing answers" Apple had chosen. They can't use that die area for performance because they're still using it even after this phase-out. (I'm not the person who wrote the original post.)
eisa01 16 hours ago|||
You can most likely use VueScan[1], I use that with an old ScanSnap i500 (or something)

[1] https://www.hamrick.com

ZeWaka 16 hours ago||
Love VueScan for my film scanner!
morshu9001 4 hours ago|||
Owning a Mac has always meant not relying on 3P software. Forget printer/scanner drivers. Even if they target macOS perfectly, there will come a day when you need to borrow a Windows PC or old Mac to print.

It happens to be ok for me as a SWE with basic home uses, so their exact target user. Given how many other people need their OS to do its primary job of running software, idk how they expect to gain customers this way. It's good that they don't junk up the OS with absolute legacy support, but at least provide some kind of emulation even if it's slow.

al_borland 16 hours ago|||
They were pretty quick to sunset the PPC version of Rosetta as well. It forces developers to prioritize making the change, or to make it clear that their software isn't supported.

The one I have my eye on is Minecraft. While not mission critical in any way, they were fairly quick to update the game itself, but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It's almost like the devs installed Rosetta 2 and don't realize their launcher is using it.

mrpippy 5 hours ago||
Rosetta for PPC apps was supported from the first Intel Macs released in January 2006 until 10.7 Lion was released in July 2011.
sixothree 17 hours ago|||
I spent what I would consider to be a lot of money for a unitasker Fujitsu scanner device and am just astounded by how unmaintained and primitive the software is. I only use it on a Windows machine though, so I'm not in the same boat.
bitwize 17 hours ago||
This is Apple's "get your shit together and port to ARM64, you have 2 years" warning.

If you're not willing to commit to supporting the latest and greatest, you shouldn't be developing for Apple.

nasretdinov 17 hours ago||
This seems to basically only apply to full-fledged GUI apps and excludes e.g. games, so potentially stuff like Rosetta for CLI isn't going anywhere either
TheTon 4 hours ago|
But games are full-fledged GUI apps. At a minimum they have a window.

It’s really unclear what it means to support old games but not old apps in general.

I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).

So then why say only games when the minimum to support the games probably covers a lot of non games too?

I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?

dagmx 3 hours ago||
Games use a very small portion of the native frameworks. Most would be covered by Foundation, which they have to keep working for Swift anyway (Foundation is being rewritten in Swift) and just enough to present a window + handle inputs. D3DMetal and the other translation layers remove the need to keep Metal around.

That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.

TheTon 3 hours ago||
I don’t agree. My point is their collective footprint in terms of the macOS API surface (at least as of 2019 or so) is pretty big. I’m not just speculating here, I work in this area so I have a pretty good idea of what is used.
rowanG077 10 minutes ago||
It will be interesting to see whether they keep optional TSO in their SoCs after Rosetta 2 is no longer working.
al_borland 16 hours ago||
Hopefully this means macOS 27 will be a Snow Leopard type release to focus on bug fixes, performance, and the overall experience, rather than focusing on new features.
egorfine 43 minutes ago||
No. Only Steve Jobs could have pulled this.

Modern day Apple cannot. A bugfix-only release is not going to sell anything.

lapcat 16 hours ago||
Why would it mean that?

It's a myth that Snow Leopard was a bug fix release. Mac OS X 10.6.0 was much buggier than 10.5.8, and indeed brought several new severe bugs. However, Mac OS X 10.6 received two years of minor bug fix updates afterward, which eventually made it the OS that people reminisce about now.

Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.

https://lapcatsoftware.com/articles/2023/11/5.html

dagmx 3 hours ago||
Not sure why you’re downvoted because you’re right.

Snow Leopard brought a huge amount of under-the-covers features. It was a massive release. The only reason it had that marketing was because they didn't have a ton of user-facing stuff to show.

wtallis 2 hours ago||
That is more or less what users asking for another Snow Leopard want: a release that doesn't have gratuitous UI churn and superficial changes, doesn't break the end user's muscle memory, but instead focuses on deep-seated and long-standing issues under the hood. If the right thing for the OS in the long term is to replace an entire subsystem instead of applying more band-aid fixes, then take the time to do a proper job of it.

lapcat loves his straw man about OS X 10.6.0 having plenty of bugs, but that misses the point of Snow Leopard. Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs. But it fixed a bunch of stubborn bugs and architectural limitations, and the new bugs mostly got ironed out in a reasonable time frame. (Snow Leopard was probably one of the better examples of Apple practicing what they preach: cleaning out legacy code and modernizing the OS and bundled apps the way they usually want third-party developers to do to their own apps.)

Fixing architectural bugs is still fixing bugs—just at a deeper level than a rapid release schedule driven by marketable end-user features easily allows for.

matjazk 59 minutes ago|
Running Windows in Parallels. Even when running the ARM version of Windows, you still need Rosetta to run Windows x86 binaries.