Posted by speckx 12 hours ago
My computer was running so slowly that I had to minimize transparency in system preferences somewhere. I think I also turned off opening every app in its own space. And I hid the icons on the Desktop in Finder settings somehow, which helped a lot. There are countless other little tweaks that are worth investigating.
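For anyone who wants to script these tweaks instead of hunting through System Settings, a couple of them map to well-known `defaults` keys. This is a sketch; key names can drift between macOS releases, so treat these as assumptions and verify against your version:

```shell
# Reduce transparency (System Settings > Accessibility > Display)
defaults write com.apple.universalaccess reduceTransparency -bool true

# Hide all Desktop icons; the files stay in ~/Desktop, Finder just stops drawing them
defaults write com.apple.finder CreateDesktop -bool false
killall Finder
```

If a change doesn't take effect immediately, log out and back in, or at least restart the affected app.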
I also highly recommend App Tamer (no affiliation). It lets you jail background apps at 10% cpu or whatever. It won't help with WindowServer or kernel_task (which also often runs at 100+% cpu), but it's something.
I can't help but feel that there's nobody at the wheel at Apple anymore. When I have to wait multiple seconds to open a window, to switch between apps, to go to my Applications folder, then something is terribly wrong. Computers have been running thousands of times slower than they should be for decades, but now it's reaching the point where daily work is becoming difficult.
I'm cautiously optimistic that AI will let us build full operating systems using other OSs as working examples. Then we can finally boot up with better alternatives that force Apple/Microsoft/Google to try again. I could see Finder or File Explorer alternatives replacing the native ones.
That's because some app is spamming window updates.
It's been an ongoing problem for many releases. AFAICT, WindowServer 100% CPU is a symptom, not a cause.
FWIU there's really no backpressure mechanism for apps delegating compositing (via CoreAnimation / CALayers) to WindowServer which is the real problem IMO.
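If you want to check this on your own machine, a quick way to see whether WindowServer is pegged, and which app is likely feeding it, is to sort processes by CPU share. Plain `ps` flags, so this runs anywhere; the WindowServer interpretation is macOS-specific and my own assumption:

```shell
# Show the ten hungriest processes by CPU share.
# On macOS, if WindowServer is near the top, the app spamming
# window updates is usually within a few lines of it.
ps -Ao pcpu,pid,comm | sort -rn | head -10
```

For a deeper look on macOS, something like `sudo spindump WindowServer 5` samples WindowServer for a few seconds (check `man spindump` for the exact arguments on your release).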
I've been hearing this complaint for decades and I'll never understand it. The suggestion seems completely at odds with my own experience. Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.
I remember a time when I could visually see the screen repaint after minimizing a window, or waiting 3 minutes for the OS to boot, or waiting 30 minutes to install a 600MB video game from local media. My M2 Air with 16GB of memory only has to reboot for updates. I haphazardly open 100 browser tabs, run Spotify, Slack, an IDE, build whatever project I'm working on, and the machine occasionally gets warm. Everything works fine; I never have performance issues. My Linux machines, gaming PC, and phone feel just as snappy. It feels to me that we are living in a golden age of computer performance.
10% of the time, WindowServer takes off and spends 150% CPU. Or I develop keystroke lag. Or I can't get a terminal open because Time Machine has the backup volume in a half-mounted state.
It's thousands of times faster than the Ultra 1 that was once on my desk. And I can certainly run workloads that fundamentally take thousands of times more cycles. But I usually spend a greater proportion of this machine's speed on the UI, and responsiveness doesn't always beat what I had 30 years ago.
Even if that were possible, you can't run commercial software. And for many people, the software they run is more important than the OS.
https://news.ycombinator.com/item?id=47282085#47310011
Probably my least favorite redesign in the whole update. Why is everything an oval? It's just bizarre.
If the biggest flaw of an OS is the border radius of its windows, you've got yourself a pretty decent OS!
It's not gonna make me leave my darling Linux, ofc, but I think this whole debacle can only be interpreted as praise.
On second thought, it might also be considered a meditation on people's tendency to bike-shed.
Or to say it another way: if we see shit like this, then we know the whole thing is a hack.
I never said that
As a related anecdote, my friend said my car was ugly. I asked him what cars he thought looked good. He said “I don’t like cars”. As a result I realized his opinion was worthless
For example, there is not much you could do to Finder to make it worse.
This argument would also make Windows 11 a pretty decent OS by extension: "If the biggest flaw of an OS is the position of the Start menu, you've got yourself a pretty decent OS."
In general, I could use any minor nuisance as proof of decency, or as a manufacturer, inject some on purpose to set up this argument.
People don't like it when their environment changes in minor, unsolicited ways. There's always gonna be fuss about these things, and that means the fuss itself can't be used to make any strong argument whatsoever.
That’s way more than just the “position of the start menu”
As someone who works on Windows, Mac, and Linux; Windows stands alone in my opinion as the "stepping on legos with no socks on" of operating systems.
There are loads of other flaws with the OS. It just so happens that people care a lot about the design of Apple's products, so people talk about these details.
MacOS has been shit for as long as I've used it (8 years) and probably for much longer than that. There are many lists available of MacOS problems (https://old.reddit.com/r/MacOS/comments/12rw1sn/a_long_list_... for example), it's just that there's not much point making a new article about the Finder that's been shit, and unchanged, for a decade.
And the updates to Music (formerly iTunes) are so bad the entire team should be dressed down, Steve Jobs style.
If you use SIP and use package managers (npm, cargo, pip, etc) outside of a VM you are substantially more vulnerable to attack than someone who doesn't use SIP and doesn't use package managers.
So if you want to fix your corners, you can do it guilt-free by adopting some better security practices around the malware delivery systems / package managers that you have installed on your computer.
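One low-effort version of that, assuming Docker (or any container runtime) is installed: run the package manager in a throwaway container so install scripts can't touch the host home directory or keychain. A sketch, not a full sandboxing story — containers are weaker isolation than a true VM:

```shell
# Run npm install inside a disposable container; only the current
# project directory is mounted, so postinstall scripts see nothing else.
docker run --rm -v "$PWD":/work -w /work node:20 npm install
```

The same pattern works for pip or cargo with a matching base image, and on macOS specifically, Docker already runs containers inside a Linux VM, which is most of the isolation being asked for here.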
Do you have a system in mind that prevents the user from doing this?
Sure, macOS could adopt an iPad-style security system that refuses to run all software outside the App Store. It works on iPhone and iPad just fine, all the prosumers love it.
It's not like native Darwin triples are a popular compilation target. There wouldn't be any vast tragedy if the macOS shell-util authors were told to use zsh in a VM instead; it would separate the parts of macOS that Apple cares about from the parts they don't seriously support. WSL and Crostini achieve this on vastly weaker hardware with great results.
SIP guarantees that you will be able to turn on your computer in safe mode and remove the malware, whereas without it your OS is toast.
There are apps that they need to run in the background, sure. They have a spot in the menubar.
Oh no I forgot, you can only have 5 of them. Not 6. Why? Because FU. Go buy a third-party app (Bartender) that records your entire screen to do basic app management that the OS should do.
I hate MacOS.
My neurodivergence makes me feel actual distress over those corners. I am not being dramatic. It sucks.
Does anyone actually do this? Especially for heavy-duty applications like my web browser and IDE, this has always felt like a bizarre assumption to me.
IMO, this has been their assumption for years, and it actually turned me off when I tried getting used to Mac circa 2006-2007. Coming from Windows at the time, I just couldn't get over a weird anxiety that my application window wasn't maximized, because it didn't look like it completely snapped into the screen corners.
Now, using 34-inch ultrawide monitors almost exclusively, I never maximize anything... it'd be unusable.
Browsers only ever get maximized to the left/right half screen for me too
Which is something macos should really improve on though, the ux is pretty bad compared to Windows and Linux there
Obnoxiously, it's part of the recent trend of overloading the Globe/Fn key, so it's hard to do with third-party keyboards.
Hover over the green button in the top left of the window. I recently found out about that menu for moving a window between screens, which is also an option it has. (I also just found them in the Window menu, if you prefer that. I don't; the options take an extra level of hovering to get to.)
Fuck Tim Apple! Seriously, fuck him hard in the arse! Why would they hide this. I've been dealing with this shit for two decades and I find out via a HN comment that Apple was hiding this feature behind a green dot‽‽‽‽
This is the biggest reason I love Linux. I can choose my own desktop, or even forsake the desktop entirely for a simpler window manager, without changing operating systems. Some are hyper focused on a tailored experience (gnome) while others let you configure to your heart's content (kde).
There are sacrifices to be made, of course, but not having to live under the oppression of Apple's benevolent-dictator designers is absolutely worth it for me.
I maximize windows of graphics and video editors.
All the rest I'd prefer to just summon as-needed and then dismiss without navigating away from the windows I care about.
sway/niri want me to tile every window into some top-level spot.
Took me a while to admit it, but the usual Windows/macOS/DE "stacking" method is what I want + a few hotkeys to arrange the few windows I care about.
It sounds like the scratchpad may be especially close to what you want.
[1]: https://github.com/esjeon/krohnkite [2]: https://github.com/paulmcauley/klassy
Apple then made things go full screen, but in a special full screen mode, so macOS worked more like the iPad.
By the time they added a way to maximize windows in the way Windows does, the idea of maximizing an app has largely worked its way out of my workflow. It was always too much trouble, and I find very few apps where it provides much benefit. Web browsers, for example, often end up with a lot of useless whitespace on the sides of the page, so they work better as a smaller window on a widescreen display. In an IDE, it really depends on what’s being worked on and if text wrapping is something I want. Ideally lines wouldn’t get so long that this is a problem.
With the way macOS manages windows, I often find it easiest to have my windows mostly overlapped with various corners poking out, so I can move between app windows in one click. The alternative is bringing every window of an app to the front (with the Dock or cmd+tab), or using Mission Control for everything, neither of which feels efficient.
I could install some 3rd party window management utility, I suppose, but in the long run, it felt easier to just figure out a workflow that works on the stock OS, so I can use any system without going through a setup process to customize everything. It’s the same reason I never seriously got into alternative keyboard layouts.
Full screen one. Switch to the other. Now, use just cmd-tab and cmd-` to get to the full screen safari window (cmd-` switches between windows in the same application, which is literally never the right thing, but I digress).
For what it's worth, the third party tool 'altTab' mostly fixes this.
Bonus MacOS UI bug: I had to exit altTab to confirm they still hadn't fixed cmd-`. When I re-opened it using cmd-space, finder defaulted to the version in ~/Downloads instead of /Applications, then read me the riot act about untrusted software trying to change accessibility settings.
One more thing: I'm still not using MacOS 26, so all my complaints are about the "last known good" release.
Except Safari, which just fills out the window's height vertically. Kinda weird to make an exception like that but I don't hate it, because I generally use Safari for reading, and shrinking the browser's width forces lines of text to not get too long if the website's styling isn't setting that manually.
When I use the Window menu, Zoom replicates what double-clicking the top title bar does, while Fill maximizes the window. This holds true with the behavior you describe in Safari as well.
It just seems like a lot of apps treat Zoom and Fill the same now (I tried Calendar, Notes, TextEdit, and NetNewsWire), which adds to the confusion.
After I got used to working in windows instead of full screen all the time, I can't really go back. Even on Windows I find myself working the way I do on macOS. Full screening every app made more sense on a 1024x768 screen (or smaller). Once I moved to a widescreen display (which happened to coincide with getting my first mac) running full screen felt like the wrong move most of time.
Web pages would look something like this:
| <- whitespace -> | <- content ->    | <- whitespace -> |
|                  | Lorem ipsum      |                  |
|                  | dolor sit amet,  |                  |
|                  | consectetur      |                  |
|                  | adipiscing       |                  |
|                  | elit. Morbi      |                  |
|                  | convallis ante   |                  |
Making the window smaller meant less wasted space and less blinding white space. Once I got used to that idea, it carried over to most other apps. I am currently running a 16" display at a similar fractionally scaled resolution (because Apple stopped understanding DPI after shipping the first LaserWriter, apparently).
Over that time, my eyes have not gotten better to match display DPI, so I'd rather have web sites just adjust the font size so that there are a reasonable number of words per line instead of rendering whitespace.
Non-full-screen windows would make more sense if Apple supported tiling properly, like most Linux WMs and also modern Windows.
MacOS sort of supports tiling in a "program manager shipped it + got promoted" sort of way, but you have to hover over the window manager buttons, which is slower than just manually arranging stuff. If there are any keyboard shortcuts to invoke tiling, or a way to change the WM buttons to not suck, I have not found them.
As for tiling in macOS...
You can use the mouse to drag windows into tiled positions. Grab a window, and when your cursor hits the side, corner, or top edge of the screen, it will indicate the tiling position, much like AeroSnap on Windows from some years back. You can also hold the Option key while dragging a window to get the tiling regions to show up without moving all the way to the edge.
Keyboard shortcuts exist as well. Go to Settings -> Keyboard -> Keyboard Shortcuts... In the dialog that opens, go to Windows. There you can see all the options and customize them if you'd like. Or set shortcuts for things that might not have one yet.
If for some reason dragging the windows around doesn't work, go to Settings -> Desktop & Dock -> the Windows heading. There are toggles to enable or disable dragging to tile, and the Option key trick. You can also turn off the margins on tiled windows, which you'd probably want to do.
I've never been a big fan of window tiling myself. There was a time when I needed a lot of different windows visible at all times, but that hasn't been the case in a long time. I find tiling makes things too big or small, it's never what I actually want. I drag the window up to the top of the screen to invoke Fill from time to time, but that's about it.
That was the Mac in the 1990s. It was designed for, and highly usable with, a one-button mouse. It didn't have hidden context menus or obscure keyboard shortcuts. Everything was visible in the menu bar and discoverable. The Finder was spatially aware with a high degree of persistence that allowed you to develop muscle memory for where icons would appear onscreen every time you opened a folder.
There was almost nothing hidden or lurking in the background, unlike today (my modern Mac system has 500 running processes right now, despite having only 15 applications open). We've had decades of feature creep since the classic Mac OS, which has made modern Macs extremely hard to use (relatively speaking).
Why is it that some of the most useful features in Apple products are impossible to find on your own? I recently also learned about "three finger swipe to undo" in iOS instead of shaking the damn thing like it owes me money.
It works well for me, makes it easy to get two things side by side without wasting space.
Full Screen Mode was their answer to maximize, going back many years now (10.7).
Obviously all of that works better if Finder windows don't usually fill the screen, but it's not a hard requirement.
(IMO the spatial Finder was designed around floppies and small folders and didn't work so well with hierarchical folder views, so no big loss...)
I've never found a setup with multiple desktops or similar that lets me switch between the apps I'm using more quickly than "editor slightly more to the left, browser slightly more to the right, ..." and just clicking on a border I know brings that app to the front. I'm sure many think I'm crazy. That's ok. :)
That said, I generally hate the new OSX UI. Every non-interactive UI element just became larger and wastes space I should be able to utilize. Likewise, it made some operations insanely frustrating (here's looking at you, corner-drag resize!).
I haven’t maximized a window in years. They look ridiculous like that. Especially web pages with their max width set so the content is 1/4 the screen and 3/4 whitespace.
If I ever accidentally full screen a window, and it’s not in night mode, I am instantly blinded by a wall of mostly white empty background!
I frequently use macOS on a projector, it doesn't quite fill my wall floor to ceiling but it comes close. I don't use full screen often, but I do it occasionally as a focusing strategy, and it's fine.
You're shining a bright light on a wall, which you are looking at.
With a monitor you are shining a bright light at your face, while staring directly at the lightbulb!
If you're using a monitor in the dark the way you use a projector, you should turn the backlight down. If you're using it in a well lit room, the brighter backlight should have less of an effect.
It sounds to me like you've never actually looked at a monitor displaying large swaths of white before. It's brighter than light hitting a wall for sure, even with the brightness down, and extra so when the ambient lighting is dark too.
The fact that it's bright outside when the sun is up might help, but it's nowhere near enough to compensate!
It’s probably a me problem, but I’m going to open stuff and then leave it scattered around all day. It’s fine.
I don’t use more than a couple of virtual desktops either. Just one for current tasks and one for background apps.
My actual biggest pet peeve with this setup is the vast number of web sites that deliberately choose to limit their content to a tiny column centered horizontally in my browser, with 10cm of wasted whitespace on each side.
The assumption is that the window should be the size of the content of the document inside.
It turns out that this approach works well for many applications, especially what the mac was designed for in the 80s and 90s. And it's horrid for modern "pro" applications.
I sometimes maximize something - other than video calls: those are always full-size - on the laptop screen, but otherwise not at all.
I can see how a full-screen IDE makes sense, but I don't use one, so I always want a couple of terminal sessions running alongside my editor.
There are vanishingly few contexts in which I find full-screen helpful. Not criticizing anyone else, or recommending my way of working, but it's what works for me.
[0] I would like better support for desktop management: naming and shortcutting, particularly. Years ago I tried some (I think it was Alfred, or a predecessor) add-on that promised that, but it was super flaky. Does anything exist that works well?
It’s so ingrained I tend to get frustrated on other desktops, which are nearly all built around the Windows mentality of keeping displays filled to the brim with tiled or maximized windows.
Even on the handful of times with maximize/tile on macOS, it’s with a gap of a few pixels of desktop peeking through so it doesn’t feel as “boxed in” and claustrophobic.
I think there's a conflict between the users who use it on Studio Displays and users who use it on 13-inch laptops. The Mac team at Apple won't pick a side or come up with two solutions.
That's not completely true, they've been pushing swipe between fullscreen apps for a while.
But that doesn't make any sense on an iMac.
So the recommendation from pro users is to use Alfred to manage windows.
The other day I was explaining to one that their design's fixed width looks silly once it gets up towards 4K resolutions. But the designer's main concern was whether people actually used full-screen browsers on 4K monitors, and if there was any point in thinking about the design at that resolution.
There are plenty of times I enjoy having 2 browsers side by side, or even 4 browsers in a square, and being able to do that is one of the benefits of having a 4K monitor. But without a doubt, the majority of my time is spent with a full-size browser window open, and I observe the same from all the other Windows/Linux users I manage who use a 4K monitor.
In service of keeping this post simple, I've ignored system display/UI scaling. But still... the question/assumption from the Mac designer completely blew my mind.
1. On a screen share support call with a mac user
2. Asked them to pull up a webpage
3. They pull up a super tiny ass browser window to the point I can't really see anything
4. I ask them to full screen the browser so we can actually read shit
5. The Mac user just straight up panics or acts like I've spoken an alien language to them.
The same process happens when I need a Mac user to get to an app's settings that on a Windows/Linux computer would normally be under something like File > Preferences/Settings. They have no idea what I'm talking about, or know just barely enough to know they don't remember how to do it, and panic. Then I have to go google it and remember that CMD+comma (⌘+,) exists and reveal it to the Mac user like it's actual black magic. And then I immediately forget about it until 6 months later, when I need to support a Mac user again and repeat the whole cycle.
I can’t tell if this is a serious comment or humor.
eta: I'm just saying, if I had a glowing half-drunk beer or partially eaten pizza on my laptop in a business meeting, I am getting weird looks. Just because you all normalized glowing fruit doesn't mean the rest of us take you seriously.
However, after the internship I went right back to fullscreen/window tiling in linux, so I can't say I really preferred it. Even now as a Gnome user with a big monitor and magic trackpad on my desk - which gives me ~equal access to either approach - I fullscreen everything.
Another component is how the ability to overlap windows is emphasized, allowing the currently relevant portion of them to be visible without taking center stage or stealing any space from your main window(s).
Both are part of a larger difference in mentality and workflow style.
I do have Rectangle installed, so apps generally get at most the left or right half of the screen, with a shortcut for badly behaved websites that need 2/3 to look right. Apps are usually pretty good about remembering window positions, so mostly you futz with it once and you're done.
But for other apps where interactions tend to be brief like Finder, Messages, Notes, Music, etc - yeah I don't usually expand them to full screen.
On large external monitors, I think it makes total sense not to have every window maximized, though. Probably less usable that way.
Also just want to be 100% clear: Tahoe is bad and I hate the changes and I don't think the OS should prefer one way of working over the other. I just hope it's helpful to explain my perspective.
I have a 39" ultrawide and I keep every window maximized. I have OCD about this. I can't stand things all layered on top of each other. I like to focus on one screen at a time.
Chromium browsers have been rolling out split tabs and I use that on a couple of tasks where I'm constantly cutting/pasting between sites, but that's about it.
That said, I am a huge fan of manual window management.
Somewhat relatedly, we use Windows at work, and it drives me crazy when I hop on a computer after someone's been using it and they have every single thing maximized, even Windows Explorer, on 27" monitors. A maximized browser, I get... I don't do it myself but I understand how it can be useful, but maximizing Windows Explorer is just insane to me, and yet a lot of my coworkers do it.
A lot of stupid things about Mac window management stems from the mistake of forcing all applications to share a single menu that's glued to the top of the screen. This essentially turns your entire desktop into ONE application's window, within which its actual windows float around.
Historically this led to the Mac's penchant for apps that spawned an irritating flotilla of windows that you had to herd around your screen. Not only did this deny users a way to minimize the whole app at once, but it also sucked because you could see everything on your desktop (or other apps' UIs) THROUGH the UI of the application you were trying to use. A dysfunctional mess.
Around 15 years ago, I estimate, the huge advantage of a single application window finally permeated the Apple mindset, and things have gotten much better in that regard. But Apple should have abandoned the single menu in the transition to OS X and put menus where they belong: on applications' main frames. That would make the desktop a truly unlimited workspace and eliminate the daily irritation of the menu changing its contents behind the user's back because he clicked on another application's window (perhaps to move it).
Multiple times a day I minimize an application and then attempt to do something in the application that's now filling the screen... only to find that the menu still belongs to the application that isn't even shown. It's just so dumb.
But then... this is the GUI that, for decades, would only let you resize windows from ONE corner and NO edges. Apple grudgingly, half-assedly, and unreliably addressed that in the 2000s, only now to make it even less reliable in the shambolic Tahoe UI.
But I do use apps fullscreen when I'm traveling. The laptop screen is too small to use Chrome or VS Code any other way.
In general my browser is dead center or slightly to the right so I can access my other windows (terminal, throw away text editor, etc) easily where command tab is insufficient (when I have multiple terminal windows, eg)
Strange, I constantly get annoyed by how slow and unresponsive the Mac's tiling is when dragging windows to the edge. At the top it has at least half a second delay for no reason. But at least the newest version now has caught up with Windows 7.
Yes (but not for a browser). My terminal windows are 80x24, pretty much always. I do this today on Linux, I've done it through multiple versions of Windows, and I did it in my childhood on a 9" B&W "luggable" Mac screen.
I just like it, okay?
As you said, browser and IDE are the big exceptions, plus things like Lightroom or my 3d printer's slicer.
Even VS Code usually lives as a smaller window when I'm using more a text editor rather than as an IDE.
I have been using it for years, and I just gave up entirely on managing anything; if I zoom out to see all my windows, it looks like the freaking Milky Way, made of windows I forgot about.
So in response, I changed my windowing strategy to having a set of windows floating around at exactly the size I want them, and then the advantage of the enormous screen is just how many windows I can have open at once.
That being said, I use KDE, not macOS, and 90% of Mac users, I'd guess, are on laptops, where this strategy sounds completely insane to me. On laptops I still default to fullscreening or "half-screening" most apps.
I would in fact say that the culture of not maximizing windows was a small reason why I switched to Mac OS X in the early 2000s.
Or just use the taskbar, which is literally made for switching between windows. Or it was, before Microsoft forgot its purpose.
Meanwhile, I want to use my graphical, multi-window, preemptive-multitasking operating system to, you know, use multiple applications at the same time.
Trying to maximize a window, even 23 years ago when I first moved to OS X, was a completely manual process. It was designed around windows, not walls. And screens were much smaller and lower res back then.
This goes towards something that I've felt for a little while: at some point in time around the early 2000s, operating system vendors abdicated their responsibility to innovate on interaction metaphors.
What I mean is, things like tabbed interfaces got popularized by Web browsers, not operating systems. Google Chrome and Firefox had to go out of their way to render tabs; there was no support built into the OS.
The OS interfaces we have now are not appreciably different from what we had in the early 2000s. It seems absurd that there has been almost no progress in the last 25 years. What change there has been feels like it could have been accomplished in user-space, plus it doesn't get applied consistently across applications, thus making it feel like not a core part of the OS.
MacOS in particular was supposed to place an emphasis on the desktop environment being the space of window- and document-level manipulation, as exemplified by the fact that applications did not have their own menu bars. All application menu bars were integrated together at the top of the screen. Why should it be any different with any other UI organizational feature? Shouldn't apps merely be a single window pane, accomplishing a single thing, so that you combine multiple apps together to get something akin to an IDE out of them?
Well, I don't know if they should be. But they can't. Because OS vendors never provided a good means to do it. Even after signalling they wanted it.
Look at how older versions of Word, Excel, and Visual Studio worked. The tool trays stay consistent as you move between document windows. The entire application is minimizable and quittable together as one.
Photoshop still uses this metaphor. In the early and mid-2000s, Photoshop on Windows had a window for the application separate from the documents, but on Apple's OS 9 and OS X, the only representation of the application itself was in the menu bar. Document windows and tool-tray windows both floated in the same desktop space as every other window.
I haven't checked on the GNU Image Manipulation Program, but I seem to remember it retained the same "no application window, tooltrays and doc windows exist in the DE" metaphor for much longer than Photoshop.
There is also a difference in the way that Chrome renders tabs in the window title area. That's a part of the UI chrome that one would expect to be in the purview of the UI toolkit, but Google took it on themselves.
https://en.wikipedia.org/wiki/Tab_(interface)
Don Hopkins himself can enlighten us about it (NeWS) better than literally anyone in this thread, just wait.
I suppose you could splurge for a Mac desktop and then get the cheapest, smallest screen possible, but I hope it’s rare.
Any space not used for the task I'm focused on is wasted. For me the actual problem is that switching apps/windows is too slow because of UI animations.
I'd like to be able to snap things to the middle third, especially on the ultrawides.
Only little calculator widgets, property panels, and modal dialogs that get immediately closed after use don't get maximized or at least docked to fill some region. I hate the cluttered, layered feeling of having a bunch of non-full-screen windows overlapping, I want to have a dozen apps open and making optimal use of the available display area.
Lots of native applications also pop up multiple windows with the expectation that they kind of just float around. But at least in Mac you can scroll on an app that isn’t in focus…
Now they sell expensive but nice hardware and they have mediocre software.
It seems you can only choose two out of three: nice hardware, nice software, good price. Apple always chooses high price, and they give customers either nice hardware or nice software, but not both.