Posted by victorbjorklund 8 hours ago
Both Windows and Linux (Wayland) support scaling the UI itself, and with their support for sub-pixel anti-aliasing (that macOS also lacks) this makes text look a lot more crisp.
Meanwhile in Linux the scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application that has a tiny mouse cursor.
And then Windows has serious problems with old apps - blurry as hell with a high DPI display.
Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work with OLED anyway, because the subpixels are arranged differently than on a conventional LCD.
[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.
This would've been easily solved with non-integer scaling, if Apple had implemented that.
(I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
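For what it's worth, the ~115 pixels-per-degree figure checks out with back-of-envelope math. A quick sketch (assuming flat 16:9 panels and a small-angle approximation):

```python
import math

def pixels_per_degree(h_res, v_res, diag_in, dist_m):
    # Physical width of a flat panel from its diagonal and aspect ratio
    width_in = diag_in * h_res / math.hypot(h_res, v_res)
    ppi = h_res / width_in
    dist_in = dist_m / 0.0254
    # Pixels subtended by one degree of visual angle (small-angle approx.)
    return ppi * dist_in * math.tan(math.radians(1))

print(round(pixels_per_degree(3840, 2160, 48, 1.75)))  # 48" 4K TV at ~1.75 m -> ~110
print(round(pixels_per_degree(3840, 2160, 27, 1.0)))   # 27" 4K at 1 m        -> ~112
```

Both setups land within a couple of pixels per degree of each other, which matches the "works out similarly" observation.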
Some indie Mac developers even started implementing support for it in anticipation of it being officially enabled. The code was present in 10.4 through 10.6 and possibly later, although not enabled by default. Apple gave up on the idea sadly and integer scaling is where we are.
Here’s a developer blog from 2006 playing with it:
> https://redsweater.com/blog/223/resolution-independent-fever
There was even documentation for getting ready to support resolution independence on Apple’s developer portal at one stage, but I sadly can’t find it today.
Here’s a news post from all the way back in 2004 discussing the in development feature in Mac OS tiger:
> https://forums.appleinsider.com/discussion/45544/mac-os-x-ti...
Lots of folks (myself included!) in the Mac software world were really excited for it back then. It would have permitted you to scale the UI to totally arbitrary sizes while maintaining sharpness etc.
> (I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
The TV is likely a healthier distance to keep your eyes focused on all day regardless, but were glasses not an option?
Once you get used to flicking in and out of zoom instead of leaning into the monitor it’s great.
As an aside, Windows and Linux share this property too nowadays. Using the screen magnifiers is equally pleasant on any of these OSes. I game on Linux these days and the magnifier there even works within games.
4K is not enough pixels at 27" for Retina scaling.
Apple uses 5K panels in their 27" displays for this reason.
There are several very good 27" 5K monitors on the market now around $700 to $800. Not as cheap as the 4K monitors but you have to pay for the pixel density.
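The density gap is simple to quantify with the standard diagonal-PPI formula:

```python
import math

def ppi(h_res, v_res, diag_in):
    # Pixel density from resolution and diagonal size
    return math.hypot(h_res, v_res) / diag_in

print(round(ppi(3840, 2160, 27)))  # 27" 4K -> ~163 PPI
print(round(ppi(5120, 2880, 27)))  # 27" 5K -> ~218 PPI, an exact 2x of 2560x1440
```

218 PPI gives pixel-perfect 2x scaling of the classic 2560x1440 desktop; 163 PPI is in between, so macOS has to render to a larger framebuffer and downscale.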
There are also driver boards that let you convert 27" 5K iMacs into external monitors. I don't recommend this lightly because it's not an easy mod but it's within reason for the motivated Hacker News audience.
I use a Mac with a monitor with these specs (a Dell of some kind, I don't know the model number off the top of my head), at 150% scaling, and it's not blurry at all.
Sadly, it basically never happened. There was the LG display that came out a couple of years later. It didn't have great reviews, and it was like two thirds the cost of an entire 5k iMac.
It took Apple over 7 years to release their standalone 5k display, and there are a few other true 5k displays (1440p screen real estate with quadruple-resolution, not the ultrawide 2160p displays branded as "5k") on the market now with prices just starting to drop below 1,000 USD.
Unfortunately in that time I've gotten used to the screen real estate of the ultrawide 1440p monitors (which are now ubiquitous, and hitting ridiculous sub-$300 prices). As of now, my perfect display for office work (gaming, video/photo work, or heavy media playback are different topics) would be 21:9 with 1440p screen real estate with quadruple-resolution—essentially just a wider version of that original 5k iMac display.
The only thing that holds back that thought lately is that I'm suddenly spending more and more time in multi-pane terminals, and my screen real estate needs have dropped. The only two things I greatly miss now on my laptop are keyboard quality and general comfort (monitor height, etc).
A decade later, it boggles my mind that it's so hard to find a retina-class desktop monitor. The successor to the Cinema Display is basically an iMac, and priced like it. There have very recently been releases from ASUS and BenQ, but it still feels like an underserved niche, rather than standard expectation.
All that is to say: hard cosign.
Edge use case I know.
Why can't it be something simple?
For me though, I am frequently working in different rooms with arbitrary lighting situations. Net effect of the gloss is negative for me unquestionably.
I used to daily drive an apple thunderbolt display (the last non-retina one, 2560x1440). That thing was atrocious. I could often see the reflections of my glasses, or a white glare if I was wearing a white shirt. This was at night, in a dark office (lights off, just whatever light came in from the street).
I'm typing this on a matte "ips black" dell ultrasharp something-or-other at 10% brightness, wearing glasses, a white t-shirt, with an overhead light, and see no reflection or glare on my screen. There's no way in hell I'd go back to a shiny screen.
I understand "anti-glare" technology has improved. The most recent apple screen I've tested is an m1 mbp. It seems somewhat better than my 2013 mbp, but still a worse experience than my 2015 (or thereabouts) 24"@4k dell, which is pretty old technology. My 2025 lenovo has a screen that's much more comfortable to use inside.
Paradoxically, I'd say the one environment where I prefer my macs to my matte screens is in bright sunlight. Sure, there are more reflections than you can shake a stick at, but there's always an angle where you can see the part of the screen you want. You have to move around, which is obviously annoying, but you can see. The matte screens just turn to mush. Luckily for me, I hate being out in the sun, so I never encounter this situation in practice.
I think the "frost" you're talking about depends a lot on the screen implementation. I tested once an HP model, 27"@4k, and it did have such an effect. Anecdotally, it didn't handle reflections all that well, either. So maybe it's just a question of lower quality product?
OLED smartphones have much higher ppi to deal with this.
https://www.tomshardware.com/monitors/lg-display-reveals-wor...
Not anymore, as long as you make sure that any RGB antialiasing is turned off. Linux defaults to disabling this and doing only grayscale antialiasing, so it looks great on an OLED out of the box. Windows can be configured to do this.
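On fontconfig-based Linux setups, one way to pin this down explicitly is a minimal user config (the file path and the distro defaults vary, so treat this as a sketch):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- e.g. ~/.config/fontconfig/fonts.conf: force grayscale antialiasing
     by disabling subpixel (RGB) rendering -->
<fontconfig>
  <match target="font">
    <edit name="rgba" mode="assign">
      <const>none</const>
    </edit>
  </match>
</fontconfig>
```

Setting `rgba` to `none` disables the subpixel layout assumption entirely, which is what you want on panels whose subpixels aren't a standard horizontal RGB stripe.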
I presume you also mean "when used for text heavy work" here, yes? Or do you mean that these displays tire out your eyes even when used "for what they're for", i.e. gaming? (Because that's a very interesting assertion if so, and I'd like to go into depth about it.)
I have an ASUS ProArt Display 27” 5K. And I somewhat regret it.
I love the pixel density. But I don’t love the matte finish. Which is apparently a controversial take. But I really don’t. I like the crisp pop of typography you get with a glossy display. And, for UI design, the matte finish just doesn’t “feel” like the average end-user experience. I am constantly pushing Figma between my laptop display and my monitor to better simulate what a design will look like on an average glossy LCD or OLED display.
I got a deal on a used one last year and I love it. It's the only monitor I've used plugged into a MacBook that didn't look notably off (worse) compared to the MacBook's display sitting next to it. Only thing a bit jarring is it's 60Hz but I can live with it.
Asus has picked up the 5k 27" monitor from LG, it's the $730 PA27JCV
I constantly see people saying Apple displays are a terrible value. Last Apple display I had was the Thunderbolt 27 but from now on I'm sticking with Apple.
I've had nothing but issues with non-Apple monitors as well. Customer service ime is non-existent if you need a repair. For something I rely on to get work done, I'm starting to think the premium is worth it.
And somehow they completely forgot how to seamlessly work with displays in general. Connect multiple displays via Thunderbolt? Nope. Keep layouts when switching displays? No. Running any display at more than 60Hz? No. Remember monitor positions? No.
There are even 240Hz displays.
IIRC Apple couldn't get above 60Hz even on third-party displays they explicitly advertised.
Make sure your dock, dongle, and/or cables aren’t bottlenecks.
I've switched docks, dongles, cables, to no avail.
Support also varies a lot between M chips, and Thunderbolt often doesn't support high refresh rates https://support.apple.com/en-us/101571
I can't remember now the actual setup I had, sadly
Both of my LG ultrawides work at 144Hz?
At the time, people were "marveling" at the magic of Apple, and wondering how they did the math to make that display work within bandwidth constraints.
The simple answer was "by completely fucking with DP 1.4 DSC".
I had at the time a 2019 (cheesegrater) Mac Pro. I had two Asus 27" 4K HDR 144Hz monitors, that the Mac had no problems driving under Catalina.
Install Big Sur. Nope. With the monitors advertising DP 1.4, my options were SDR@95Hz, HDR@60Hz. I wasn't the only one, hundreds of people complaining, different monitors, cards, cables.
I could downgrade to Catalina: HDR@144Hz sprung back to life.
Hell, I could on the monitors tell them to advertise DP 1.2 support, which actually improved performance, and I think I got SDR@120Hz, HDR@95Hz (IIRC).
So you don't deserve downvotes on this. Apple absolutely ignored standards and broke functionality for third party screens just to get the Pro Display XDR (which, ironically, I own, although now it's being driven by an M2 Studio, versus the space heater that was the Xeon cheesegrater).
Here are some monitors you can buy at that price point:
- 6k 32” monitor (similar PPI) (Acer PE320QX)
- most high-end 4k displays (even OLEDs) with 144hz+ refresh rate
32” 4k isn’t great PPI, but it’s still fine PPI, at a reasonable distance. Double the refresh rate is a much more noticeable improvement to me than 40% better pixel density, at a distance where retina matters a bit less than laptops & handhelds. And you can get that for less than half the cost
Plus, you can get it with multiple outputs & KVM to switch between MacBook & PC. And still run it off a single USB C cable.
Usually these exist only to bump the price of the pro model.
120 Hz can also noticeably improve frame pacing for 24p video*.
120 Hz vs 144 Hz? Barely noticeable when flipping between the two. Not sure if I'd pass an ABX test with 100% accuracy.
Can't speak for 240 Hz or higher, as I haven't used them.
* Though 119.88 Hz is probably a better default for this since most non-DCI "24p" video is still 23.976 FPS; this is changing, but until browsers and streaming apps support VRR for video, I'm not convinced this is a good thing due to the mountain of legacy 23.976 FPS content.
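The pacing argument is just cadence arithmetic. A sketch for ideal 24.000 fps content (setting aside the 23.976 wrinkle in the footnote):

```python
import math

def cadence(refresh_hz, fps, frames=4):
    """How many display refreshes each source frame is held for."""
    step = refresh_hz / fps
    counts, shown = [], 0
    for i in range(1, frames + 1):
        total = math.floor(i * step)  # refreshes elapsed after frame i
        counts.append(total - shown)
        shown = total
    return counts

print(cadence(60, 24))   # [2, 3, 2, 3] -> alternating hold times (3:2 pulldown judder)
print(cadence(120, 24))  # [5, 5, 5, 5] -> every frame held a uniform ~41.7 ms
```

At 60 Hz each film frame is shown for either 33.3 ms or 50 ms, which is the visible stutter; at 120 Hz the cadence is uniform.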
It's night and day when you're going back and forth between looking at them and wiggling your mouse around in circles. But after a few seconds of being focused on your work, you're not thinking about it anymore.
Being able to watch 24fps video without non-integer frame weirdness is the only real advantage outside of twitch-reaction gaming.
I personally wouldn’t buy a new LCD based display anymore at this price. There are flaws inherent to the technology that affect all of my recent Apple displays (Studio Display, M1 iPad Pro, M1 Pro MBP, M4 Pro MBP). After using OLED TVs and OLED iPhones for years, it’s very difficult to look past LCD’s issues (edge yellowing+dimming specifically affects all my Apple screens more than I am happy with).
There are no reviews/studies on long-term aging of Apple’s LCD displays, so all of this should be taken with a grain of salt, maybe my devices are just unlucky.
I don’t know if the Pro XDR line is better or how that would carry over to the Studio XDR. I haven’t seen many complaints about the Pro XDR, but the Studio Display form factor has a different cooling design which would affect longevity.
I will say I can never go back from retina resolution text, and that alone has made the experience of Studio Display good. If we could get OLED it would be perfection. I think I would have to see the XDR in practice to be convinced, but 120hz requiring a whole new computer does make it a non-starter for me.
I bought my OLED TV when the fearmongering was at its highest, and it still works perfectly with zero burn-in. So it is definitely possible. I bought the TV 8 years ago.
> A US Department of Energy paper shows that the expected lifespans of OLED lighting products goes down with increasing brightness, with an expected lifespan of 40,000 hours at 25% brightness, or 10,000 hours at 100% brightness
> Mac models with M1, M1 Pro, M1 Max, M1 Ultra, M2, and M3 support Studio Display XDR at up to 60Hz. All other Studio Display XDR features are supported.
Also, the base M4 doesn’t have Thunderbolt 5 and it supports 120 Hz.
Can you point me to said list? All I could find was:
> Mac models with M1, M1 Pro, M1 Max, M1 Ultra, M2, and M3 support Studio Display XDR at up to 60Hz. All other Studio Display XDR features are supported.
And The Verge reports:
> There’s also support for adaptive sync that can adjust between 47Hz and 120Hz (if it’s connected to an M4 Mac or later, or the M5 iPad Pro)
I got an M3 Max and was strongly considering upgrading my old monitor, but if I can't do 120hz, I'll just wait until I upgrade my laptop as well.
There’s no list per se. The MacBook Pro (2021 and later) is listed as supported. The M3 Pro and M3 Max are not listed as only supporting 60Hz, while the M3 and M1 Pro are.
Dell U4025QW: 5120 x 2160 = 11,059,200 vs Apple Studio Display XDR: 5120 x 2880 = 14,745,600
So your display has 25% fewer pixels.
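Spelled out:

```python
dell = 5120 * 2160    # Dell U4025QW
panel = 5120 * 2880   # the 5K panel quoted above
print(dell, panel, 1 - dell / panel)  # 11059200 14745600 0.25
```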
I had no idea what it was for the longest time. As it turns out, macOS frequently enables it even when it’s unnecessary, and without any way to override.
(Notice how they listed the M1 chips individually.)
Also works great with other sources like an Xbox
I used a Pro Display XDR as my daily driver at work and the difference is minimal
Nano texture in mixed lighting scenarios is worth every penny even on a lower resolution and lower refresh rate panel.
Also the non-XDR is only a small upgrade otherwise, no 120Hz, no HDR, only Thunderbolt 5 and a new camera. Finally a downstream Thunderbolt port though.
This is all after 4 years?
- Design Patterns by the Gang of Four
- Modern C++ Design by Andrei Alexandrescu
- Code Complete from the Microsoft Press
That's enough old paper to raise the display height to a comfortable level.
The camera is still 12MP but offers Desk View. Maybe this is a feature unlocked by the improved onboard A-series chip (A19?).
I wouldn't sniff at Thunderbolt 5, though. Thunderbolt 5 doubles throughput to 80 Gbps from 40.
Would have loved refresh above 60Hz but then who's gonna get the XDR?
just buy a nice one on amazon for $100, it's still VESA mounts
Sure, most of the time the cable seems secure enough to maintain connection when I accidentally nudge the laptop. But every once in a while, when I slightly shift the laptop here or there, flicker and everything goes batshit. The monitor loses connection, so maybe (depending on config) the laptop screen changes resolution and then eventually reconnects and flickers and changes back. Or the network drops out (if I'm connected to Ethernet over Thunderbolt). Or a program freaks out because the drive it was using disappeared. Or the laptop really freaks out and kernel panics.
Like I said, it doesn't happen a ton, but it's happened a handful of times over the years, just enough that now I always use an external mouse and keyboard with a docked laptop to avoid such nonsense.
- 5K resolution at HiDPI (27 inch)
- 120hz refresh rate
- TB5 and single cable connectivity.
There are a couple of other HiDPI displays at 5K with 120Hz refresh rate, but they don't do TB5.
But even so, these 2 new monitors still don’t support multiple inputs.
It looks like a nice display, but that’s a deal killer for me.
I guess we'll see how good the DP Alt Mode support is; I'm not sure how much bandwidth it provides, so 120Hz might be out of the question. But for now that has been a simple way to get around the lack of multiple display inputs; you just needed a separate KVM switch for it.