Replies: 3 comments 2 replies
-
What would help would be re-exposing the Snow Leopard scaling option: then we could easily do 2x and fractional scaling, since the OS supported that before and appears to still (at a GPU cost). I would think it'd be cleaner to figure out the undocumented API than to deal with EDIDs. Also, so far I'm not seeing an improvement in sharpness with remote software. I can see the native resolution of the remote client side clearly, but when I select the default 2x scale (1440x900) on my Retina MacBook it all blurs, because Nuords and Jump Desktop rescale, I guess to save bandwidth, instead of assuming I actually need those pixels.
-
Hi, macOS initially had some half-baked support for resolution independence (AppleDisplayScaleFactor), much like Windows does today. When Apple introduced Retina displays, they ditched this in favor of fixed 2x rendering. Current macOS versions render the display at either 1x or 2x; anything in between is a bitmap-based scaling of the 2x rendering, which will never be as sharp as exact 2x scaling. When Apple did this they may have expected mainstream high-resolution displays to end up with slightly higher PPI, but that did not happen: the common 27" or 32" 4K screens have a PPI that requires some in-between scaling, so they are not as sharp as a result, though still somewhat OK. This is one reason Apple uses special, rather costly panels in its own products (4.5K at 24", 5K at 27", 6K at 32"), which look good with an optimal UI size at exact 2x scaling.
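A rough sketch of the arithmetic behind that "render at 2x, then bitmap-scale" behavior. The helper names here are mine for illustration, not any macOS API; the numbers use a common 27" 4K panel set to the popular "looks like 2560x1440" scaled resolution:

```python
def backing_resolution(looks_like):
    """For a scaled ("looks like") resolution, macOS renders the UI into a
    backing buffer at exactly 2x in each dimension."""
    w, h = looks_like
    return (w * 2, h * 2)

def panel_scale_factor(backing, native):
    """The backing buffer is then bitmap-scaled by this factor to fit the
    panel's native pixel grid; any value other than 1.0 costs sharpness."""
    return native[0] / backing[0]

native = (3840, 2160)       # 27" 4K panel
looks_like = (2560, 1440)   # chosen UI size
backing = backing_resolution(looks_like)          # (5120, 2880)
factor = panel_scale_factor(backing, native)      # 0.75, a non-integer resample
print(backing, factor)
```

With a true 5K panel (5120x2880) the factor would be exactly 1.0, which is why Apple's own 5K-at-27" displays stay perfectly sharp at the same UI size.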
-
While Apple does indeed use perfect 2x scaling, with panels at exactly 2x the resolution of their target perceptual resolution, they still fractionally scale on top of that 2x scaling, which is really more what I'm asking about, as I'm sure it's still just an undocumented call. I assume their "fractional" scaling always does the 2x render first before rescaling again, at least for any scaled resolution not considered "default", which is why it can cause some CPU or GPU usage. This behind-the-scenes work is very apparent when taking a screenshot, and even more so when a Retina display is working alongside a non-Retina screen. Taking a screenshot of both while they were perfectly scaled is what led me to write betterScale a while back, once I better understood how Linux distros do the exact same thing with their own 2x scaler plus xrandr for fractional scaling. It's literally no different.
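The Linux analogy above can be made concrete. On X11 the common recipe is an integer 2x toolkit render combined with an xrandr output transform that stretches the framebuffer, so the effective UI scale is render_scale divided by the xrandr factor. A small sketch (the function name and the eDP-1 output name are illustrative, not from any tool):

```python
def xrandr_scale_arg(render_scale, desired_scale):
    """xrandr --scale factor needed so that an integer render scale
    (e.g. 2x via GDK_SCALE=2) nets out to the desired fractional scale.
    effective_scale = render_scale / xrandr_factor."""
    return render_scale / desired_scale

# For an effective 1.5x UI on a 2x-rendered desktop, stretch by 2/1.5:
factor = xrandr_scale_arg(2, 1.5)
print(f"xrandr --output eDP-1 --scale {factor:.3f}x{factor:.3f}")
```

This is the same shape as the macOS pipeline: an exact integer render first, then one non-integer resample to the panel.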
-
I am trying to pair this software with tools like Nuords RDP and Jump Desktop's Fluid protocol, but so far on HiDPI/Retina monitors the result is still blurry. Is there any way to enable the scaling and let users better control which scale factor they are working with?
Thanks! This is a wonderful app by the way.