Quote: "GetDeviceWidth has to return 1280 in this case as it represents the number of pixels being drawn"
It wasn't doing that until recently; it was matching the value set via SetWindowSize.
Quote: "That sounds like it could work. @Qube_ would this work for you?"
Adding a new command wouldn't really be cross-platform compliant, because:
I use GetMaxDeviceWidth / Height and GetDeviceWidth / Height to centre a window in the middle of the desktop. I also use those commands to scale up a window.
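For context, the centring code is essentially this (a minimal sketch with a hypothetical 1280 x 720 window; it assumes SetWindowSize, SetWindowPosition and GetMaxDeviceWidth / Height all speak the same units):

    // minimal sketch of the centring code described above;
    // the 1280 x 720 size is hypothetical, and it assumes SetWindowSize,
    // SetWindowPosition and GetMaxDeviceWidth / Height all use the same units
    winW = 1280
    winH = 720
    SetWindowSize(winW, winH, 0)
    SetWindowPosition((GetMaxDeviceWidth() - winW) / 2, (GetMaxDeviceHeight() - winH) / 2)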
So having new commands really wouldn't help here, as:
GetMaxDeviceWidth / Height returns the retina size
GetDeviceWidth / Height returns the retina size
SetWindowSize is set via non-retina sizes but internally draws the window at retina resolution.
New GetWindowWidth / Height commands couldn't be used to centre a window in the middle of the desktop without doing a * 2, which then wouldn't work on Windows, Linux or anything non-retina (see the sketch below).
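To illustrate, with mixed units the centring sketch above would need a platform-dependent fudge along these lines (hypothetical; AGK has no retina-detection command, so the check here is made up):

    // hypothetical fix-up that mixed units would force on us
    desktopW = GetMaxDeviceWidth()
    desktopH = GetMaxDeviceHeight()
    // IsRetinaMac() is a made-up check; AGK has no such command,
    // which is exactly why this can't be written cross-platform
    if IsRetinaMac() = 1
        desktopW = desktopW / 2
        desktopH = desktopH / 2
    endif
    SetWindowPosition((desktopW - 1280) / 2, (desktopH - 720) / 2)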
Every one of those commands should return non-retina sizes (just like SetWindowSize requires). That way, code that works on a Mac also works on Windows, Linux and everything else. macOS itself doesn't refer to resolutions like 5120 x 2880; it just refers to it as 2560 x 1440 (but HiDPI).
What we have now on retina is every single drawing command using non-retina sizes, SetWindowSize using non-retina sizes, but GetDeviceWidth / Height (and the Max versions) returning retina sizes. It doesn't make sense.
As mentioned, this was all returning non-retina results in previous versions and had been working for many, many months.