Modern macOS laptops have started cutting out part of the screen for the camera, similar to mobile phones. In windowed mode this is not an issue, as that area is covered by the system menu bar and is therefore not our concern. In full-screen mode, however, the ThinLinc client uses the entire screen, and part of the session is obscured by the camera notch. The remote system is not aware of the notch and cannot compensate for it, meaning that important parts of the screen may be hidden. macOS's native full-screen mode solves this by shrinking the usable area to below the notch, and we should be able to do something similar. It might get a bit extra tricky when multiple monitors are used, though.
I found the documentation for querying where the notch is: https://developer.apple.com/documentation/appkit/nsscreen/safeareainsets?language=objc Unfortunately, it requires macOS 12, which is beyond even what we plan to require. Hopefully, there is some way to find this method dynamically. It is an Objective-C call, though, so it's not as simple as a dlsym() call.
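The Objective-C equivalent of a dlsym() probe is a respondsToSelector: check. One possible approach (a sketch, not tested against all SDK versions): declare the safeAreaInsets selector ourselves when building against a pre-12 SDK so the compiler knows the return ABI, then probe for it at runtime before calling it. The SafeAreaCompat category name and the screenSafeAreaInsets() helper are made up for this example.

```objc
#import <AppKit/AppKit.h>

/*
 * When building against an SDK older than macOS 12, -safeAreaInsets is
 * not declared, so declare it ourselves. The guard avoids a duplicate
 * declaration on newer SDKs. (Untested sketch.)
 */
#if !defined(MAC_OS_VERSION_12_0) || \
    MAC_OS_X_VERSION_MAX_ALLOWED < MAC_OS_VERSION_12_0
@interface NSScreen (SafeAreaCompat)
- (NSEdgeInsets)safeAreaInsets;
@end
#endif

static NSEdgeInsets screenSafeAreaInsets(NSScreen *screen)
{
    /* Probe at runtime; the method only exists on macOS 12 and later. */
    if ([screen respondsToSelector:@selector(safeAreaInsets)])
        return [screen safeAreaInsets];

    /* Older systems have no notch to report, so assume zero insets. */
    return NSEdgeInsetsMake(0.0, 0.0, 0.0, 0.0);
}
```

If this works, the returned top inset should tell us how much to shrink the usable area per screen, which would also cover the multi-monitor case since each NSScreen reports its own insets. Note that on screens without a notch, macOS 12 still reports a non-zero top inset in native full-screen for the menu bar in some configurations, so the values would need sanity checking.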