UIScreen scale property differs between device, Simulator, and debugger

Originator: Panajev
Number: rdar://30030701    Date Originated: 14/01/2017
Status: Open    Resolved:
Product: Developer Tools    Product Version: Xcode 8.2.1
Classification: Serious Bug    Reproducible: Always
 
I am not sure whether this started with the iOS 10.2 Simulator included in Xcode 8.2, but I am seeing a clear difference in behaviour between a physical iPhone 7 Plus running iOS 10.2 and the iPhone 7 Plus Simulator running iOS 10.2: the exact same device model and OS version, just one physical and one simulated.

The application in question does not ship @3x assets or launch images for the 4.7" or 5.5" screens, so it currently runs in scaled (compatibility) mode, essentially displaying the iPhone 5 layout scaled up (we are working to address this soon).
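
For reference, one way to see what the app is actually being handed is to log the relevant UIScreen properties from an early code path. This is only a minimal sketch (the helper name and where it is called from are assumptions, not part of the original repro):

#import <UIKit/UIKit.h>

// Hypothetical helper; call it from any early code path, e.g. -viewDidLoad.
static void LogScreenInfo(void) {
    UIScreen *screen = [UIScreen mainScreen];
    NSLog(@"scale        = %.2f", screen.scale);        // scale the app sees; 2 is expected while the app runs scaled
    NSLog(@"nativeScale  = %.2f", screen.nativeScale);  // scale of the physical panel
    NSLog(@"bounds       = %@", NSStringFromCGRect(screen.bounds));
    NSLog(@"nativeBounds = %@", NSStringFromCGRect(screen.nativeBounds));
}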

Let's take the following code snippet for a CAGradientLayer

gradientMask.rasterizationScale = [UIScreen mainScreen].scale;
gradientMask.startPoint = CGPointMake(0, CGRectGetMidY(self.frame));

and put a breakpoint on the second line. Compile in debug mode (-O0 -g).
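
For context, a self-contained sketch of the kind of setup the snippet lives in (the surrounding GradientView class and the gradient configuration are assumptions; only the two marked lines matter for the repro):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface GradientView : UIView
@end

@implementation GradientView

- (void)layoutSubviews {
    [super layoutSubviews];

    CAGradientLayer *gradientMask = [CAGradientLayer layer];
    gradientMask.frame = self.bounds;
    gradientMask.colors = @[(id)[UIColor blackColor].CGColor,
                            (id)[UIColor clearColor].CGColor];

    // First line: store the scale the app was given.
    gradientMask.rasterizationScale = [UIScreen mainScreen].scale;
    // Second line: put the breakpoint here, then compare
    //   po gradientMask.rasterizationScale
    //   po [UIScreen mainScreen].scale
    gradientMask.startPoint = CGPointMake(0, CGRectGetMidY(self.frame));

    self.layer.mask = gradientMask;
}

@end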

* On the iOS Simulator specified above (the iOS 9.3 Simulator behaves the same way), the following happens:
(lldb) po gradientMask.rasterizationScale
--> 3
(lldb) po [UIScreen mainScreen].scale
--> 3

* On the iPhone 7 Plus device specified above the following happens:
(lldb) po gradientMask.rasterizationScale
--> 2
(lldb) po [UIScreen mainScreen].scale
--> 3

**Observations**
 * On the iOS device: the value the debugger prints at the breakpoint on the second line does not match the value that was actually stored by the first line.
      ** The device itself behaves as expected, but the debugger reports incorrect data (see the sketch after this list).
          *** Screen scale actually used by the app: 2.
          *** Screen scale reported by the debugger: 3.

 * On the iOS Simulator: the value the debugger prints at the breakpoint matches the value stored by the first line.
      ** The Simulator, however, does not behave as expected: an app running in scaled mode should see a screen scale of 2.
          *** Screen scale actually used by the app: 3.
          *** Screen scale reported by the debugger: 3.
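
One way to confirm which value the app really uses, without going through lldb's expression evaluator, is to capture the scale into a local variable and log it before assigning it. A sketch of that check, mirroring the snippet above (the NSLog and the local variable are additions, not part of the original code):

// Log the scale the app actually used, so it can be compared against
// whatever `po [UIScreen mainScreen].scale` prints at the breakpoint.
CGFloat screenScale = [UIScreen mainScreen].scale;
NSLog(@"screen scale as seen by the app: %.2f", screenScale);

gradientMask.rasterizationScale = screenScale;
gradientMask.startPoint = CGPointMake(0, CGRectGetMidY(self.frame));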

Steps to Reproduce:
1. Launch Xcode 8.2.1.
2. Compile and run the above code inside our app on both Simulator and device (same OS, same device type as mentioned in the description).

Let's take the following code snippet for a CAGradientLayer

gradientMask.rasterizationScale = [UIScreen mainScreen].scale;
gradientMask.startPoint = CGPointMake(0, CGRectGetMidY(self.frame));

and put a breakpoint on the second line. Compile in debug mode (-O0 -g). When the breakpoint hits, print out the values like so:

(lldb) po gradientMask.rasterizationScale
(lldb) po [UIScreen mainScreen].scale

Expected Results:
(lldb) po gradientMask.rasterizationScale
2

(lldb) po [UIScreen mainScreen].scale
2

on both Simulator and Device.

Actual Results:
* On the iOS Simulator specified above (the iOS 9.3 Simulator behaves the same way), the following happens:
(lldb) po gradientMask.rasterizationScale
--> 3
(lldb) po [UIScreen mainScreen].scale
--> 3

* On the iPhone 7 Plus device specified above the following happens:
(lldb) po gradientMask.rasterizationScale
--> 2
(lldb) po [UIScreen mainScreen].scale
--> 3

Version:
Xcode 8.2.1 (8C1002), macOS 10.12.2 (16C67)
