Allow better keyboard states detection

Number: rdar://24196184
Date Originated: 14-Jan-2016 11:55 PM
Status: Open
Resolved:
Product: iOS
Product Version: 9.2
Classification: Enhancement
Reproducible: Always
This is a duplicate of rdar://24070521

Modern applications, such as instant messaging and social networking apps, depend heavily on interacting with the system keyboard. Users expect to interact with the keyboard as seamlessly as they do in built-in applications such as Messages and Notes. It is hard to offer basic keyboard-related features when a handful of APIs remain private to third-party developers.

The UIKeyboard APIs defined in UITextInputTraits and UIWindow haven't changed much since iOS 5. The UIKeyboard notifications' payloads are limited and hard to work with when more advanced logic is required. For example, on the iPad, detecting whether the keyboard is anchored, split, or simply not visible because an external keyboard is in use is nearly impossible. There are a few fragile ways of inferring these states from the keyboard's size and origin, but such detection cannot be guaranteed and is therefore prone to break with any race condition or new OS release.
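To illustrate the fragility, here is a sketch of the kind of heuristic developers resort to today, inferring the keyboard state from the end frame reported by UIKeyboardWillChangeFrameNotification. The thresholds and conclusions below are assumptions, and they break across devices, orientations, and OS versions:

```objc
// Fragile heuristic: guess the keyboard state from its reported end frame.
- (void)keyboardWillChangeFrame:(NSNotification *)note
{
    CGRect endFrame = [note.userInfo[UIKeyboardFrameEndUserInfoKey] CGRectValue];
    CGRect screenBounds = [UIScreen mainScreen].bounds;

    if (!CGRectIntersectsRect(endFrame, screenBounds)) {
        // Off screen: dismissed, or an external keyboard is attached -- we cannot tell which.
    } else if (CGRectGetMaxY(endFrame) < CGRectGetMaxY(screenBounds)) {
        // Floating above the bottom edge: undocked, or possibly split -- again a guess.
    } else {
        // Anchored to the bottom edge: presumably docked.
    }
}
```

Worse, when the iPad keyboard is undocked or split, these notifications are not even posted reliably, so the heuristic can silently go stale.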

To make keyboard UI modes such as these reliably detectable, I propose adding the following enumeration to the UITextInputTraits informal protocol:

typedef NS_ENUM(NSInteger, UIKeyboardMode) {
    UIKeyboardModeDocked,                 // Default mode of the keyboard, fully visible
    UIKeyboardModeUnDocked,               // When the keyboard is detached from the bottom of the screen (iPad only)
    UIKeyboardModeSplit,                  // When the keyboard layout is split in half (iPad only)
    UIKeyboardModeHidden,                 // When an external keyboard is detected, so the virtual keyboard is hidden from screen
};

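With such an enumeration, a state check becomes a single comparison instead of frame arithmetic. A hypothetical usage, assuming a `keyboardMode` trait were exposed alongside the enum (neither exists in UIKit today):

```objc
// Hypothetical: 'keyboardMode' is the proposed trait, not an existing UIKit API.
if (textView.keyboardMode == UIKeyboardModeSplit) {
    // Skip any layout that assumes a full-width, bottom-anchored keyboard.
}
```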
A companion API to the previous proposal is being able to detect the keyboard's origin and size at any time, and to observe their changes. This would greatly improve any user experience built around the keyboard. For example, features such as making a text input follow the keyboard whenever it appears or disappears, or dragging the keyboard up or down with a simple panning gesture, may seem trivial to any user. Without access to the keyboard's frame, it is extremely hard to meet these expectations.

Therefore, I propose adding a new KVO-compliant property to the UIKeyInput protocol:

@property (nonatomic, readonly) CGRect keyboardFrame;
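Being KVO-compliant, the property could be observed like any other. A hypothetical sketch, assuming the proposed keyboardFrame property existed on the text input:

```objc
// Hypothetical: 'keyboardFrame' is the proposed property, not an existing UIKit API.
[textView addObserver:self
           forKeyPath:@"keyboardFrame"
              options:NSKeyValueObservingOptionNew
              context:NULL];

// Later, in the observer:
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"keyboardFrame"]) {
        CGRect frame = [change[NSKeyValueChangeNewKey] CGRectValue];
        // e.g. keep an input accessory view glued to the keyboard's top edge.
    }
}
```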

Finally, to further improve the keyboard user experience, I propose adding two new flags to the UITextInput protocol for handling special text-cursor states triggered by user gestures:

@property (nonatomic, readonly) BOOL isTrackpadEnabled;       // YES if the keyboard track pad has been recognised.
@property (nonatomic, readonly) BOOL isLoupeVisible;          // YES if the magnifying glass is visible.

Knowing these two states would, for example, make it possible to skip text processing on a text input while the user freely moves the cursor.
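A hypothetical guard, assuming both proposed flags existed on the text input (`processText:` stands in for whatever expensive work an app performs):

```objc
// Hypothetical: both flags are the proposed additions, not existing UIKit API.
- (void)textViewDidChangeSelection:(UITextView *)textView
{
    if (textView.isTrackpadEnabled || textView.isLoupeVisible) {
        return; // The user is just moving the cursor; defer expensive processing.
    }
    [self processText:textView.text];
}
```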

