An Overview of iOS 9 Multitouch, Taps and Gestures

In terms of physical points of interaction between the device and the user, the iPhone and iPad provide four buttons, a switch and a touch screen. Without question, the user will spend far more time using the touch screen than any other aspect of the device. It is essential, therefore, that any application be able to handle gestures (touches, multitouches, taps, swipes, pinches and so on) performed by the user’s fingers on the touch screen. Before writing code to handle these gestures, this chapter will spend some time talking about the responder chain in relation to touch screen events before delving a little deeper into the types of gestures an iOS application is likely to encounter.

The Responder Chain

In the chapter entitled Understanding iOS 9 Views, Windows and the View Hierarchy we spent some time talking about the view hierarchy of an application’s user interface and how that hierarchy also defined part of the application’s responder chain. In order to fully understand the concepts behind the handling of touch screen gestures it is first necessary to spend a little more time learning about the responder chain.

When the user interacts with the touch screen of an iPhone or iPad, the hardware detects the physical contact and notifies the operating system. The operating system creates an event associated with the interaction and places it in the currently active application’s event queue, where it is picked up by the event loop and passed to the current first responder; the first responder being the object with which the user was interacting when the event was triggered (for example, a UIButton or UIView object). If the first responder has been programmed to handle the type of event received, it does so (a button, for example, may have an action defined to call a particular method when it receives a touch event). Having handled the event, the responder then has the option of discarding it, or passing it to the next responder in the responder chain (defined by the object’s nextResponder property) for further processing. If the first responder is unable to handle the event, it will likewise pass it to the next responder, and so on up the chain until the event either reaches a responder that handles it or arrives at the end of the chain (the UIApplication object), where it will either be handled or discarded.

Take, for example, a UIView with a UIButton subview. If the user touches the screen over the button then the button, as first responder, will receive the event. If the button is unable to handle the event it will need to be passed up to the view object. If the view is also unable to handle the event it would then be passed to the view controller and so on.


When working with the responder chain, it is important to note that the passing of an event from one responder to the next responder in the chain does not happen automatically. If an event needs to be passed to the next responder, code must be written to make it happen.

Forwarding an Event to the Next Responder

An event may be passed on to the next responder in the responder chain by calling the nextResponder() method of the current responder to obtain a reference to the next responder object, and then calling the corresponding event handling method on that object, passing through the touches and the event itself. Take, for example, a situation where the current responder object is unable to handle a touchesBegan event. In order to pass this to the next responder, the touchesBegan method of the current responder will need to make a call as follows:

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    // Forward the touches and the event to the next responder in the chain
    self.nextResponder()?.touchesBegan(touches, withEvent: event)
}
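
Note that, in many cases, simply calling super.touchesBegan(touches, withEvent: event) will have a similar effect, since the default UIKit implementations of the touch notification methods generally forward unhandled events up the responder chain.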

Gestures

Gesture is an umbrella term used to encapsulate any single interaction between the touch screen and the user, beginning at the point that the screen is touched (by one or more fingers) and ending when the last finger leaves the surface of the screen. Swipes, pinches, stretches and flicks are all forms of gesture.

Taps

A tap, as the name suggests, occurs when the user touches the screen with a single finger and then immediately lifts it from the screen. Taps can involve a single tap or multiple taps, and the event will contain information about the number of times the user tapped on the screen.

Touches

A touch occurs when a finger establishes contact with the screen. When more than one finger touches the screen, each finger registers as a separate touch, up to a maximum of five fingers.

Touch Notification Methods

Touch screen events cause one of four methods on the first responder object to be called. The method that gets called for a specific event will depend on the nature of the interaction. In order to handle events, therefore, it is important to ensure that the appropriate methods from those outlined below are implemented within your responder chain. These methods will be used in the worked examples contained in the An Example iOS 9 Touch, Multitouch and Tap Application and Detecting iOS 9 Touch Screen Gesture Motions chapters of this book.

touchesBegan method

The touchesBegan method is called when the user first touches the screen. Passed to this method are an argument called touches, in the form of a Set of UITouch objects, and the corresponding UIEvent object. The touches set contains a UITouch object for each finger in contact with the screen. The tapCount property of any of the UITouch objects within the touches set can be read to identify the number of taps, if any, performed by the user. Similarly, the coordinates of an individual touch can be obtained from the UITouch object via the locationInView method, either relative to the entire screen or within the local view itself.
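
By way of illustration, the following sketch (assuming a UIView subclass, with a print statement standing in as hypothetical placeholder handling) reads the tap count and the location of the first touch within the view:

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    if let touch = touches.first {
        // Number of taps performed by the user (if any)
        let taps = touch.tapCount

        // Coordinates of the touch within this view
        let point = touch.locationInView(self)

        print("Touch began at \(point) with \(taps) tap(s)")
    }
}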

touchesMoved method

The touchesMoved method is called when one or more fingers move across the screen. As fingers move across the screen this method gets called multiple times, allowing the application to track the new coordinates and touch count at regular intervals. As with the touchesBegan method, this method is provided with a UIEvent object and a set containing UITouch objects for each finger on the screen.
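
A minimal sketch of a touchesMoved implementation (again assuming a UIView subclass) might track the motion by comparing the current and previous locations of a touch:

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    if let touch = touches.first {
        // Current and previous coordinates of the touch within this view
        let currentPoint = touch.locationInView(self)
        let previousPoint = touch.previousLocationInView(self)

        print("Touch moved from \(previousPoint) to \(currentPoint)")
    }
}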

touchesEnded method

This method is called when the user lifts one or more fingers from the screen. As with the previous methods, touchesEnded is provided with the event object and the set of UITouch objects.

touchesCancelled method

When a gesture is interrupted by a high-level event, such as the phone detecting an incoming call, the touchesCancelled method is called.

Touch Prediction

A new feature introduced as part of the iOS 9 SDK is touch prediction. Each time the system updates the current coordinates of a touch on the screen, a set of algorithms attempt to predict the coordinates of the next location. A finger sweeping across the screen, for example, will trigger multiple calls to the touchesMoved method passing through the current touch coordinates. Also passed through to the method is a UIEvent object on which a method named predictedTouchesForTouch may be called, passing through the touch object representing the current location. In return, the method will provide an array of UITouch objects that predict the next few locations of the touch motion. This information can then be used to improve the performance and responsiveness of the app to the touch behavior of the user.
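
The following is a rough sketch of how the predicted touches might be accessed from within a touchesMoved implementation (a UIView subclass is assumed, with print standing in for real processing):

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    guard let touch = touches.first else { return }

    // Process the actual touch location first
    let point = touch.locationInView(self)
    print("Actual location: \(point)")

    // Ask UIKit for the predicted future locations of this touch
    if let predictedTouches = event?.predictedTouchesForTouch(touch) {
        for predictedTouch in predictedTouches {
            let predictedPoint = predictedTouch.locationInView(self)
            print("Predicted location: \(predictedPoint)")
        }
    }
}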


Touch Coalescing

Most iOS devices scan for touches on the display at a frequency of 60 Hz (in other words, 60 times every second). This is not the case, however, for the iPad Air 2, which performs touch scans at twice that rate (120 Hz). This means that apps running on an iPad Air 2 are able to receive touch information with twice the level of precision of other iOS devices.

Rather than call the touch notification methods with an increased frequency on iPad Air 2 devices, however, UIKit uses a system referred to as touch coalescing to deliver the additional touch data while calling the methods at the usual rate.

With touch coalescing, the same touch notification methods are called and passed the same UITouch objects, which are referred to as the main touches. Also passed through to each method is a UIEvent object on which the coalescedTouchesForTouch method may be called, passing through as an argument the current main touch object. When called within an app running on an iPad Air 2 device, the method will return an array of touch objects consisting of both a copy of the current main touch and the intermediate touch activity that occurred between the current main touch and the previous main touch. These intermediate touch objects are referred to as coalesced touches. On iOS devices with a 60 Hz touch scan rate, no coalesced touch objects will be returned by this method call.
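
As an illustrative sketch (again assuming a UIView subclass), the coalesced touches might be accessed as follows:

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    guard let mainTouch = touches.first else { return }

    // Retrieve the intermediate touches recorded since the previous main touch.
    // On devices with a 60 Hz touch scan rate this will yield no additional data.
    if let coalescedTouches = event?.coalescedTouchesForTouch(mainTouch) {
        for coalescedTouch in coalescedTouches {
            let point = coalescedTouch.locationInView(self)
            print("Coalesced touch at \(point)")
        }
    }
}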

3D Touch

3D Touch is a hardware feature introduced with the iPhone 6s family of devices. 3D Touch uses a capacitive layer behind the device screen to gauge the amount of pressure being applied to the display during a touch. Whether or not a device supports 3D Touch can be identified by checking the forceTouchCapability property of the trait collection of a view controller. The following code, for example, checks for 3D Touch support:

if traitCollection.forceTouchCapability == .Available {
    // 3D Touch is available on device
} else {
    // 3D Touch is not available on device
}

When handling touches on a 3D Touch capable device, the force property of the UITouch objects passed to the touch notification methods described above will contain a value representing the force currently being applied to the display by a touch. This topic is covered in detail in the chapter entitled A 3D Touch Force Handling Tutorial.
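
The following sketch (assuming a UIView subclass on a 3D Touch capable device) reads the force of a touch, normalizing it against the maximumPossibleForce property of UITouch to obtain a value between 0.0 and 1.0:

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    if let touch = touches.first
            where traitCollection.forceTouchCapability == .Available {
        // Normalize the force against the maximum the hardware can register
        let normalizedForce = touch.force / touch.maximumPossibleForce
        print("Current force: \(normalizedForce)")
    }
}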

3D Touch also allows two additional features, referred to as Home Screen Quick Actions and Peek and Pop, to be built into iOS 9 apps, details of which are covered in the chapters entitled An iOS 9 3D Touch Quick Actions Tutorial and An iOS 9 3D Touch Peek and Pop Tutorial.

iPad Pro and the Apple Pencil Stylus

The UITouch class has been extended for iOS 9.1 to include support for the iPad Pro and Apple Pencil. These extensions take the form of additional properties and methods that can be used to obtain information about the point of contact and angle of the pencil stylus in relation to the iPad Pro display. These methods and properties are as follows:

  • altitudeAngle – A CGFloat property indicating, in radians, the angle of the Apple Pencil stylus in relation to the iPad Pro display. A value of 0 indicates that the stylus is parallel to the display surface.
  • azimuthAngleInView – A method which takes as an argument the view object for which stylus data is required. The method returns a CGFloat value indicating the azimuth angle of the stylus in radians.
  • azimuthUnitVectorInView – A method which takes as an argument the view object for which stylus data is required. The method returns a CGVector value indicating the direction of the azimuth of the stylus.
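
As a rough sketch of how these might be used (a UIView subclass is assumed and, purely for illustration, stylus touches are filtered via the type property of the UITouch object):

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    if let touch = touches.first where touch.type == .Stylus {
        // Angle of the pencil relative to the display (0 = parallel to surface)
        let altitude = touch.altitudeAngle

        // Azimuth angle and direction vector relative to this view
        let azimuth = touch.azimuthAngleInView(self)
        let vector = touch.azimuthUnitVectorInView(self)

        print("Altitude: \(altitude), azimuth: \(azimuth), vector: \(vector)")
    }
}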

Summary

In order to fully appreciate the mechanisms for handling touch screen events within an iOS 9 application, it is first important to understand both the responder chain and the methods that are called on a responder depending on the type of interaction. We have covered these basics in this chapter. In the next chapter, entitled An Example iOS 9 Touch, Multitouch and Tap Application, we will use these concepts to create an example application that demonstrates touch screen event handling.

