An Overview of iOS 10 Multitouch, Taps and Gestures



In terms of physical points of interaction between the device and the user, the iPhone and iPad provide four buttons, a switch and a touch screen. Without question, the user will spend far more time using the touch screen than any other aspect of the device. It is essential, therefore, that any application be able to handle gestures (touches, multitouches, taps, swipes, pinches and so on) performed by the user’s fingers on the touch screen.

Before writing code to handle these gestures, this chapter will spend some time talking about the responder chain in relation to touch screen events before delving a little deeper into the types of gestures an iOS application is likely to encounter.


The Responder Chain

In the chapter entitled Understanding iOS 10 Views, Windows and the View Hierarchy we spent some time talking about the view hierarchy of an application’s user interface and how that hierarchy also defined part of the application’s responder chain. In order to fully understand the concepts behind the handling of touch screen gestures it is first necessary to spend a little more time learning about the responder chain.

When the user interacts with the touch screen of an iPhone or iPad, the hardware detects the physical contact and notifies the operating system. The operating system creates an event for the interaction and places it in the currently active application’s event queue, where it is picked up by the event loop and passed to the current first responder; the first responder being the object with which the user was interacting when the event was triggered (for example, a UIButton or UIView object). If the first responder has been programmed to handle the type of event received, it does so (a button, for example, may have an action defined to call a particular method when it receives a touch event). Having handled the event, the responder then has the option of discarding it, or passing it up to the next responder in the responder chain (defined by the object’s next property) for further processing, and so on up the chain. If the first responder is unable to handle the event, it too will pass the event to the next responder, and so on, until the event either reaches a responder that handles it or reaches the end of the chain (the UIApplication object), where it will either be handled or discarded.

Take, for example, a UIView with a UIButton subview. If the user touches the screen over the button then the button, as first responder, will receive the event. If the button is unable to handle the event it will need to be passed up to the view object. If the view is also unable to handle the event it would then be passed to the view controller and so on.

When working with the responder chain, it is important to note that the passing of an event from one responder to the next responder in the chain does not happen automatically. If an event needs to be passed to the next responder, code must be written to make it happen.

Forwarding an Event to the Next Responder

To pass an event to the next responder in the chain, a reference to the next responder object must first be obtained. This can be achieved by accessing the next property of the current responder. Once the next responder has been identified, the event handling method that was triggered is called on that object, passing through any relevant event data.

Take, for example, a situation where the current responder object is unable to handle a touchesBegan event. In order to pass this to the next responder, the touchesBegan method of the current responder will need to make a call as follows:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    self.next?.touchesBegan(touches, with: event)
}

In this case, the touchesBegan method is called on the next responder and passed the original touches and event parameters.


Gestures

Gesture is an umbrella term used to encapsulate any single interaction between the touch screen and the user, beginning at the point that the screen is touched (by one or more fingers) and ending when the last finger leaves the surface of the screen. Swipes, pinches, stretches and flicks are all forms of gesture.

Taps

A tap, as the name suggests, occurs when the user touches the screen with a single finger and then immediately lifts it from the screen. Taps can be single-taps or multiple-taps and the event will contain information about the number of times a user tapped on the screen.

Touches

A touch occurs when a finger establishes contact with the screen. When more than one finger touches the screen each finger registers as a touch up to a maximum of five fingers.

Touch Notification Methods

Touch screen events cause one of four methods on the first responder object to be called. The method that gets called for a specific event will depend on the nature of the interaction. In order to handle events, therefore, it is important to ensure that the appropriate methods from those outlined below are implemented within your responder chain. These methods will be used in the worked examples contained in the chapters entitled An Example iOS 10 Touch, Multitouch and Tap Application and Detecting iOS 10 Touch Screen Gesture Motions later in this book.

touchesBegan method

The touchesBegan method is called when the user first touches the screen. Passed to this method are an argument named touches of type Set&lt;UITouch&gt; and the corresponding UIEvent object. The touches set contains a UITouch object for each finger in contact with the screen. The tapCount property of any of the UITouch objects within the touches set can be read to identify the number of taps, if any, performed by the user. Similarly, the coordinates of an individual touch can be obtained from the UITouch object, either relative to the entire screen or within the local view itself.
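
As an illustration, the following sketch shows a hypothetical UIView subclass (the TouchView class name is illustrative only) reading the tap count and coordinates of the first touch:

import UIKit

class TouchView: UIView {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Touches that began with this event; event?.allTouches reports
        // every touch currently associated with the event
        let newTouches = touches.count

        // Number of taps performed (1 for a single tap, 2 for a double tap)
        let taps = touch.tapCount

        // Touch coordinates relative to this view
        let localPoint = touch.location(in: self)

        // Touch coordinates relative to the window
        let windowPoint = touch.location(in: nil)

        print("\(newTouches) touch(es), \(taps) tap(s) at \(localPoint) / \(windowPoint)")
    }
}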

touchesMoved method

The touchesMoved method is called when one or more fingers move across the screen. As fingers move, this method gets called multiple times, allowing the application to track the new coordinates and touch count at regular intervals. As with the touchesBegan method, this method is provided with an event object and a Set containing a UITouch object for each finger on the screen.
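
Continuing the hypothetical TouchView class introduced above, a minimal touchesMoved implementation might track how far the first touch has moved since the previous notification, using the previousLocation(in:) method of UITouch:

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }

    let current = touch.location(in: self)
    let previous = touch.previousLocation(in: self)

    // Distance moved since the last touchesMoved notification
    let dx = current.x - previous.x
    let dy = current.y - previous.y

    print("Touch moved by (\(dx), \(dy)) to \(current)")
}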

touchesEnded method

This method is called when the user lifts one or more fingers from the screen. As with the previous methods, touchesEnded is provided with the event object and the set of UITouch objects.
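
As a sketch, again within the hypothetical TouchView class, note that the touches set contains only the touches that ended, while allTouches on the event also includes any fingers still on the screen:

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Count fingers still in contact with the screen
    let remaining = event?.allTouches?.filter { $0.phase != .ended }.count ?? 0

    if remaining == 0 {
        print("Last finger lifted from the screen")
    }
}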

touchesCancelled method

When a gesture is interrupted by a system event, such as the phone detecting an incoming call, the touchesCancelled method is called.
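
Since the gesture will not complete, touchesCancelled should typically discard, rather than act on, any state built up by the earlier touch methods. A sketch, assuming a hypothetical trackedPoints array used to accumulate touch locations:

class CancellableTouchView: UIView {

    // Hypothetical in-progress gesture state
    private var trackedPoints = [CGPoint]()

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // The system interrupted the gesture (an incoming call, for example),
        // so roll back rather than commit the partial gesture data
        trackedPoints.removeAll()
    }
}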

Touch Prediction

A feature introduced as part of the iOS 9 SDK is touch prediction. Each time the system updates the current coordinates of a touch on the screen, a set of algorithms attempts to predict the coordinates of the next location. A finger sweeping across the screen, for example, will trigger multiple calls to the touchesMoved method, each passing through the current touch coordinates. Also passed through to the method is a UIEvent object on which a method named predictedTouches(for:) may be called, passing through the touch object representing the current location. In return, the method provides an array of UITouch objects that predict the next few locations of the touch motion. This information can then be used to improve the performance and responsiveness of the app to the user’s touch behavior.
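
Within touchesMoved, for example, the predicted locations might be read as follows (a minimal sketch only):

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }

    // Ask UIKit to predict where this touch is likely to move next
    if let predictedTouches = event?.predictedTouches(for: touch) {
        for predicted in predictedTouches {
            // Predicted future coordinates, relative to this view
            print("Predicted location: \(predicted.location(in: self))")
        }
    }
}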

Touch Coalescing

Most iOS devices scan for touches on the display with a frequency of 60 Hz (in other words, 60 times every second). This is not the case, however, for the iPad Air 2, which performs touch scans at twice that rate (120 Hz). This means that apps running on an iPad Air 2 are able to receive touch information with twice the precision of other iOS devices.

Rather than call the touch notification methods with an increased frequency on iPad Air 2 devices, however, UIKit uses a system referred to as touch coalescing to deliver the additional touch data while calling the methods at the usual rate.

With touch coalescing, the same touch notification methods are called and passed the same UITouch objects, which are referred to as the main touches. Also passed through to each method is a UIEvent object on which the coalescedTouches(for:) method may be called, passing through as an argument the current main touch object. When called within an app running on an iPad Air 2 device, the method returns an array of touch objects consisting of both a copy of the current main touch and the intermediate touch activity that occurred between the current and previous main touches. These intermediate touch objects are referred to as coalesced touches. On iOS devices with a 60 Hz touch scan rate, no coalesced touch objects will be returned by this method call.
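
The following sketch shows how the coalesced touches might be accessed from within the touchesMoved method:

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let mainTouch = touches.first else { return }

    // On a 120 Hz device this returns the main touch together with the
    // intermediate coalesced touches recorded since the previous main touch
    if let coalesced = event?.coalescedTouches(for: mainTouch) {
        for touch in coalesced {
            print("Coalesced touch at \(touch.location(in: self))")
        }
    }
}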

3D Touch

3D Touch is a hardware feature introduced with the iPhone 6s family of devices. 3D Touch uses a capacitive layer behind the screen to gauge the amount of pressure being applied to the display during a touch. Whether or not an iOS device supports 3D Touch can be identified by checking the forceTouchCapability property of the trait collection of a view controller. The following code, for example, checks for 3D Touch support:

if traitCollection.forceTouchCapability == .available {
    // 3D Touch is available on device
} else {
    // 3D Touch is not available on device
}

When handling touches on a 3D Touch capable device, the force property of the UITouch objects passed to the touch notification methods described above will contain a value representing the force currently being applied to the display by the touch. This topic is covered in detail in the chapter entitled A 3D Touch Force Handling Tutorial.
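
As a brief sketch of what that chapter covers, the pressure of a touch might be read as follows, normalizing the raw force value against the maximumPossibleForce property of the touch:

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }

    if traitCollection.forceTouchCapability == .available {
        // Normalize the raw force value to the 0.0 - 1.0 range
        let pressure = touch.force / touch.maximumPossibleForce
        print("Touch pressure: \(pressure)")
    }
}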

3D Touch also allows two additional features, referred to as Home Screen Quick Actions and Peek and Pop, to be built into iOS apps, details of which are covered in the chapters entitled An iOS 10 3D Touch Quick Actions Tutorial and An iOS 10 3D Touch Peek and Pop Tutorial.

Summary

In order to fully appreciate the mechanisms for handling touch screen events within an iOS application, it is first important to understand both the responder chain and the methods that are called on a responder depending on the type of interaction. We have covered these basics in this chapter. In the next chapter, entitled An Example iOS 10 Touch, Multitouch and Tap Application we will use these concepts to create an example application that demonstrates touch screen event handling.

