An iPhone iOS 6 Gesture Recognition Tutorial
Latest revision as of 20:01, 27 October 2016


Having covered the theory of gesture recognition on the iPhone in the chapter entitled Identifying iPhone Gestures using iOS 6 Gesture Recognizers, the purpose of this chapter is to work through an example application intended to demonstrate the use of the various UIGestureRecognizer subclasses.

The application created in this chapter will configure recognizers to detect a number of different gestures on the iPhone display and update a status label with information about each recognized gesture.


Creating the Gesture Recognition Project

Begin by launching Xcode and creating a new iOS iPhone application project using the Single View Application template. Name both the project and the class prefix Recognizer, and make sure that the Storyboard and Automatic Reference Counting options are enabled.

Designing the User Interface

The only visual component present on our UIView object will be the label used to notify the user of the type of gesture detected. Since the text displayed on this label will need to be updated from within the application code, it will need to be connected to an outlet. In addition, the view controller will contain five gesture recognizer objects to detect pinches, taps, rotations, swipes and long presses. When triggered, these objects will need to call action methods in order to update the label with a notification to the user that the corresponding gesture has been detected.

Select the MainStoryboard.storyboard file and drag a Label object from the Object Library panel to the center of the view. Once positioned, stretch the label horizontally to the outer edges of the view until the blue dotted lines representing the recommended margins appear and then modify the label properties to center the label text.

Select the label object in the view canvas, display the Assistant Editor panel and verify that the editor is displaying the contents of the RecognizerViewController.h file. Ctrl-click on the same label object and drag to a position just below the @interface line in the Assistant Editor. Release the line and in the resulting connection dialog establish an outlet connection named statusLabel.

Next, the non-visual gesture recognizer objects need to be added to the design. Scroll down the list of objects in the Object Library panel until the Tap Gesture Recognizer object comes into view. Drag and drop the object onto the View in the design area (if the object is dropped outside the view, the connection between the recognizer and the view on which the gestures are going to be performed will not be established). Repeat these steps to add Pinch, Rotation, Swipe and Long Press Gesture Recognizer objects to the design. Note that the document outline panel has updated to reflect the presence of the gesture recognizer objects as illustrated in Figure 47-1:

Figure 47-1: Gesture Recognizers added to an iOS 6 iPhone app design


Within the document outline panel, select the Tap Gesture Recognizer instance and display the Attributes Inspector (View -> Utilities -> Show Attributes Inspector). Within the attributes panel, change the Taps value to 2 so that only double taps are detected.

Similarly, select the Long Press Recognizer object and change the Press Duration attribute to 3 seconds.
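For reference, the same attributes can also be configured in code rather than through the Attributes Inspector. The following sketch shows the equivalent property settings for recognizers created programmatically (this is an alternative approach, not part of the tutorial project):

```objc
// Equivalent programmatic configuration (a sketch only; the tutorial
// sets these values in Interface Builder's Attributes Inspector).
UITapGestureRecognizer *tapRecognizer =
    [[UITapGestureRecognizer alloc] init];
tapRecognizer.numberOfTapsRequired = 2;         // double taps only

UILongPressGestureRecognizer *longPressRecognizer =
    [[UILongPressGestureRecognizer alloc] init];
longPressRecognizer.minimumPressDuration = 3.0; // in seconds
```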

Having added and configured the gesture recognizers, the next step is to connect each recognizer to its corresponding action method.

Display the Assistant Editor and verify that it is displaying the content of RecognizerViewController.h. Ctrl-click on the Tap Gesture Recognizer object in the document outline panel and drag the line to the area immediately beneath the newly created outlet in the Assistant Editor panel. Release the line and, within the resulting connection dialog, establish an Action method configured to call a method named tapDetected with the id value set to UITapGestureRecognizer as illustrated in Figure 47-2:


Figure 47-2: Configuring a Gesture Recognizer connection


Repeat these steps to establish action connections for the pinch, rotation, swipe and long press gesture recognizers to methods named pinchDetected, rotationDetected, swipeDetected and longPressDetected respectively, taking care to select the corresponding id value for each action.
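For comparison, the same target/action connections could be made entirely in code by creating each recognizer with initWithTarget:action: and attaching it to the view, for example from within the view controller's viewDidLoad method. A minimal sketch covering two of the five recognizers (the remaining three follow the same pattern):

```objc
// Programmatic equivalent of the Interface Builder action connections
// (a sketch; the tutorial project makes these connections visually).
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(tapDetected:)];
tap.numberOfTapsRequired = 2;
[self.view addGestureRecognizer:tap];

UILongPressGestureRecognizer *longPress =
    [[UILongPressGestureRecognizer alloc]
        initWithTarget:self action:@selector(longPressDetected:)];
longPress.minimumPressDuration = 3.0;
[self.view addGestureRecognizer:longPress];
```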

On completion of the above steps, the RecognizerViewController.h file should read as follows:

#import <UIKit/UIKit.h>

@interface RecognizerViewController : UIViewController
@property (strong, nonatomic) IBOutlet UILabel *statusLabel;
- (IBAction)tapDetected:(UITapGestureRecognizer *)sender;
- (IBAction)rotationDetected:(UIRotationGestureRecognizer *)sender;
- (IBAction)pinchDetected:(UIPinchGestureRecognizer *)sender;
- (IBAction)swipeDetected:(UISwipeGestureRecognizer *)sender;
- (IBAction)longPressDetected:(UILongPressGestureRecognizer *)sender;

@end

Implementing the Action Methods

Having configured the gesture recognizers, the next step is to write the action methods that will be called by each recognizer when the corresponding gesture is detected. The method stubs created by Xcode reside in the RecognizerViewController.m file and should be modified to update the status label with information about each detected gesture:

- (IBAction)longPressDetected:(UIGestureRecognizer *)sender {
    _statusLabel.text = @"Long Press";
}

- (IBAction)swipeDetected:(UIGestureRecognizer *)sender {
    _statusLabel.text = @"Right Swipe";
}

- (IBAction)tapDetected:(UIGestureRecognizer *)sender {
    _statusLabel.text = @"Double Tap";
}

- (IBAction)pinchDetected:(UIGestureRecognizer *)sender {

    CGFloat scale =
       [(UIPinchGestureRecognizer *)sender scale];
    CGFloat velocity =
       [(UIPinchGestureRecognizer *)sender velocity];

    NSString *resultString = [[NSString alloc] initWithFormat:
         @"Pinch - scale = %f, velocity = %f",
         scale, velocity];
    _statusLabel.text = resultString;
}

- (IBAction)rotationDetected:(UIGestureRecognizer *)sender {
    CGFloat radians =
          [(UIRotationGestureRecognizer *)sender rotation];
    CGFloat velocity =
          [(UIRotationGestureRecognizer *)sender velocity];

    NSString *resultString = [[NSString alloc] initWithFormat:
              @"Rotation - Radians = %f, velocity = %f",
              radians, velocity];
    _statusLabel.text = resultString;
}
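Note that the rotation property reports the angle in radians. Should a value in degrees be preferred for display purposes, a conversion along the following lines could be added to the rotationDetected: method (a minimal sketch):

```objc
// Convert the reported angle from radians to degrees:
// degrees = radians * (180 / pi)
CGFloat degrees = radians * 180.0 / M_PI;
_statusLabel.text = [NSString stringWithFormat:
    @"Rotation - %.1f degrees", degrees];
```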

Testing the Gesture Recognition Application

The final step is to build and run the application. In order to fully test the pinching and rotation recognition it will be necessary to run the application on a physical device (since it is not possible to emulate two simultaneous touches within the iOS Simulator environment). Assuming a provisioned device is attached (see Testing iOS 6 Apps on the iPhone – Developer Certificates and Provisioning Profiles for more details) simply click on the Xcode Run button. Once the application loads on the device, perform the appropriate gestures on the display and watch the status label update accordingly.

