An Android Studio Java Custom Gesture Recognition Tutorial

The previous chapter covered the detection of what is referred to as “common gestures” from within an Android application. In practice, however, a gesture can conceivably involve just about any sequence of touch motions on the display of an Android device. In recognition of this, the Android SDK allows custom gestures of just about any nature to be defined by the application developer and used to trigger events when performed by the user. This is a multi-stage process, the details of which are the topic of this chapter.

The Android Gesture Builder Application

The Android SDK allows developers to design custom gestures stored in a gesture file bundled with an Android application package. These custom gesture files are most easily created using the Gesture Builder application. Creating a gestures file involves launching the Gesture Builder application on a physical device or emulator and “drawing” the gestures that will need to be detected by the application. Once the gestures have been designed, the file containing the gesture data can be downloaded and added to the application project. Within the application code, the file is loaded into an instance of the GestureLibrary class, which can be used to search for matches to any gestures the user performs on the device display.

The GestureOverlayView Class

To facilitate the detection of gestures within an application, the Android SDK provides the GestureOverlayView class. This transparent view can be placed over other views in the user interface to detect gestures.

Detecting Gestures

Gestures are detected by loading the gestures file created using the Gesture Builder app and then registering an OnGesturePerformedListener event listener on an instance of the GestureOverlayView class. The enclosing class is then declared to implement both the OnGesturePerformedListener interface and the corresponding onGesturePerformed callback method required by that interface. If the listener detects a gesture, the Android runtime system triggers a call to the onGesturePerformed callback method.

Identifying Specific Gestures

When a gesture is detected, the onGesturePerformed callback method is called and passed as arguments a reference to the GestureOverlayView object on which the gesture was detected, together with a Gesture object containing information about the gesture.


With access to the Gesture object, the GestureLibrary can compare the detected gesture to those contained in the gestures file previously loaded into the application. The GestureLibrary reports the probability that the gesture performed by the user matches an entry in the gestures file by calculating a prediction score for each gesture. A prediction score of 1.0 or greater is generally accepted as a good match between a gesture stored in the file and that performed by the user on the device display.

Installing and Running the Gesture Builder Application

The easiest way to create a gestures file is to use an app allowing gesture motions to be captured and saved. Although Google originally provided an app for this purpose, it has not been maintained adequately for use on more recent versions of Android. Fortunately, an alternative is available in the form of the Gesture Builder Tool app, which is available from the Google Play Store at the following URL:

https://play.google.com/store/apps/details?id=migueldp.runeforge

Creating a Gestures File

Once the Gesture Builder Tool has loaded, click on the Create New Gesture button at the bottom of the device screen and “draw” a gesture using a circular motion on the gray canvas, as illustrated in Figure 36-1. Assuming that the gesture appears as required (represented by the yellow line on the device screen), click on the save button to add the gesture to the gestures file, entering “Circle Gesture” when prompted for a name:

Figure 36-1

After the gesture has been saved, the Gesture Builder Tool will display a list of currently defined gestures that will consist solely of the new Circle Gesture.


Creating the Example Project

Select the New Project option from the welcome screen and, within the resulting new project dialog, choose the Empty Views Activity template before clicking on the Next button.

Enter CustomGestures into the Name field and specify com.ebookfrenzy.customgestures as the package name. Before clicking the Finish button, change the Minimum API level setting to API 26: Android 8.0 (Oreo) and the Language menu to Java. Adapt the project to use view binding as outlined in Android View Binding in Java.

Extracting the Gestures File from the SD Card

As each gesture was created within the Gesture Builder application, it was added to a file named gestures.txt, located in the storage of the emulator or device on which the app was running. However, before this file can be added to an Android Studio project, it must be copied off the device storage and saved to the local file system. This is most easily achieved using the Android Studio Device File Explorer tool window. Display this tool using the View -> Tool Windows -> Device File Explorer menu option. Once displayed, select the device or emulator on which the gesture file was created from the drop-down menu, then navigate through the filesystem to the following folder:

/storage/emulated/0/Android/data/migueldp.runeforge/files/gestures.txt

Locate the gestures.txt file in this folder, right-click on it, select the Save As… menu option, and save the file to a temporary location as a file named gestures.

Figure 36-2

Once the gestures file has been created and pulled from the device storage, it can be added to an Android Studio project as a resource file.


Adding the Gestures File to the Project

Within the Android Studio Project tool window, locate and right-click on the res folder (located under app) and select New -> Directory from the resulting menu. In the New Directory dialog, enter raw as the folder name and tap the keyboard enter key. Using the appropriate file explorer utility for your operating system type, locate the gestures file previously pulled from the device storage and copy and paste it into the new raw folder in the Project tool window.

Designing the User Interface

This example application calls for a user interface consisting of a ConstraintLayout view with a GestureOverlayView layered on top of it to intercept any gestures performed by the user. Locate the app -> res -> layout -> activity_main.xml file, double-click on it to load it into the Layout Editor tool, and select and delete the default TextView widget. Switch the layout editor to Code mode and modify the XML so that it reads as follows:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout 
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity" >
 
    <android.gesture.GestureOverlayView
        android:id="@+id/gOverlay"
        android:layout_width="0dp"
        android:layout_height="0dp"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />
 
</androidx.constraintlayout.widget.ConstraintLayout>

Loading the Gestures File

Now that the gestures file has been added to the project, the next step is to write some code to load the file when the activity starts. For this project, the code to achieve this will be added to the MainActivity class as follows:

package com.ebookfrenzy.customgestures;
.
.
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.GestureOverlayView.OnGesturePerformedListener;
 
public class MainActivity extends AppCompatActivity
        implements OnGesturePerformedListener {
 
    private ActivityMainBinding binding; 
    private GestureLibrary gLibrary;
 
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        binding = ActivityMainBinding.inflate(getLayoutInflater());
        View view = binding.getRoot();
        setContentView(view);
 
        gestureSetup();
    }
 
    private void gestureSetup() {
        gLibrary =
                GestureLibraries.fromRawResource(this,
                        R.raw.gestures);
        if (!gLibrary.load()) {
            finish();
        }
    }
.
.
}

In addition to some necessary import directives, the above code also creates a GestureLibrary instance named gLibrary and then loads into it the contents of the gesture file located in the raw resources folder. The activity class has also been modified to implement the OnGesturePerformedListener interface, which requires adding the onGesturePerformed callback method (which will be created later in this chapter).

Registering the Event Listener

For the activity to receive notification that the user has performed a gesture on the screen, it is necessary to register the OnGesturePerformedListener event listener on the gOverlay view, a reference to which can be obtained using the findViewById method, as outlined in the following code fragment:


private void gestureSetup() {
    gLibrary =
            GestureLibraries.fromRawResource(this,
                    R.raw.gestures);
    if (!gLibrary.load()) {
        finish();
    }
 
    GestureOverlayView gOverlay = findViewById(R.id.gOverlay);
    gOverlay.addOnGesturePerformedListener(this);
}

Implementing the onGesturePerformed Method

All that remains before an initial test run of the application can be performed is to implement the onGesturePerformed callback method. This is the method that will be called when a gesture is performed on the GestureOverlayView instance:

package com.ebookfrenzy.customgestures;
.
. 
import android.gesture.Prediction;
import android.widget.Toast;
import android.gesture.Gesture;
import java.util.ArrayList;
 
public class MainActivity extends AppCompatActivity implements OnGesturePerformedListener {
 
    private GestureLibrary gLibrary;
.
.
    public void onGesturePerformed(GestureOverlayView overlay, Gesture
            gesture) {
        ArrayList<Prediction> predictions =
                gLibrary.recognize(gesture);
 
        if (predictions.size() > 0 && predictions.get(0).score > 1.0) 
        {
 
            String action = predictions.get(0).name;
 
            Toast.makeText(this, action, Toast.LENGTH_SHORT).show();
        }
    }
.
.
.
}

When the Android runtime detects a gesture on the gesture overlay view object, the onGesturePerformed method is called. Passed through as arguments are a reference to the GestureOverlayView object on which the gesture was detected together with an object of type Gesture. The Gesture class is designed to hold the information that defines a specific gesture (essentially a sequence of timed points on the screen depicting the path of the strokes that comprise a gesture).

The Gesture object is passed through to the recognize() method of our gLibrary instance to compare the current gesture with each gesture loaded from the gesture file. Once this task is complete, the recognize() method returns an ArrayList object containing a Prediction object for each comparison performed. The list is ranked in order from the best match (at position 0 in the array) to the worst. Contained within each prediction object is the name of the corresponding gesture from the gesture file and a prediction score indicating how closely it matches the current gesture.

The code in the above method, therefore, takes the prediction at position 0 (the closest match), makes sure it has a score of greater than 1.0, and then displays a Toast message (an Android class designed to display notification pop-ups to the user) displaying the name of the matching gesture.
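
When tuning the score threshold during development, it can be helpful to inspect the entire ranked list rather than just the first entry. The following is a minimal diagnostic sketch (not a required step in this project); it assumes the gLibrary instance declared earlier in the chapter, and the logAllPredictions method name is purely illustrative:

import android.util.Log;
.
.
private void logAllPredictions(Gesture gesture) {
    // recognize() returns the predictions ranked from best match
    // (position 0) to worst
    ArrayList<Prediction> predictions = gLibrary.recognize(gesture);
 
    for (Prediction prediction : predictions) {
        // Each Prediction pairs a gesture name from the gestures
        // file with a score indicating how closely it matched
        Log.d("CustomGestures", prediction.name + " scored "
                + prediction.score);
    }
}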

Testing the Application

Build and run the application on an emulator or a physical Android device and perform the circle gesture on the display. When performed, the toast notification should appear containing the name of the detected gesture. Note that when a gesture is recognized, it is outlined on the display with a bright yellow line, while gestures about which the overlay is uncertain appear as a faint yellow line. While useful during development, this is probably not ideal for a real-world application. Therefore, there is still some more configuration work to do.


Configuring the GestureOverlayView

By default, the GestureOverlayView is configured to display yellow lines while gestures are being drawn. The colors used to draw recognized and unrecognized gestures can be defined via the android:gestureColor and android:uncertainGestureColor attributes, respectively. For example, to hide the gesture lines, modify the activity_main.xml file in the example project as follows:

<android.gesture.GestureOverlayView
    android:id="@+id/gOverlay"
    android:layout_width="0dp"
    android:layout_height="0dp"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent"
    app:layout_constraintTop_toTopOf="parent"
    android:gestureColor="#00000000"
    android:uncertainGestureColor="#00000000" />

On re-running the application, the gesture lines should no longer be visible, since they are now drawn using a fully transparent color. The gestures themselves are, however, still detected as before.
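
Alternatively, the same effect can be achieved in code using the setGestureColor() and setUncertainGestureColor() methods of the GestureOverlayView class. The following fragment is a brief sketch, assuming it is added to the gestureSetup() method after the gOverlay reference has been obtained:

import android.graphics.Color;
.
.
// Draw both recognized and uncertain gesture trails fully
// transparent so that no lines appear as gestures are performed
gOverlay.setGestureColor(Color.TRANSPARENT);
gOverlay.setUncertainGestureColor(Color.TRANSPARENT);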

Intercepting Gestures

The GestureOverlayView is, as previously described, a transparent overlay that may be positioned over the top of other views. This leads to the question of whether events intercepted by the gesture overlay should be passed on to the underlying views when a gesture has been recognized. This is controlled via the android:eventsInterceptionEnabled property of the GestureOverlayView instance. When set to true, the gesture events are not passed to the underlying views when a gesture is recognized. This can be a particularly useful setting when gestures are being performed over a view that might be configured to scroll in response to certain gestures. Setting this property to true will avoid gestures also being interpreted as instructions to the underlying view to scroll in a particular direction.
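
The property may also be set from code using the corresponding setEventsInterceptionEnabled() method. The following brief sketch again assumes the gOverlay reference obtained in the gestureSetup() method:

// Prevent motion events from reaching the views beneath the
// overlay once a gesture is being recognized
gOverlay.setEventsInterceptionEnabled(true);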

Detecting Pinch Gestures

Before moving on from touch handling in general and gesture recognition in particular, the last topic of this chapter is handling pinch gestures. While it is possible to create and detect a wide range of gestures using the steps outlined in the previous sections of this chapter, it is, in fact, not possible to detect a pinching gesture (where two fingers are used in a stretching and pinching motion, typically to zoom in and out of a view or image) using the techniques discussed so far.

The simplest method for detecting pinch gestures is to use the Android ScaleGestureDetector class. In general terms, detecting pinch gestures involves the following three steps:


  1. Declaration of a new class which extends the ScaleGestureDetector.SimpleOnScaleGestureListener convenience class, overriding the onScale(), onScaleBegin(), and onScaleEnd() callback methods.
  2. Creation of an instance of the ScaleGestureDetector class, passing through an instance of the class created in step 1 as an argument.
  3. Implementing the onTouchEvent() callback method on the enclosing activity, which, in turn, calls the onTouchEvent() method of the ScaleGestureDetector class.

In the remainder of this chapter, we will create an example designed to demonstrate the implementation of pinch gesture recognition.

A Pinch Gesture Example Project

Select the New Project option from the welcome screen and, within the resulting new project dialog, choose the Empty Views Activity template before clicking on the Next button.

Enter PinchExample into the Name field and specify com.ebookfrenzy.pinchexample as the package name. Before clicking on the Finish button, change the Minimum API level setting to API 26: Android 8.0 (Oreo) and the Language menu to Java. Convert the project to use view binding by following the steps in 11.8 Migrating a Project to View Binding.

Within the activity_main.xml file, select the default TextView object and use the Attributes tool window to set the ID to myTextView.

Locate and load the MainActivity.java file into the Android Studio editor and modify the file as follows:


package com.ebookfrenzy.pinchexample;
.
.
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.ScaleGestureDetector.SimpleOnScaleGestureListener;
 
public class MainActivity extends AppCompatActivity {
 
    private ActivityMainBinding binding;
    ScaleGestureDetector scaleGestureDetector;
 
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        binding = ActivityMainBinding.inflate(getLayoutInflater());
        View view = binding.getRoot();
        setContentView(view);
 
        scaleGestureDetector =
                new ScaleGestureDetector(this,
                        new MyOnScaleGestureListener());
    }
 
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        scaleGestureDetector.onTouchEvent(event);
        return true;
    }
 
    public class MyOnScaleGestureListener extends
            SimpleOnScaleGestureListener {
 
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
 
            float scaleFactor = detector.getScaleFactor();
 
            if (scaleFactor > 1) {
                binding.myTextView.setText("Zooming In");
            } else {
                binding.myTextView.setText("Zooming Out");
            }
            return true;
        }
 
        @Override
        public boolean onScaleBegin(ScaleGestureDetector detector) {
            return true;
        }
 
        @Override
        public void onScaleEnd(ScaleGestureDetector detector) {
 
        }
    }
.
.
.
}

The code declares a new class named MyOnScaleGestureListener, extending the Android SimpleOnScaleGestureListener class and overriding its onScale(), onScaleBegin(), and onScaleEnd() callback methods. In this instance, the onScale() method identifies the scale factor and displays a message on the text view indicating the type of pinch gesture detected.

Within the onCreate() method, a new ScaleGestureDetector instance is created, passing through a reference to the enclosing activity and an instance of our new MyOnScaleGestureListener class as arguments. Finally, an onTouchEvent() callback method is implemented for the activity, which calls the corresponding onTouchEvent() method of the ScaleGestureDetector object, passing through the MotionEvent object as an argument.

Compile and run the application on an emulator or physical Android device and perform pinching gestures on the screen, noting that the text view displays either the zoom-in or zoom-out message depending on the pinching motion. Pinching gestures may be simulated within the emulator in stand-alone mode by holding down the Ctrl (or macOS Cmd) key and clicking and dragging the mouse pointer, as shown in Figure 36-3:

Figure 36-3

Summary

A gesture is the motion of points of contact on a touch screen involving one or more strokes and can be used as a method of communication between the user and the application. Android allows gestures to be designed using the Gesture Builder application. Once created, gestures can be saved to a gestures file and loaded into an activity at application runtime using the GestureLibrary.


Gestures can be detected on areas of the display by overlaying existing views with instances of the transparent GestureOverlayView class and implementing an OnGesturePerformedListener event listener. Using the GestureLibrary, a ranked list of matches between a gesture performed by the user and the gestures stored in a gestures file may be generated, using a prediction score to decide whether a gesture is a close enough match.

Pinch gestures may be detected using the ScaleGestureDetector class, an example of which was provided in this chapter.

