Implementing Android Custom Gesture Recognition with Android Studio

The previous chapter looked at the steps involved in detecting what are referred to as “common gestures” from within an Android application. In practice, however, a gesture can conceivably involve just about any sequence of touch motions on the display of an Android device. In recognition of this fact, the Android SDK allows custom gestures of just about any nature to be defined by the application developer and used to trigger events when performed by the user. This is a multistage process, the details of which are the topic of this chapter.




The Android Gesture Builder Application

The Android SDK allows developers to design custom gestures which are then stored in a gesture file bundled with an Android application package. These custom gesture files are most easily created using the Gesture Builder application which is bundled with the samples package supplied as part of the Android SDK. The creation of a gestures file involves launching the Gesture Builder application, either on a physical device or emulator, and “drawing” the gestures that will need to be detected by the application. Once the gestures have been designed, the file containing the gesture data can be pulled off the SD card of the device or emulator and added to the application project. Within the application code, the file is then loaded into an instance of the GestureLibrary class where it can be used to search for matches to any gestures performed by the user on the device display.
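
By way of a preview of the code that will be created later in this chapter, the following is a minimal sketch of this loading step (assuming a gestures file has already been saved as res/raw/gestures within the project, and that the code resides in an activity's onCreate() method):

// Requires android.gesture.GestureLibraries, android.gesture.GestureLibrary
// and android.util.Log to be imported.
GestureLibrary library = GestureLibraries.fromRawResource(this, R.raw.gestures);

if (library.load()) {
    // List the name of each gesture contained in the file
    for (String name : library.getGestureEntries()) {
        Log.i("CustomGestures", "Loaded gesture: " + name);
    }
}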

The GestureOverlayView Class

In order to facilitate the detection of gestures within an application, the Android SDK provides the GestureOverlayView class. This is a transparent view that can be placed over other views in the user interface for the sole purpose of detecting gestures.


Detecting Gestures

Gestures are detected by loading the gestures file created using the Gesture Builder app and then registering a GesturePerformedListener event listener on an instance of the GestureOverlayView class. The enclosing class is then declared to implement both the OnGesturePerformedListener interface and the corresponding onGesturePerformed callback method required by that interface. In the event that a gesture is detected by the listener, a call to the onGesturePerformed callback method is triggered by the Android runtime system.

Identifying Specific Gestures

When a gesture is detected, the onGesturePerformed callback method is called and passed as arguments a reference to the GestureOverlayView object on which the gesture was detected, together with a Gesture object containing information about the gesture.

With access to the Gesture object, the GestureLibrary can then be used to compare the detected gesture to those contained in the gestures file previously loaded into the application. The GestureLibrary reports the probability that the gesture performed by the user matches an entry in the gestures file by calculating a prediction score for each gesture. A prediction score of 1.0 or greater is generally accepted to be a good match between a gesture stored in the file and that performed by the user on the device display.
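
The following fragment is an illustrative sketch of this matching process (assuming a loaded GestureLibrary named gLibrary and a detected Gesture named gesture, both of which will be set up later in this chapter):

// Requires android.gesture.Prediction, android.util.Log and
// java.util.ArrayList to be imported.
ArrayList<Prediction> predictions = gLibrary.recognize(gesture);

// Log the prediction score calculated for every gesture in the file
for (Prediction prediction : predictions) {
    Log.i("CustomGestures", prediction.name + " scored " + prediction.score);
}

// Treat the best match (position 0) as valid only if it scores above 1.0
if (!predictions.isEmpty() && predictions.get(0).score > 1.0) {
    String matchedGesture = predictions.get(0).name;
}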

Adding SD Card Support to an AVD

Before the Gesture Builder app can be used to design gestures on an emulator, the virtual device needs to be configured with a virtual SD Card onto which the gestures file will be saved (this is not necessary when using a physical Android device).

For the purposes of this example, the AVD created in the chapter entitled Creating an Android Virtual Device (AVD) in Android Studio will be modified to add SD Card support using the AVD Manager. This can be launched from within an existing project main window via the Tools -> Android -> AVD Manager menu option or from the welcome screen by selecting Configure -> AVD Manager from the Quick Start menu. Within the resulting Android Virtual Device Manager screen, select an AVD (in this case the Nexus7 emulator) and click on the Edit… button to display the editing dialog as illustrated in Figure 21-1.

Within the configuration dialog, enter a value of 10 MiB into the Size field of the SD Card section of the dialog, and then click on the OK button to commit the changes. The AVD now has virtual SD Card storage available to store the gestures file.


Adding SD Card support to an AVD configuration in Android Studio

Figure 21-1
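
As an alternative approach (not required if the steps above have been followed), a virtual SD Card image can also be created from the command line using the mksdcard tool included with the Android SDK and then attached when launching the emulator. For example, assuming an AVD named Nexus7:

mksdcard 10M sdcard.img
emulator -avd Nexus7 -sdcard sdcard.img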


Building and Running the Gesture Builder Application

The Gesture Builder application is bundled by default with the AVD emulator profile for most versions of the SDK. It is not, however, pre-installed on most physical Android devices. If the utility is pre-installed, it will be listed along with the other apps installed in the device or AVD instance. In the event that it is not installed, the source code for the utility is included amongst the standard Android SDK samples and consequently may be imported as an Android Studio project and compiled and run on any Android device or emulator.

To install and build the GestureBuilder utility, begin by installing the SDK samples. To do this, open the Android SDK Manager by selecting the Tools -> Android -> SDK Manager menu bar option from the project main window.

Once the manager has loaded, locate the Samples for SDK package located beneath the section for the Android version for which you are currently developing (for example Android 4.4.2 (API 19)). If the package is not already installed, set the checkbox next to the package and click on the Install 1 Package button to initiate the installation.

Once the installation is complete, the SDK samples will be installed in the following directory (where <path_to_installation> represents the location on your system where Android Studio was originally installed):

<path_to_installation>/sdk/samples/android-19

The source code for the Gesture Builder application is located within this directory in a folder named GestureBuilder within the legacy sub-folder.

From the Android Studio main window for an existing project (it is not possible to import a project from the Android Studio welcome screen), select the File -> Import Project… menu option and, within the resulting dialog, navigate to and select the GestureBuilder folder within the SDK samples directory and click on OK. Confirm the destination directory and click on Next followed by Finish to accept the default settings on the final screen. At this point, Android Studio will import the project into the designated folder and convert it to match the Android Studio project file and build structure.

Once imported, install and run the GestureBuilder utility on an Android device attached to the development system.

Creating a Gestures File

Once the Gesture Builder application has loaded, it should indicate that no gestures have yet been created. To create a new gesture, click on the Add gesture button located at the bottom of the device screen, enter the name Circle Gesture into the Name text box and then “draw” a gesture using a circular motion on the screen as illustrated in Figure 21-2. Assuming that the gesture appears as required (represented by the yellow line on the device screen), click on the Done button to add the gesture to the gestures file:


Drawing a gesture in the Android Gesture Builder app

Figure 21-2


After the gesture has been saved, the Gesture Builder app will display a list of currently defined gestures, which, at this point, will consist solely of the new Circle Gesture.

Repeat the gesture creation process to add a further gesture to the file. This should be a two-stroke gesture, named X Gesture, that draws an X on the screen. When creating gestures involving multiple strokes, be sure to allow as little time as possible between each stroke so that the builder knows that the strokes are part of the same gesture. Once this gesture has been added, the list within the Gesture Builder application should resemble that outlined in Figure 21-3:


The list of gestures in the Android Gesture Builder app

Figure 21-3


Extracting the Gestures File from the SD Card

As each gesture was created within the Gesture Builder application, it was added to a file named gestures located on the SD Card of the emulator or device on which the app was running. Before this file can be added to an Android Studio project, however, it must first be pulled off the SD Card and saved to the local file system. This is most easily achieved by using the adb command-line tool. Open a Terminal or Command Prompt window, therefore, and execute the following command:

adb devices

In the event that the adb command is not found, refer to Setting up an Android Studio Development Environment for guidance on adding this to the PATH environment variable of your system.

Once executed, the command will list all active physical devices and AVD instances attached to the system. The following output, for example, indicates that both a physical device and one AVD emulator have been detected on the development computer system:

List of devices attached
emulator-5554   device
74CE000600000001        device

In order to pull the gestures file from the emulator in the above example and place it into the current working directory of the Terminal or Command Prompt window, the following command would need to be executed:

adb -s emulator-5554 pull /sdcard/gestures .

Alternatively, the gestures file can be pulled from a device connected via adb using the following command (where the -d flag is used to indicate a physical device):

adb -d pull /sdcard/gestures .

Once the gestures file has been created and pulled off the SD Card, it is ready to be added to an Android Studio project as a resource file. The next step, therefore, is to create a new project.

Creating the Example Project

Launch Android Studio and select the appropriate option to create a new project. Within the New Project dialog, enter CustomGestures as both the application and module names. Declare a suitable package name, or use com.ebookfrenzy.customgestures.customgestures to match the code listings in this chapter.

Set the minimum SDK menu to API 8: Android 2.2 (Froyo), set the Target SDK and Compile with menus to the latest SDK available (this tutorial assumes Android 4.4 KitKat), and verify that the Create activity option is selected.

On the subsequent screens, select the Blank Activity option and name the activity CustomGesturesActivity with a corresponding layout named activity_custom_gestures.

Click on the Finish button to initiate the project creation process.

Adding the Gestures File to the Project

Within the Android Studio Project tool window, locate and right-click on the res folder (located under CustomGestures -> src -> main) and select New -> Directory from the resulting menu. In the New Directory dialog, enter raw as the folder name and click on the OK button. Using the appropriate file explorer utility for your operating system type, locate the gestures file previously pulled from the SD Card and drag and drop it into the new raw folder in the Project tool window.

When the Move dialog appears, click on the OK button to move the file into the project.

Designing the User Interface

This example application calls for a very simple user interface consisting of a LinearLayout view with a GestureOverlayView layered on top of it to intercept any gestures performed by the user. Locate the CustomGestures -> src -> main -> res -> layout -> activity_custom_gestures.xml file and double click on it to load it into the Designer tool.

By default, Android Studio has provided a RelativeLayout component as the root element of the user interface layout, so this will need to be deleted and replaced with a LinearLayout. Locate the RelativeLayout instance in the Component Tree, right-click on it and select the Delete option from the popup menu (Figure 21-4).


Deleting a view from the Android Studio Component Tree panel

Figure 21-4


Once the RelativeLayout manager has been removed from the layout, click on the LinearLayout (Vertical) item from the Layouts section of the Designer Palette and drag and drop it onto the device screen layout. Switch the Designer tool to Text mode using the Text tab along the bottom edge of the panel and verify that the XML for the layout matches that in the following listing:

<LinearLayout
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    xmlns:android="http://schemas.android.com/apk/res/android">

</LinearLayout>

Return to Design mode, locate the Expert section of the Palette and drag and drop a GestureOverlayView object onto the layout canvas. Select the GestureOverlayView instance in the layout and use the Properties panel or Designer toolbar buttons to change the layout:width and layout:height properties to match_parent so that the view fills the available space.

Double click on the GestureOverlayView instance and use the popup property panel to change the ID to @+id/gOverlay. When completed, the activity_custom_gestures.xml file should read as follows:

<LinearLayout
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:weightSum="1">

    <android.gesture.GestureOverlayView
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:id="@+id/gOverlay"
        android:layout_gravity="center_horizontal">
        </android.gesture.GestureOverlayView>
</LinearLayout>

Loading the Gestures File

Now that the gestures file has been added to the project, the next step is to write some code so that the file is loaded when the activity starts up. For the purposes of this project, the code to achieve this will be placed in the onCreate() method of the CustomGesturesActivity class located in the CustomGesturesActivity.java source file as follows:

package com.ebookfrenzy.customgestures.customgestures;

import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.GestureOverlayView.OnGesturePerformedListener;

public class CustomGesturesActivity extends ActionBarActivity implements OnGesturePerformedListener {

    private GestureLibrary gLibrary;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_custom_gestures);

        gLibrary =
                GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!gLibrary.load()) {
            finish();
        }
    }
.
.
.
}

In addition to some necessary import directives, the above code modifies the onCreate() method to create a GestureLibrary instance named gLibrary and load into it the contents of the gestures file located in the raw resources folder. The activity class has also been modified to implement the OnGesturePerformedListener interface, which requires the implementation of the onGesturePerformed callback method (to be created in a later section of this chapter).

Registering the Event Listener

In order for the activity to receive notification that the user has performed a gesture on the screen, it is necessary to register the OnGesturePerformedListener event listener on the gOverlay view, a reference to which can be obtained using the findViewById method as outlined in the following code fragment:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_custom_gestures);

    gLibrary =
            GestureLibraries.fromRawResource(this, R.raw.gestures);
    if (!gLibrary.load()) {
        finish();
    }

    GestureOverlayView gOverlay =
            (GestureOverlayView) findViewById(R.id.gOverlay);
    gOverlay.addOnGesturePerformedListener(this);
}

Implementing the onGesturePerformed Method

All that remains before an initial test run of the application can be performed is to implement the onGesturePerformed callback method. This is the method which will be called when a gesture is performed on the GestureOverlayView instance:

package com.ebookfrenzy.customgestures.customgestures;

import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.GestureOverlayView.OnGesturePerformedListener;
import android.gesture.Prediction;
import android.widget.Toast;
import android.gesture.Gesture;
import java.util.ArrayList;

public class CustomGesturesActivity extends ActionBarActivity implements OnGesturePerformedListener {

    private GestureLibrary gLibrary;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_custom_gestures);

        gLibrary =
                GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!gLibrary.load()) {
            finish();
        }

        GestureOverlayView gOverlay =
                (GestureOverlayView) findViewById(R.id.gOverlay);
        gOverlay.addOnGesturePerformedListener(this);
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay,
            Gesture gesture) {
        ArrayList<Prediction> predictions =
                gLibrary.recognize(gesture);

        if (predictions.size() > 0 && predictions.get(0).score > 1.0) {

            String action = predictions.get(0).name;

            Toast.makeText(this, action, Toast.LENGTH_SHORT).show();
        }
    }
.
.
.
}

When a gesture on the gesture overlay view object is detected by the Android runtime, the onGesturePerformed method is called. Passed through as arguments are a reference to the GestureOverlayView object on which the gesture was detected together with an object of type Gesture. The Gesture class is designed to hold the information that defines a specific gesture (essentially a sequence of timed points on the screen depicting the path of the strokes that comprise a gesture).
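
For example, the stroke data held by a Gesture object can be inspected as follows (a sketch for illustration only, assuming android.gesture.GestureStroke and android.util.Log have been imported):

// Each GestureStroke stores its points as a flat float array of
// alternating x and y coordinates
for (GestureStroke stroke : gesture.getStrokes()) {
    int pointCount = stroke.points.length / 2;
    Log.i("CustomGestures", "Stroke containing " + pointCount + " points");
}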

The Gesture object is passed through to the recognize() method of our gLibrary instance, the purpose of which is to compare the current gesture with each gesture loaded from the gestures file. Once this task is complete, the recognize() method returns an ArrayList object containing a Prediction object for each comparison performed. The list is ranked in order from the best match (at position 0 in the array) to the worst. Contained within each prediction object is the name of the corresponding gesture from the gestures file and a prediction score indicating how closely it matches the current gesture.

The code in the above method, therefore, takes the prediction at position 0 (the closest match), verifies that it has a score greater than 1.0, and then displays a Toast message (an Android class designed to display pop-up notifications to the user) containing the name of the matching gesture.

Testing the Application

Build and run the application on either an emulator or a physical Android device and perform the circle and X gestures on the display. When a gesture is recognized, a toast notification should appear containing its name. Note, however, that the X Gesture is not recognized, since the overlay is currently configured to detect only single-stroke gestures. Also note that when a gesture is recognized, it is outlined on the display with a bright yellow line, whilst gestures about which the overlay is uncertain appear as a faded yellow line. Whilst useful during development, this is probably not ideal for a real world application. Clearly, therefore, there is still some more configuration work to do.

Configuring the GestureOverlayView

By default, the GestureOverlayView is configured to display yellow lines during gestures and to recognize only single-stroke gestures. Multi-stroke gestures can be detected by setting the android:gestureStrokeType property to multiple.

Similarly, the color used to draw recognized and unrecognized gestures can be defined via the android:gestureColor and android:uncertainGestureColor properties. For example, to hide the gesture lines and recognize multi-stroke gestures, modify the activity_custom_gestures.xml file in the example project as follows:

<LinearLayout
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:weightSum="1">

    <android.gesture.GestureOverlayView
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:id="@+id/gOverlay"
        android:layout_gravity="center_horizontal"
        android:gestureColor="#00000000"
        android:uncertainGestureColor="#00000000"
        android:gestureStrokeType="multiple" >
    </android.gesture.GestureOverlayView>
</LinearLayout>

On re-running the application, gestures should now be invisible (since they are now drawn using a fully transparent color).
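
Note that these settings can also be applied in code rather than XML. The following is a minimal sketch, assuming the gOverlay reference obtained in the onCreate() method and an import of android.graphics.Color:

// Hide gesture trails and enable multi-stroke gesture detection
gOverlay.setGestureColor(Color.TRANSPARENT);
gOverlay.setUncertainGestureColor(Color.TRANSPARENT);
gOverlay.setGestureStrokeType(GestureOverlayView.GESTURE_STROKE_TYPE_MULTIPLE);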

Intercepting Gestures

The GestureOverlayView is, as previously described, a transparent overlay that may be positioned over the top of other views. This leads to the question as to whether events intercepted by the gesture overlay should then be passed on to the underlying views when a gesture has been recognized. This is controlled via the android:eventsInterceptionEnabled property of the GestureOverlayView instance. When set to true, the gesture events are not passed to the underlying views when a gesture is recognized. This can be a particularly useful setting when gestures are being performed over a view that might be configured to scroll in response to certain gestures. Setting this property to true will avoid gestures also being interpreted as instructions to the underlying view to scroll in a particular direction.
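
As with the other overlay settings, this property may also be set programmatically. A minimal sketch, again assuming the gOverlay reference from the example project:

// Stop gesture motion events from reaching the views beneath the overlay
gOverlay.setEventsInterceptionEnabled(true);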