Tutorial: Explore the Gestures sample app

The Gestures sample app demonstrates how to process various touch gestures using BlackBerry 10 Native SDK C libraries. Some of the gestures include tap, double-tap, and pinch.

To demonstrate the gestures, the app allows the user to interact with an image in the following ways:

  • Two-finger pan: Adjusts the viewport position in the direction of the pan and prints the coordinates to stderr
  • Pinch: Adjusts the viewport size in the direction of the pinch and prints the new size to stderr
  • Tap: Prints the tap coordinates to stderr
  • Double-tap: Prints the double-tap coordinates to stderr

When a user interacts with the screen with one or more fingers, the app receives the gestures as mtouch events and handles them accordingly.

You will learn to:

  • Create a gesture callback function
  • Register the gesture callback function
  • Detect and clean up gestures

Device image showing the Gestures sample app.

Downloading the full source code

You can download the Gestures sample app from GitHub. Before we start exploring the app, import it into the Momentics IDE and try it out. To learn how, see Importing and exporting projects.

As we explore the different areas of the app, you can follow along in the code in the IDE. The app includes comments that provide additional information to help you understand what's happening in the code.


If you view the properties for the project, you can see the libraries that are required to build and run the app.

Create and register the callback function

When a gesture set receives an event, it invokes a user-defined callback function, where the app-specific behavior is defined. The app must register a callback function with the gestures library for every gesture set to be handled.

Create the callback function

The Gestures sample app includes a gesture_callback() function that defines what the app does when a gesture occurs. This function must have the following signature:

void gesture_callback(gesture_base_t* gesture, mtouch_event_t* event, 
        void* param, int async)

When the gestures library invokes this function, gesture contains information about the gesture, event contains information about the touch event that caused the gesture, param contains the user data pointer that was supplied when the gesture was registered, and async indicates whether the callback was invoked from an event (0) or from a timer (1).

In this function, a switch statement defines the choice of actions the app takes based on the gesture received.

switch (gesture->type) {
   // ...
   case GESTURE_PINCH: {
      // ...

For a two-finger pan, the app copies information from the incoming gesture to a local gesture_tfpan_t structure and uses the information to adjust the position of the viewport on the display. In this case, the viewport is adjusted to follow the direction of the pan by determining the distance of the swipe and adding it to the current x and y position of the viewport.

gesture_tfpan_t* tfpan = (gesture_tfpan_t*)gesture;
fprintf(stderr, "Two finger pan: %d, %d", (tfpan->last_centroid.x - 
        tfpan->centroid.x), (tfpan->last_centroid.y - tfpan->centroid.y));
if (tfpan->last_centroid.x && tfpan->last_centroid.y) {
    viewport_pos[0] += (tfpan->last_centroid.x - tfpan->centroid.x) >> 1;
    viewport_pos[1] += (tfpan->last_centroid.y - tfpan->centroid.y) >> 1;
}

Here, the centroid structure member contains the coordinates of the midpoint between the final two touches and last_centroid contains the coordinates of the midpoint between the previous two touches.

Illustration showing the two-finger pan gesture.

The handle_events() call in the application loop in main() runs for every frame and updates the viewport on the screen, taking this new position into account.

while (!shutdown) {
   /* Handle user input */
   handle_events();
   // ...
}

After a screen event, handle_events() re-draws the screen by applying the new position and size to the window (screen_win here is the app's window handle):

// Re-draw the screen after a screen event
screen_set_window_property_iv(screen_win,
      SCREEN_PROPERTY_SOURCE_POSITION, viewport_pos);
screen_set_window_property_iv(screen_win,
      SCREEN_PROPERTY_SOURCE_SIZE, viewport_size);

For a pinch, in the gestures callback function, the app copies the gesture information into a local gesture_pinch_t structure and uses it to adjust the size of the viewport. The amount to adjust the size of the viewport is equal to the change in distance between the two fingers at the start of the pinch and at the end of the pinch.

gesture_pinch_t* pinch = (gesture_pinch_t*)gesture;

fprintf(stderr,"Pinch %d, %d", 
      (pinch->last_distance.x - pinch->distance.x), 
      (pinch->last_distance.y - pinch->distance.y));

int dist_x = pinch->distance.x;
int dist_y = pinch->distance.y;
int last_dist_x = pinch->last_distance.x;
int last_dist_y = pinch->last_distance.y;

The distance structure member is the distance between the current location of the fingers and last_distance is the distance between the previous location.

Illustration showing the pinch gesture.

The app calculates the relative distance between the finger locations to determine the net increase or decrease of the viewport's size. If both relative distances are nonzero, the app adjusts the viewport size.

int reldist = sqrt((dist_x)*(dist_x) + (dist_y)*(dist_y));
int last_reldist = sqrt((last_dist_x)*(last_dist_x) + 
      (last_dist_y)*(last_dist_y));

if (reldist && last_reldist) {
   viewport_size[0] += (last_reldist - reldist) >> 1;
   viewport_size[1] += (last_reldist - reldist) >> 1;
}

If the new viewport size falls outside the defined limits, the app clamps it to those limits.

if (viewport_size[0] < MIN_VIEWPORT_SIZE) {
    viewport_size[0] = MIN_VIEWPORT_SIZE;
} else if (viewport_size[0] > MAX_VIEWPORT_SIZE) {
    viewport_size[0] = MAX_VIEWPORT_SIZE;
}
if (viewport_size[1] < MIN_VIEWPORT_SIZE) {
    viewport_size[1] = MIN_VIEWPORT_SIZE;
} else if (viewport_size[1] > MAX_VIEWPORT_SIZE) {
    viewport_size[1] = MAX_VIEWPORT_SIZE;
}

If both viewport size values are strictly within the limits, the app shifts the viewport position so that the zoom stays centered on the image:

if (viewport_size[0] > MIN_VIEWPORT_SIZE 
      && viewport_size[1] > MIN_VIEWPORT_SIZE 
      && viewport_size[0] < MAX_VIEWPORT_SIZE 
      && viewport_size[1] < MAX_VIEWPORT_SIZE) {
   viewport_pos[0] -= (last_reldist - reldist) >> 2;
   viewport_pos[1] -= (last_reldist - reldist) >> 2;
}

For tap and double-tap events, the app prints the coordinates of the tap location:

   gesture_tap_t* tap = (gesture_tap_t*)gesture;
   fprintf(stderr, "Tap x:%d y:%d", tap->touch_coords.x,
           tap->touch_coords.y);

   gesture_double_tap_t* d_tap = (gesture_double_tap_t*)gesture;
   fprintf(stderr, "Double tap first_x:%d first_y:%d second_x:%d "
           "second_y:%d", d_tap->first_touch.x,
           d_tap->first_touch.y, d_tap->second_touch.x,
           d_tap->second_touch.y);

Register the callback function

The app must register the callback function with the gestures library for it to be invoked when a gesture occurs. The init_gestures() function in the sample app contains the following code:

struct gestures_set * set;

set = gestures_set_alloc();

This call allocates a gesture set for the application and stores a pointer to it in set.

Next, the app allocates and initializes the handling of the different types of gestures, passing in the callback function to register.

if (NULL != set) {
    tap_gesture_alloc(NULL, gesture_callback, set);
    double_tap_gesture_alloc(NULL, gesture_callback, set);
    tfpan_gesture_alloc(NULL, gesture_callback, set);
    pinch_gesture_alloc(NULL, gesture_callback, set);
} else {
    fprintf(stderr, "Failed to allocate gestures set\n");
}

Detect gestures and clean up

The app needs a way to trigger the gesture callback function when a touch event occurs on the screen. The app must call gestures_set_process_event() when a touch event is detected and needs to be handled. In the sample app, this function is called from handle_screen_event(), which is called from handle_events() when a screen event is detected in the main app loop.

static void
handle_screen_event(bps_event_t *event)
{
    int screen_val, rc;

    screen_event_t screen_event = screen_event_get_event(event);
    mtouch_event_t mtouch_event;
    rc = screen_get_event_property_iv(screen_event, 
            SCREEN_PROPERTY_TYPE, &screen_val);
    if (screen_val == SCREEN_EVENT_MTOUCH_TOUCH 
            || screen_val == SCREEN_EVENT_MTOUCH_MOVE 
            || screen_val == SCREEN_EVENT_MTOUCH_RELEASE) {
        rc = screen_get_mtouch_event(screen_event, &mtouch_event, 0);
        if (rc) {
            fprintf(stderr, "Error: failed to get mtouch event\n");
        }
        rc = gestures_set_process_event(set, &mtouch_event, NULL);
        // ...
    }
}

In this function, a gesture event means either a touch, move, or release event. If it is one of these events, the app calls gestures_set_process_event().

If the call to gestures_set_process_event() doesn't trigger any callbacks, the function returns 0. This case occurs when a touch, move, or release event was detected but there was no associated gesture. The following code treats this case as a pan and adjusts the viewport position accordingly:

if (!rc) {
   if (mtouch_event.contact_id == 0) {
      if (last_touch[0] && last_touch[1]) {
         fprintf(stderr,"Pan %d %d\n",
               (last_touch[0] - mtouch_event.x),
               (last_touch[1] - mtouch_event.y));
         viewport_pos[0] += (last_touch[0] - mtouch_event.x) >> 1;
         viewport_pos[1] += (last_touch[1] - mtouch_event.y) >> 1;
      }
      last_touch[0] = mtouch_event.x;
      last_touch[1] = mtouch_event.y;
   }
}

Clean up gestures

Now that the app can perform actions based on gestures, it needs to clean up after itself before closing. It's always a good idea to free any memory associated with the gestures data structures. In gestures_cleanup(), the following lines free the memory associated with the gesture set:

if (NULL != set) {
    gestures_set_free(set);
    set = NULL;
}

That's it! The app can now set up the gestures library, process gesture events, and clean up its resources when it stops running.

Last modified: 2015-03-31
