
Augmented reality

When you point the camera lens of your BlackBerry device at something, the image you see on the screen is a digital representation of that thing. You can create applications that overlay text and graphics on that digital image to augment your view of the thing in the real world.

For example, if you point the camera at a street lined with restaurants, your application could draw different numbers of dollar signs on each restaurant to indicate the relative cost of eating at each restaurant. The restaurants in the example are referred to as points of interest or POIs. One way to think of this is as a quick and convenient way of inputting search criteria and presenting the results in the context of the real world.

In the previous example, the search criteria would be restaurants on that street. Unlike a conventional search, your application automatically generates the list of restaurants of interest instead of requiring you to input them. The application uses the GPS radio on the device to determine the latitude and longitude of the camera and the magnetometer and accelerometer on the device to determine the orientation of the camera.

Using both pieces of information, your application can query a geographical information database to determine the street that is visible in the field of view of the camera and the names of the restaurants on that street. After it determines the names of the restaurants, it can make another database query to determine the average meal prices at those restaurants. Next, it can present that information graphically by drawing it on locations on the display screen that correspond to each of the visible restaurants.

Most augmented reality applications need to perform two basic tasks: determining whether anything of interest is currently visible on the camera screen, and drawing on the camera screen to augment the view accordingly. The following sections provide more information about these two primary augmented reality tasks.

Determine if a POI is within the camera field of view

Before you begin: The BlackBerry device must have a GPS radio and a magnetometer and accelerometer.

Determining if a point of interest (POI) is currently visible in the camera viewport is a fundamental task of any augmented reality application. After determining that a POI is visible in a camera viewport, you then need to determine where on the screen the POI is located.

  1. Retrieve the latitude, longitude, and altitude of the BlackBerry device.
  2. Query an offline or online database for points of interest that are within a specified distance from the device.
  3. Convert device location coordinates to Cartesian coordinates.
    • Convert device location coordinates to spherical coordinates.
    • Convert device spherical coordinates to Cartesian coordinates.
  4. If required, convert POI location coordinates to Cartesian coordinates.
  5. Determine the orientation of the device.
  6. Construct a vector normal to the direction of the camera lens.
  7. Construct vectors with origins at each of the POIs that point towards the camera lens.
  8. Determine the angle between each POI vector and the camera normal vector.

POI vectors that form an angle with the camera normal vector smaller than the field of view (FOV) angle of the camera are within the camera's field of view.
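The coordinate conversion and angle test in the steps above can be sketched in plain Java. This is a minimal sketch, not part of the BlackBerry API: the class name FovCheck, the helper method names, and the Earth radius value are illustrative, the Earth is treated as a sphere (altitude and the geodetic shape are ignored), and the vector from the camera toward each POI is compared against the camera normal vector.

```java
public class FovCheck
{
    // Mean Earth radius in meters (illustrative spherical approximation).
    private static final double EARTH_RADIUS_M = 6371000.0;

    // Step 3: convert latitude/longitude (degrees) to Cartesian coordinates (meters).
    static double[] toCartesian(double latDeg, double lonDeg)
    {
        double lat = Math.toRadians(latDeg);
        double lon = Math.toRadians(lonDeg);
        return new double[] {
            EARTH_RADIUS_M * Math.cos(lat) * Math.cos(lon),
            EARTH_RADIUS_M * Math.cos(lat) * Math.sin(lon),
            EARTH_RADIUS_M * Math.sin(lat)
        };
    }

    // Steps 7 and 8: angle (radians) between the vector from the camera
    // to a POI and the camera normal vector.
    static double angleBetween(double[] cam, double[] poi, double[] cameraNormal)
    {
        double[] toPoi = { poi[0] - cam[0], poi[1] - cam[1], poi[2] - cam[2] };
        double dot = toPoi[0] * cameraNormal[0]
                   + toPoi[1] * cameraNormal[1]
                   + toPoi[2] * cameraNormal[2];
        double cos = dot / (norm(toPoi) * norm(cameraNormal));
        // Clamp to [-1, 1] to guard against floating-point rounding.
        return Math.acos(Math.max(-1.0, Math.min(1.0, cos)));
    }

    static double norm(double[] v)
    {
        return Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }

    // A POI is within the field of view when its angle is smaller
    // than the FOV angle of the camera.
    static boolean inFieldOfView(double angle, double fovRadians)
    {
        return angle < fovRadians;
    }
}
```

For example, a POI a short distance due east of a camera whose normal vector points east produces a near-zero angle, so the POI passes the FOV test.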

Retrieving the location of a BlackBerry device

You can retrieve the location of a BlackBerry device by specifying a single GPS fix, or by specifying a location listener to retrieve continuous GPS fixes.

Code sample: Retrieving the GPS location of a BlackBerry device

import javax.microedition.location.*;

//Create a class and a constructor. 
public class handleGPS
{

    //Declare static fields in the class. 
    static GPSThread gpsThread;
    static double latitude;
    static double longitude;

    public handleGPS()
    {

        //In the constructor, create and start a local thread. 
        gpsThread = new GPSThread();
        gpsThread.start();
    }

    //In the class, create a private class that extends Thread, 
    //and create a run() method. 
    private static class GPSThread extends Thread
    {
        public void run()
        {

           //In the run() method, create an instance of the Criteria class.
           //Invoke setCostAllowed(false) to specify that it is in autonomous mode. 
            Criteria myCriteria = new Criteria();
            myCriteria.setCostAllowed(false);

            //In the run() method, create a try/catch block. In the block create a 
            //LocationProvider object by getting an instance of the Criteria object. 
            //Create another try/catch block to create a Location object to request the 
            //current location of the BlackBerry device and specify the timeout period 
            //in seconds. When the getLocation() method returns, request the 
            //latitude and longitude coordinates. 
            try
            {
                LocationProvider myLocationProvider =
                    LocationProvider.getInstance(myCriteria);

                try
                {
                    Location myLocation = myLocationProvider.getLocation(300);
                    latitude  = myLocation.getQualifiedCoordinates().getLatitude();
                    longitude = myLocation.getQualifiedCoordinates().getLongitude();
                }
                catch ( InterruptedException iex )
                {
                    return;
                }
                catch ( LocationException lex )
                {
                    return;
                }
            }
            catch ( LocationException lex )
            {
                return;
            }
            return;
        }
    }
}
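The continuous-fix approach mentioned above can be sketched with a JSR-179 LocationListener instead of a single getLocation() call. This is a minimal sketch: the class name ContinuousGPS and the five-second update interval are illustrative choices, and the locationUpdated() callback runs on a platform thread, so any UI updates should be posted to the event thread.

```java
import javax.microedition.location.*;

// Minimal sketch: receive continuous GPS fixes through a LocationListener.
public class ContinuousGPS implements LocationListener
{
    public void start() throws LocationException
    {
        // Request autonomous mode, as in the single-fix sample.
        Criteria myCriteria = new Criteria();
        myCriteria.setCostAllowed(false);

        LocationProvider myLocationProvider =
            LocationProvider.getInstance(myCriteria);

        // Request an update every 5 seconds; -1 selects the
        // default timeout and maximum fix age.
        myLocationProvider.setLocationListener(this, 5, -1, -1);
    }

    public void locationUpdated(LocationProvider provider, Location location)
    {
        if (location.isValid())
        {
            double latitude  = location.getQualifiedCoordinates().getLatitude();
            double longitude = location.getQualifiedCoordinates().getLongitude();
            // Use the new coordinates to update the augmented reality view.
        }
    }

    public void providerStateChanged(LocationProvider provider, int newState)
    {
        // Handle provider state changes, such as TEMPORARILY_UNAVAILABLE.
    }
}
```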

Drawing on a live camera display

To create an augmented reality application, you need to draw on the live camera display. You can use the ComponentCanvas class to draw any type of field on the camera display. For 2-D vector graphics you can use VGField, and for 3-D graphics you can use GLField.

You will typically want the background of your overlaid fields to be transparent. VGField and GLField both support background transparency, which enables you to overlay rich 2-D and 3-D content with transparent backgrounds.

Display text on camera viewport using a LabelField

Import the required classes.

import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;

import net.rim.device.api.system.Display;
import net.rim.device.api.ui.Field;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.LabelField;
import net.rim.device.api.ui.container.AbsoluteFieldManager;
import net.rim.device.api.ui.container.ComponentCanvas;
import net.rim.device.api.ui.container.MainScreen;

Create the application framework by extending the UiApplication class. In main(), create an instance of the new class and invoke enterEventDispatcher() to enable the application to receive events. In the application constructor, invoke pushScreen() to display the custom screen for the application. The OverlayScreen class, which is described in step 3, represents the custom screen.

public class OverlayLabel extends UiApplication
{
    public static void main(String[] args)
    {
        OverlayLabel theApp = new OverlayLabel();       
        theApp.enterEventDispatcher();
    }
    
    public OverlayLabel()
    {        
        pushScreen(new OverlayScreen());
    }    
}

Create the framework for the custom screen by extending the MainScreen class. Declare class variables for the LabelField, VideoControl, Field, and Player objects that you create in the screen's constructor.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
}

In the screen constructor, create instances of AbsoluteFieldManager and ComponentCanvas. Set the width and height of the ComponentCanvas to the dimensions of the display.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(
                        Display.getWidth(),Display.getHeight());
        

    }
}

In a try/catch block, create an instance of the Player class by invoking Manager.createPlayer(), passing in a string that indicates that the player should capture video. Invoke the Player object's realize() method before accessing the associated Control object. Next, invoke Player.getControl() to retrieve the Player object's VideoControl.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(
                        Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer(
                     "capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl(
                            "VideoControl");          
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }

    }
}

After checking that the VideoControl instance is not null, create a viewfinder by invoking VideoControl.initDisplayMode(), passing in a parameter that specifies the UI primitive that displays the video. Cast the returned object to a Field object. Invoke VideoControl.setDisplayFullScreen(), passing in true, to configure the viewfinder to take up the full screen of the device. Invoke VideoControl.setVisible(), passing in true, to display the viewfinder. Lastly, invoke start() to start the player.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(
                  VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }
    }
}

Add the camera field to the AbsoluteFieldManager. Add the ComponentCanvas to the AbsoluteFieldManager, then add the AbsoluteFieldManager to the OverlayScreen.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }
        
        afm.add(_cameraField);
        afm.add(cc);
        add(afm);
        
    }
}

Create an instance of LabelField, specifying the text you want to overlay on the camera viewport. Add the LabelField to the ComponentCanvas.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }
        
        afm.add(_cameraField);
        afm.add(cc);
        add(afm);
        
        _label = new LabelField("Overlaid text");
        cc.add(_label,100,100);
        
        
    }
}

Display text on camera viewport using a VGField

Import the required classes.

import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;

import net.rim.device.api.system.Display;
import net.rim.device.api.ui.Field;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.LabelField;
import net.rim.device.api.ui.container.AbsoluteFieldManager;
import net.rim.device.api.ui.container.ComponentCanvas;
import net.rim.device.api.ui.container.MainScreen;

Create the application framework by extending the UiApplication class. In main(), create an instance of the new class and invoke enterEventDispatcher() to enable the application to receive events. In the application constructor, invoke pushScreen() to display the custom screen for the application. The OverlayScreen class represents the custom screen.

public class OverlayLabel extends UiApplication
{
    public static void main(String[] args)
    {
        OverlayLabel theApp = new OverlayLabel();       
        theApp.enterEventDispatcher();
    }
    
    public OverlayLabel()
    {        
        pushScreen(new OverlayScreen());
    }    
}

Create the custom screen class by extending the MainScreen class. Declare class variables for the LabelField, VideoControl, Field, and Player objects that you create in the screen's constructor.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
}

In the screen constructor, create instances of AbsoluteFieldManager and ComponentCanvas. Set the width and height of the ComponentCanvas to the dimensions of the display.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(
                        Display.getWidth(),Display.getHeight());
        

    }
}

In a try/catch block, create an instance of the Player class by invoking Manager.createPlayer(), passing in a string that indicates that the player should capture video. Invoke the Player object's realize() method before accessing the associated Control object. Next, invoke Player.getControl() to retrieve the Player object's VideoControl.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(
                        Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer(
                     "capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl(
                            "VideoControl");          
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }

    }
}

After checking that the VideoControl instance is not null, create a viewfinder by invoking VideoControl.initDisplayMode(), passing in a parameter that specifies the UI primitive that displays the video. Cast the returned object to a Field object. Invoke VideoControl.setDisplayFullScreen(), passing in true, to configure the viewfinder to take up the full screen of the device. Invoke VideoControl.setVisible(), passing in true, to display the viewfinder. Lastly, invoke start() to start the player.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(
                  VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }
    }
}

Add the camera field to the AbsoluteFieldManager. Add the ComponentCanvas to the AbsoluteFieldManager, then add the AbsoluteFieldManager to the OverlayScreen.

class OverlayScreen extends MainScreen
{
    private LabelField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }
        
        afm.add(_cameraField);
        afm.add(cc);
        add(afm);
        
    }
}

Create an instance of VGTextField, passing in the dimensions of the area to draw in. Add the VGTextField to the ComponentCanvas. The overlaid text itself is drawn in the VGTextField's render() method, which is described below.

class OverlayScreen extends MainScreen
{
    private VGTextField   _label;
    private VideoControl _videoControl;
    private Field        _cameraField;
    private Player       _player; 
    
    
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        
        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            // Handle exceptions.
        }
        
        afm.add(_cameraField);
        afm.add(cc);
        add(afm);
        
        _label = new VGTextField(Display.getWidth(),Display.getHeight());
        cc.add(_label,100,100);
        
        
    }
}

Create the VGTextField class by extending the VGField class. Declare member variables.

public class VGTextField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = 
        new float[] { 1.0f, 1.0f, 1.0f, 0.0f };

     private int _image;
     private int _width;
     private int _height;    
 }

In the constructor of the VGTextField class, call the constructor of the super class, passing in the version of OpenVG to use, as well as a hint to disable surface synchronization. Set the _width and _height member variable values to the corresponding values passed in as parameters.

public class VGTextField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = 
             new float[] { 1.0f, 1.0f, 1.0f, 0.0f };
     private Bitmap _bm;
     private int _image;
     private int _width;
     private int _height;
     
     
    VGTextField(int width, int height)
    {        
        super(VGField.VERSION_1_1,VGField.DISABLE_SURFACE_SYNC_HINT);
        
        _width = width;
        _height = height;   
    }
 }

Override layout() to set the extent of the field to the size of the display.

public class VGTextField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = new float[] { 1.0f, 1.0f, 1.0f, 0.0f };
     private Bitmap _bm;
     private int _image;
     private int _width;
     private int _height;
     
     
    VGTextField(int width, int height)
    {        
        super(VGField.VERSION_1_1,VGField.DISABLE_SURFACE_SYNC_HINT);
        
        _width = width;
        _height = height;   
    }
    
    protected void layout(int width, int height)
    {        
        setExtent(_width, _height);
    }
 }

Override initialize() and cast the passed in vg interface to vg11. Call vgSetfv() to set the background color of the field.

public class VGTextField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = 
        new float[] { 1.0f, 1.0f, 1.0f, 0.0f };
     private int _width;
     private int _height;
     
     
    VGTextField(int width, int height)
    {        
        super(VGField.VERSION_1_1,VGField.DISABLE_SURFACE_SYNC_HINT);
        
        _width = width;
        _height = height;   
    }
    
    protected void layout(int width, int height)
    {        
        setExtent(_width, _height);
    }
     
     protected void initialize(VG vg)
    {
         VG11 vg11 = (VG11)vg;
         vg11.vgSetfv(VG10.VG_CLEAR_COLOR, 4, MY_CLEAR_COLOR, 0);
    }
 }

Override render() to create and display the text using OpenVG. Create an instance of FontSpec. Create an instance of FontFamily, specifying the Andale Mono font, and use setFontFamily() to configure the FontSpec to use it. Set the height of the font to 100 points. Call VGUtils.vgCreateTextAsPath() to construct a path that represents the text "Test". Create a paint object by calling vgCreatePaint(), set its paint type by calling vgSetParameteri(), and make it the current fill paint by calling vgSetPaint(). Draw the text path by calling vgDrawPath(), then call vgDestroyPaint() to clean up.

public class VGTextField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = 
        new float[] { 1.0f, 1.0f, 1.0f, 0.0f };
     private int _width;
     private int _height;
     
     
    VGTextField(int width, int height)
    {        
        super(VGField.VERSION_1_1,VGField.DISABLE_SURFACE_SYNC_HINT);
        
        _width = width;
        _height = height;   
    }
    
    protected void layout(int width, int height)
    {        
        setExtent(_width, _height);
    }
     
     protected void initialize(VG vg)
     {
         VG11 vg11 = (VG11)vg;
         vg11.vgSetfv(VG10.VG_CLEAR_COLOR, 4, MY_CLEAR_COLOR, 0);
     }
     
     protected void render(VG vg)
     {
         VG11 vg11 = (VG11)vg;
         vg11.vgClear(0, 0, this.getWidth(), this.getHeight());
         try
         {
             FontSpec fs = new FontSpec();
             FontFamily ff = FontFamily.forName("Andale Mono");

             fs.setFontFamily(ff);

             fs.setHeight(100);
             int vgTextPath = VGUtils.vgCreateTextAsPath(
                vg11,fs,"Test",null,null);
         
             int vgPaint = vg11.vgCreatePaint();
             vg11.vgSetParameteri(vgPaint, VG11.VG_PAINT_TYPE, 
                                           VG11.VG_PAINT_TYPE_COLOR);
             vg11.vgSetPaint(vgPaint,VG11.VG_FILL_PATH);
             
             vg11.vgDrawPath(vgTextPath,VG11.VG_FILL_PATH);
             vg11.vgDestroyPaint(vgPaint);
         }
         catch(Exception e)
         {
        	 // Handle exceptions.
         }
     } 
 }

Override getPreferredColorBufferSize() to return 32. You must do this to indicate that the field needs a 32-bit color buffer to handle transparency.

public class VGTextField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = 
        new float[] { 1.0f, 1.0f, 1.0f, 0.0f };

     private int _width;
     private int _height;
     
     
    VGTextField(int width, int height)
    {        
        super(VGField.VERSION_1_1,VGField.DISABLE_SURFACE_SYNC_HINT);
        
        _width = width;
        _height = height;   
    }
    
    protected void layout(int width, int height)
    {        
        setExtent(_width, _height);
    }
     
     protected void initialize(VG vg)
     {
         VG11 vg11 = (VG11)vg;
         vg11.vgSetfv(VG10.VG_CLEAR_COLOR, 4, MY_CLEAR_COLOR, 0);
     }
     
     protected void render(VG vg)
     {
         VG11 vg11 = (VG11)vg;
         vg11.vgClear(0, 0, this.getWidth(), this.getHeight());
         try
         {
             FontSpec fs = new FontSpec();
             FontFamily ff = FontFamily.forName("Andale Mono");

             fs.setFontFamily(ff);

             fs.setHeight(100);
             int vgTextPath = VGUtils.vgCreateTextAsPath(vg11,fs,"Test",null,null);

             int vgPaint = vg11.vgCreatePaint();
             vg11.vgSetParameteri(vgPaint, VG11.VG_PAINT_TYPE,
                                           VG11.VG_PAINT_TYPE_COLOR);
             vg11.vgSetPaint(vgPaint,VG11.VG_FILL_PATH);

             vg11.vgDrawPath(vgTextPath,VG11.VG_FILL_PATH);
             vg11.vgDestroyPaint(vgPaint);
         }
         catch(Exception e)
         {
        	 // Handle exceptions.
         }
     }
     
    protected int getPreferredColorBufferSize()
    {
        return 32;
    }      

 }

Code sample: Displaying a VGField on the camera viewport

public class DemoVGField extends VGField
 {
     private static final float[] MY_CLEAR_COLOR = new float[] { 0.6f, 0.8f, 1.0f, 0.3f };
     private Bitmap _bm;
     private int _image;
     private int _width;
     private int _height;
     
     
    DemoVGField(int width, int height)
    {        
        super(VGField.VERSION_1_1,VGField.DISABLE_SURFACE_SYNC_HINT);
        
        _width = width;
        _height = height;   
    }
    
    protected void layout(int width, int height)
    {        
        setExtent(_width, _height);
    }
    
     protected DemoVGField(int version)
     {
         super(version);
     }
     
     protected void initialize(VG vg)
     {
         VG11 vg11 = (VG11)vg;
         vg11.vgSetfv(VG10.VG_CLEAR_COLOR, 4, MY_CLEAR_COLOR, 0);
         _bm = EncodedImage.getEncodedImageResource("BlackBerry.png").getBitmap();
         _image = VGUtils.vgCreateImage(vg11, _bm, true, VG11.VG_IMAGE_QUALITY_BETTER);
     }
     
     protected void render(VG vg)
     {
         VG11 vg11 = (VG11)vg;
         vg11.vgClear(0, 0, this.getWidth(), this.getHeight());
         vg11.vgDrawImage(_image);
     }
     
    protected int getPreferredColorBufferSize()
    {
        return 32;
    }      

 }

Code sample: Displaying a PNG file on a camera viewport

This sample requires a .png file named POIMarker.png. You can either create a .png file with that name or change the file name in the source code. You can make the background of the .png file transparent.

import net.rim.device.api.ui.*;
import net.rim.device.api.system.*;
import net.rim.device.api.ui.component.*;
import net.rim.device.api.ui.container.*;

import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;

public final class OverlayPNG extends UiApplication 
{

    public static void main(String[] args)
    {
        OverlayPNG app = new OverlayPNG();
        app.enterEventDispatcher();        
    }

    public OverlayPNG()
    {
        pushScreen(new OverlayScreen());       
    }    
}


class OverlayScreen extends MainScreen
{
    private BitmapField overlayImage;
    private VideoControl _videoControl;    
    private Field _cameraField;        
    private Player _player;  
    public OverlayScreen()
    {
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        
        ComponentCanvas compCanvas = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        overlayImage = new BitmapField(Bitmap.getBitmapResource("POIMarker.png"));

        try
        {
            _player = javax.microedition.media.Manager.createPlayer("capture://video");
            _player.realize();
            _videoControl = (VideoControl)_player.getControl("VideoControl");

            if (_videoControl != null)
            {
                _cameraField = (Field)_videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
                _videoControl.setDisplayFullScreen(true);                
                _videoControl.setVisible(true);
            }
            _player.start();           
        }
        catch(Exception e)
        {
            //Handle exceptions.
        }
        
        afm.add(_cameraField);
        afm.add(compCanvas);
        compCanvas.add(overlayImage,100,100);
        add(afm);
        
        
    }
}

Code sample: Displaying a heading on a camera viewport

package mypackage;

import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;

import net.rim.device.api.system.Display;
import net.rim.device.api.system.MagnetometerData;
import net.rim.device.api.system.MagnetometerListener;
import net.rim.device.api.system.MagnetometerSensor;
import net.rim.device.api.system.MagnetometerSensor.Channel;
import net.rim.device.api.ui.Field;
import net.rim.device.api.ui.Font;
import net.rim.device.api.ui.FontFamily;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.LabelField;
import net.rim.device.api.ui.container.AbsoluteFieldManager;
import net.rim.device.api.ui.container.ComponentCanvas;
import net.rim.device.api.ui.container.MainScreen;


public class ARHeadingApp extends UiApplication implements MagnetometerListener
{
	private HeadingScreen _screen;
	
    public static void main(String[] args)
    {
        ARHeadingApp theApp = new ARHeadingApp();       
        theApp.enterEventDispatcher();
    }
    
    public ARHeadingApp()
    {        
        _screen = new HeadingScreen();
        pushScreen(_screen);    
        Channel magChannel = MagnetometerSensor.openChannel(this); 
        magChannel.addMagnetometerListener(this); 
    } 
    
	public void onData(MagnetometerData magData) 
	{ 
		_screen.displayData(magData); 
	}
}

class HeadingScreen extends MainScreen
{
	private Player       _player;
	private VideoControl _videoControl;
	private Field        _cameraField;
	private LabelField   _headingLabel;
	 
    public HeadingScreen()
    {            
        setTitle("Augmented Reality Heading Sample");
        initializeCamera();
        AbsoluteFieldManager afm = new AbsoluteFieldManager();
        ComponentCanvas cc = new ComponentCanvas(Display.getWidth(),Display.getHeight());
        afm.add(_cameraField,0,0);
        afm.add(cc);
        add(afm);
        
        _headingLabel = new LabelField("?");
        Font myFont;
        try
        {
	        FontFamily ff = FontFamily.forName("BBCAPITALS");
	        myFont = ff.getFont(FontFamily.SCALABLE_FONT, 42).derive(Font.BOLD);
	        cc.add(_headingLabel,Display.getWidth()/2,30);
	        cc.setFont(myFont);
        }
        catch(Exception e)
        {
        //Handle exceptions.
        }
    }

	private void initializeCamera()
	{
	    try
	    {
	        _player = javax.microedition.media.Manager.createPlayer("capture://video");
	        _player.realize();
	
	        _videoControl = (VideoControl)_player.getControl("VideoControl");
	
	        if (_videoControl != null)
	        {
	            _cameraField = (Field)_videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
	            _videoControl.setDisplayFullScreen(true);                
	            _videoControl.setVisible(true);
	        }
	        _player.start();           
	    }
	    catch(Exception e)
	    {
	        //Handle exceptions.
	    }
	}


	void displayData(MagnetometerData magData) 
	{ 
		_headingLabel.setText(Float.toString(magData.getDirectionBack()));
	}
}