Recording audio

Recording audio is one of the fundamental tasks that many multimedia apps perform. With Cascades, your app can record audio by using the AudioRecorder class and a few lines of code.

Permissions

Before your app can start recording audio, there are two permissions that you must add to your bar-descriptor.xml file. The first permission is the record_audio permission, which allows your app to access the device microphone to record audio. The second permission is the access_shared permission, which lets your app access files that are located in the shared areas of the device.

For more information about adding permissions to your bar-descriptor.xml file, see The bar-descriptor.xml file.

For more information about the shared data areas of the device, see Working with the file system.

Your app must declare an AudioRecorder object to record audio, and then you must set an output target for that object. The output target is the name of a file where the recording will be saved.

Flow and states

The architectural flow diagram shown below illustrates the various states of the AudioRecorder. The AudioRecorder must proceed through these states before reaching the Started state where it can record content.

Unprepared state

The AudioRecorder begins in an Unprepared state. In the Unprepared state, the AudioRecorder cannot record content because it doesn't have the necessary resources, such as exclusive access to the recorder or free space allocated in the file system, to do so.

Prepared and Started states

When the AudioRecorder acquires the resources it needs to record content, it's in the Prepared state. The recorder usually moves quickly from the Prepared state into the Started state, where it can record media content. Your app can go from the Unprepared state directly to the Started state and begin recording content. In this case, it may appear that the Prepared state was skipped, but it isn't skipped; it just occurs very briefly.

AudioRecorder architectural flow diagram
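As a sketch, your app can observe these state transitions by connecting a slot to the recorder's mediaStateChanged() signal. The class name RecorderController and the slot name below are hypothetical; the MediaState enumerators are those used by the bb::multimedia library, but verify them against the API reference.

```cpp
#include <bb/multimedia/AudioRecorder.hpp>
#include <bb/multimedia/MediaState.hpp>
#include <QDebug>

using namespace bb::multimedia;

// Hypothetical slot, connected to the recorder's
// mediaStateChanged() signal, that logs each state transition:
// Unprepared -> Prepared -> Started
void RecorderController::onMediaStateChanged(bb::multimedia::MediaState::Type state)
{
    switch (state) {
    case MediaState::Unprepared:
        qDebug() << "Unprepared: no recording resources held";
        break;
    case MediaState::Prepared:
        qDebug() << "Prepared: resources acquired, ready to record";
        break;
    case MediaState::Started:
        qDebug() << "Started: recording in progress";
        break;
    default:
        break;
    }
}
```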

Setting up a recording app

The AudioRecorder class has no visual element, which means that you must create a UI to allow your app to record audio. The NowPlayingConnection class doesn't have a visual element either, but it's capable of sending data about the media to the volume overlay where it can be displayed to the user.

When using AudioRecorder and NowPlayingConnection, you're responsible for connecting all signals and slots to your UI controls. When you've added controls to your UI and connected your signals and slots to them, then you can begin to put in the code that makes everything work.

First, you must create an AudioRecorder object and use it to call the functions provided by AudioRecorder, including the following:
  • prepare(): Your app can call this function to acquire the necessary resources for recording media content, without actually recording a track. When prepare() is called, the recorder acquires the necessary resources to record the track and then emits the prepared() signal.
  • record(): Your app can call this function to begin recording your track. If your app calls this function without first calling prepare(), then the recorder will call prepare() automatically. When recording begins, the recorder emits the recording() signal.
  • pause(): Your app can call this function to pause the recording procedure. When the recording is paused, the recorder emits the paused() signal. Calling this function while the recording is already paused does nothing.
  • reset(): Your app can call this function to release any resources that are currently held by the recorder and move the recorder into the Unprepared state. A call to reset() causes the recorder to emit the mediaStateChanged() signal, which can be used to notify your app that the recorder is in the Unprepared state and no longer holds the resources required to record.
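The functions above can be sketched in C++ as follows. The class name RecorderController, the member m_recorder, and the slot names are hypothetical; the AudioRecorder calls and signals are those described above.

```cpp
#include <bb/multimedia/AudioRecorder.hpp>

using namespace bb::multimedia;

// A minimal sketch: prepare the recorder ahead of time, then record,
// pause, and reset. Assumes m_recorder is an AudioRecorder member of
// the hypothetical RecorderController class.
void RecorderController::init()
{
    // Start recording only after the resources are acquired
    connect(&m_recorder, SIGNAL(prepared()),
            this, SLOT(onPrepared()));

    m_recorder.setOutputUrl(
        QUrl("file:///accounts/1000/shared/misc/recording.m4a"));
    m_recorder.prepare();   // acquire resources without recording
}

void RecorderController::onPrepared()
{
    m_recorder.record();    // emits recording() when recording begins
}

void RecorderController::pauseRecording()
{
    m_recorder.pause();     // emits paused(); does nothing if already paused
}

void RecorderController::stopRecording()
{
    // Release resources and return to the Unprepared state;
    // emits mediaStateChanged()
    m_recorder.reset();
}
```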

The following code samples assume that you have a UI that contains a set of Button controls, which can be used to call the AudioRecorder functions needed to start or stop audio recording.

Here's a QML code sample that shows you how to set the path (and file name) for where the recording will be saved, and then begin recording. In this case, the recording is saved in a file called recording.m4a within the misc folder of the shared area on the device. The code sample uses signals and slots to start and stop the recording by responding to each button control's clicked() signal. When the user clicks a button called btnRecord, its onClicked slot is called and the recording begins by calling the recorder.record() function. When the user clicks a button called btnStop, its onClicked slot is called and the recording is stopped by calling the recorder.reset() function.

import bb.multimedia 1.2

// ... 
    
attachedObjects: [ 
    AudioRecorder { 
        id: recorder 
        outputUrl: "file:///accounts/1000/shared/misc/recording.m4a" 
    } 
]

// ...

Button {
    id: btnRecord
    text: "Record"
    
    onClicked: { 
        recorder.record(); 
    }
}

Button {
    id: btnStop
    text: "Stop"
    
    onClicked: { 
        recorder.reset(); 
    }
}

You must call the setOutputUrl() function, and set a valid path to the location where you want the recorder to save your recording, before calling the prepare() function. This path should point to the location of a local file on the device. The setOutputUrl() function takes a QUrl as its only parameter. This parameter represents the path to the target file where the recording is saved.
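For example, as a brief C++ sketch (the recorder here is a standalone bb::multimedia::AudioRecorder for illustration):

```cpp
#include <bb/multimedia/AudioRecorder.hpp>
#include <QUrl>

using namespace bb::multimedia;

AudioRecorder recorder;

// The output URL must point to a valid local file path, and must be
// set before prepare() is called
recorder.setOutputUrl(QUrl("file:///accounts/1000/shared/misc/recording.m4a"));
recorder.prepare();
```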

The recording process can be started by calling the record() function, and it can be stopped by calling the reset() function. Using button control signals and signal handlers is a good way to start and stop audio recordings. For example, when the user clicks a button called btnRecord, its btnRecordOnClick() handler is called, and audio recording begins by calling the record() function. When the user clicks a button called btnStop, its btnStopOnClick() handler is called, and audio recording is stopped by calling the reset() function.

Here is a C++ code sample that shows you how to use your app's constructor to connect each button control's clicked() signal to a signal handler to start or stop audio recording. We also show you how to verify that the signal and handler connections were made successfully. In the signal handler called btnRecordOnClick(), we set the path (and file name) for where the recording will be saved, and start the audio recording process. In the signal handler called btnStopOnClick(), we show you how to stop the audio recording.

ApplicationUI::ApplicationUI()
{
    //...
    
    bool result;
    Q_UNUSED(result);
    
    result = connect(btnRecord, SIGNAL(clicked()), 
                     this, SLOT(btnRecordOnClick()));
                  
    Q_ASSERT(result);
    
    result = connect(btnStop, SIGNAL(clicked()), 
                     this, SLOT(btnStopOnClick()));
    
    Q_ASSERT(result);
    
    // ...
}

// ...

void ApplicationUI::btnRecordOnClick()
{
    // Set the path to the location of the recording. Note that
    // m_recorder is an AudioRecorder member of ApplicationUI; a local
    // AudioRecorder would be destroyed when this function returns,
    // which would stop the recording.
    m_recorder.setOutputUrl(QUrl("file:///accounts/1000/shared/misc/recording.m4a")); 
    m_recorder.record();
}

void ApplicationUI::btnStopOnClick()
{
    // Stop the recorder and release its resources
    m_recorder.reset();
}

// ...

For more information on supported media formats for your recordings, see BlackBerry 10 media support.

Last modified: 2014-02-25
