Recording video
To record a video, you need to:
- Set video properties
- Start and stop recording
The Camera libraries don't automatically play audible shutter sounds when a video is recorded. Camera apps must supply their own sound when a video recording begins and ends. Although you can choose to not have a shutter sound when recording a video, you are responsible for ensuring that your app adheres to the local laws of the regions in which you distribute the app. For example, it is illegal to mute or change the shutter sound of a camera app in Japan or South Korea. For more information, see the BlackBerry World Vetting Criteria.
Set video properties
When your app records video, it can access the same camera properties as it does when taking photos. If you're developing with the QML and C++ APIs, you can set these properties using the CameraSettings class. More properties are available when you use the Camera C API. For example, you can use:
- camera_config_videolight(), which configures the video lighting
- camera_set_video_property(), which sets properties such as frame rate and rotation angle
For more information about video properties in the Camera C API, see the Camera image and video API. A short sketch of these calls appears below.
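As a rough illustration, the following sketch (C API) turns on the video light and sets the frame rate and rotation angle. The specific constants and values shown here (CAMERA_VIDEOLIGHT_ON, CAMERA_IMGPROP_FRAMERATE, CAMERA_IMGPROP_ROTATION, 30 fps, 90 degrees) are assumptions for illustration only; verify the exact names and argument types against camera/camera_api.h for your target OS version.

#include <camera/camera_api.h>

camera_error_t err;

/* Turn on the video light, for example for low-light recording.
   CAMERA_VIDEOLIGHT_ON is assumed here; see camera_videolight_t. */
err = camera_config_videolight(handle, CAMERA_VIDEOLIGHT_ON);
if (err != CAMERA_EOK) {
    /* Handle error */
}

/* Set video properties, such as the frame rate and rotation angle.
   The property names and value types are assumptions to verify. */
err = camera_set_video_property(handle,
                                CAMERA_IMGPROP_FRAMERATE, (double)30.0,
                                CAMERA_IMGPROP_ROTATION, 90);
if (err != CAMERA_EOK) {
    /* Handle error */
}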
Although most properties are optional, a few properties are required. If you're using the QML or C++ API, you must set the camera mode to video before starting the viewfinder.
If you're using the Camera C API, you must:
- Set the viewfinder mode to one that supports video recording
- Set the minimum viewfinder properties
The code samples below demonstrate how to set these properties using the QML, C++, and C APIs.
Container {
    Camera {
        id: qmlCameraObj
        attachedObjects: [
            CameraSettings {
                id: cameraModeSetting
            }
        ]
        // ...
        onCameraOpened: {
            // Set video mode for the camera
            qmlCameraObj.getSettings(cameraModeSetting)
            cameraModeSetting.cameraMode = CameraMode.Video
            qmlCameraObj.applySettings(cameraModeSetting)

            // Start the viewfinder
            // ...
        }
        // ...
    }
}
// In your app's header file
CameraSettings* m_pCameraModeSetting;

// In your app's .cpp file
m_pCameraModeSetting = new CameraSettings();
// ...

void ApplicationUI::onCameraOpened()
{
    // Set video mode for the Camera.
    m_pCamera->getSettings(m_pCameraModeSetting);
    m_pCameraModeSetting->setCameraMode(CameraMode::Video);
    m_pCamera->applySettings(m_pCameraModeSetting);

    // Start the viewfinder
    // ...
}
static const char vf_group[] = "viewfinder_window_group";
static const char vf_id[] = "my_viewfinder";

/* Set the viewfinder mode */
if (camera_set_vf_mode(handle, CAMERA_VFMODE_VIDEO) == CAMERA_EOK) {
    /* Set the required viewfinder properties */
    err = camera_set_vf_property(handle,
                                 CAMERA_IMGPROP_WIN_GROUPID, vf_group,
                                 CAMERA_IMGPROP_WIN_ID, vf_id);
    if (err == CAMERA_EOK) {
        /* Start the viewfinder */
        /* ... */
    } else {
        /* Handle error */
    }
}
Start and stop recording
If you're using the QML or C++ API, call startVideoCapture() to start recording and stopVideoCapture() to stop recording. You can use any preferred UI mechanism to trigger video capture. The examples below demonstrate how to use an ActionItem to start or stop recording.
If you're using the C API, call camera_start_video() to start video capture and camera_stop_video() to stop the capture.
If you are developing with QML and C++, you can still use the Camera C API to record video. For an example that uses the C API in a Cascades app, see the Best Camera sample app on GitHub.
Because video recording is a long-running operation, you should monitor for status changes. In QML or C++, connect to the appropriate Camera signals; for example, define the onVideoCaptureFailed() signal handler. In C, define a status callback or use event notifications. Errors that can interrupt video recording include running out of disk space, resource arbitration, the screen turning off, and so on. See Handling errors for more information.
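If you're using the C API, one way to monitor status changes is to pass a status callback to camera_start_video() instead of NULL. The sketch below illustrates this approach; the callback name, the stop_requested flag, and the CAMERA_STATUS_RESOURCENOTAVAIL check are illustrative assumptions, so verify the exact camera_devstatus_t values against camera/camera_api.h for your target OS version.

#include <stdio.h>
#include <camera/camera_api.h>

/* Set when the status callback reports that recording should stop. */
static volatile int stop_requested = 0;

/* Status callback: invoked by the Camera library while video capture is
   running. The status and extra values describe the event; the full list
   of camera_devstatus_t values is in camera/camera_api.h. */
static void video_status_callback(camera_handle_t handle,
                                  camera_devstatus_t status,
                                  uint16_t extra,
                                  void* arg)
{
    fprintf(stderr, "video status: %d (extra: %u)\n", (int)status, (unsigned)extra);

    /* For example, request a clean stop if the camera resources are about
       to be revoked (resource arbitration). */
    if (status == CAMERA_STATUS_RESOURCENOTAVAIL) {
        stop_requested = 1;
    }
}

/* In the touch-event handler, pass the callback as the status_callback
   argument when starting capture (the sample later in this section
   passes NULL instead): */
err = camera_start_video(handle, filename,
                         NULL,                   /* video_callback */
                         video_status_callback,  /* status_callback */
                         (void*)filename);

The thread that owns the camera can then check the flag and stop recording and close the video file, as shown in the samples later in this section.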
The following code sample defines two action items, one to start video recording and one to stop it. When the "Start" action is triggered, the startVideoRecording() function plays the shutter sound for recording start and starts recording. When the "Stop" action is triggered, the stopVideoRecording() function stops the recording and plays the shutter sound for recording stop.
Page {
    // ...
    content: Container {
        attachedObjects: [
            // ...
            SystemSound {
                id: videoStartSnd
                sound: SystemSound.RecordingStartEvent
            },
            SystemSound {
                id: videoStopSnd
                sound: SystemSound.RecordingStopEvent
            }
        ]
        // ...
    }
    actions: [
        ActionItem {
            id: aiRecordStart
            title: "Start"
            accessibility.name: "Start"
            ActionBar.placement: ActionBarPlacement.OnBar
            imageSource: "asset:///ic_rec_red.png"
            onTriggered: {
                startVideoRecording()
            }
        },
        ActionItem {
            id: aiRecordStop
            title: "Stop"
            accessibility.name: "Stop"
            ActionBar.placement: ActionBarPlacement.OnBar
            imageSource: "asset:///ic_stop.png"
            onTriggered: {
                stopVideoRecording()
            }
        }
        // ...
    ]

    // Start recording
    function startVideoRecording() {
        videoStartSnd.play()
        qmlCameraObj.startVideoCapture()
    }

    // Stop recording
    function stopVideoRecording() {
        qmlCameraObj.stopVideoCapture()
        videoStopSnd.play()
    }
}
The following code sample uses two action items, m_pRecordStartButton and m_pRecordStopButton, to start and stop video capture. You can define these action items in your QML file, as shown in the preceding QML sample. When an action item is triggered, the connected slot is invoked to start or stop video recording.
// In your app's header file
// Declare shutter sounds for starting and stopping recording
SystemSound* m_pStartRecordingSound;
SystemSound* m_pStopRecordingSound;

// Declare action items for starting and stopping recording
ActionItem* m_pRecordStartButton;
ActionItem* m_pRecordStopButton;

// In your app's .cpp file
m_pStartRecordingSound = new SystemSound(SystemSound::RecordingStartEvent);
m_pStopRecordingSound = new SystemSound(SystemSound::RecordingStopEvent);

m_pRecordStartButton = root->findChild<bb::cascades::ActionItem*>("aiRecordStart");
m_pRecordStopButton = root->findChild<bb::cascades::ActionItem*>("aiRecordStop");

// In the ApplicationUI constructor,
// connect the slots to the triggered() signal
// ...
result = QObject::connect(m_pRecordStartButton, SIGNAL(triggered()),
                          this, SLOT(onRecordStartTriggered()));
Q_ASSERT(result);

result = QObject::connect(m_pRecordStopButton, SIGNAL(triggered()),
                          this, SLOT(onRecordStopTriggered()));
Q_ASSERT(result);

// Define the slot that starts video capture
void ApplicationUI::onRecordStartTriggered()
{
    m_pStartRecordingSound->play();
    m_pCamera->startVideoCapture();
}

// Define the slot that stops video capture
void ApplicationUI::onRecordStopTriggered()
{
    m_pCamera->stopVideoCapture();
    m_pStopRecordingSound->play();
}
The following code sample assumes that a screen touch event (SCREEN_EVENT_MTOUCH_TOUCH) triggers video recording; you aren't required to use touch events to trigger a camera operation. The sample creates a video file on the camera roll and opens it for writing before starting video capture. For a code sample that handles screen events, see Handle screen events.
/* Receive the SCREEN_EVENT_MTOUCH_TOUCH event */
/* ... */

int video_fd;
char filename[CAMERA_ROLL_NAMELEN];

/* Open a video file on the camera roll */
err = camera_roll_open_video(handle,
                             &video_fd,
                             filename,
                             sizeof(filename),
                             CAMERA_ROLL_VIDEO_FMT_DEFAULT);
if (err != CAMERA_EOK) {
    /* Handle error */
}

/* Play the shutter sound for recording start */
soundplayer_play_sound("event_recording_start");

err = camera_start_video(handle, filename, NULL, NULL, (void*)filename);
if (err != CAMERA_EOK) {
    fprintf(stderr, "camera_start_video() failed: %d", err);

    /* Delete the video file */
    close(video_fd);
    unlink(filename);

    /* Play the shutter sound for recording stop */
    soundplayer_play_sound("event_recording_stop");
}
When a screen touch event arrives during video capture, stop recording by calling camera_stop_video(). You must also close the video file after stopping video capture.
/* Receive the SCREEN_EVENT_MTOUCH_TOUCH event */
/* ... */

/* Stop recording */
err = camera_stop_video(handle);
if (err != CAMERA_EOK) {
    /* Handle error */
}

close(video_fd);

soundplayer_play_sound("event_recording_stop");
Last modified: 2015-05-07