Sample-webrtc-android


Sources

Project homepage on GitHub - https://github.com/QuickBlox/quickblox-android-sdk/tree/master/sample-videochat-webrtc

Download ZIP - https://github.com/QuickBlox/quickblox-android-sdk/archive/master.zip


Overview

The WebRTC VideoChat code sample allows you to easily add video calling features to your Android app. Use this code sample as a basis to enable a video calling function similar to Skype.

It is built on top of WebRTC technology.






System requirements

  • The QuickBlox Android video chat WebRTC SDK supports:
    • armeabi, armeabi-v7a, arm64-v8a, and x86 architectures.
    • Android 4.3+ (Jelly Bean MR2, API Level 18). The SDK is expected to work on Android 4.1+ (Jelly Bean, API Level 16), but such devices may have problems with video quality.
    • devices from the Samsung, Google, Motorola Moto, and LG Optimus families, as well as other official Android devices such as the Nexus family.
    • Wi-Fi and 4G LTE networks.

Prepare your application for Android SDK

Preparation includes the following steps:

  • Create QuickBlox account
  • Register an application in Dashboard
  • Integrate QuickBlox SDK into application

Get QuickBlox account

To create your personal account, refer to the registration page: http://admin.quickblox.com/register

Create application in Admin panel

The steps for creating an application in the admin panel are described on the http://admin.quickblox.com/apps/new page.

You can also look through the 5 min guide.

Integrate QuickBlox sdk in your application

To use video chat based on WebRTC technology in your app, you must add dependencies on the following three modules:

  • core
  • chat
  • videochat-webrtc

For information about the existing QuickBlox SDK modules and how to connect them, please refer to the Add SDK to IDE and connect to the cloud page.

To embed video chat in your app, include the relevant video chat dependencies in your project's build.gradle file:

dependencies {
    compile "com.quickblox:quickblox-android-sdk-core:2.5.1@aar"
    compile("com.quickblox:quickblox-android-sdk-chat:2.5.1@aar") {
        transitive=true
    }
    compile "com.quickblox:quickblox-android-sdk-videochat-webrtc:2.5.1@aar"
}

Or, starting from SDK 2.6.1, just add:

dependencies {
    compile "com.quickblox:quickblox-android-sdk-videochat-webrtc:2.6.1"
}

Add the native libraries (libjingle_peerconnection_so.so files). Put the native library for each platform (arm64-v8a, armeabi-v7a, x86, x86_64) under the app/src/main/jniLibs folder. You can find the native files in the sample under its /src/main/jniLibs folder.



The video chat module requires camera, microphone, internet, and storage permissions. Make sure you add the relevant permissions to your app manifest:

  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
  <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />

Detailed information about app permissions is available here: Working with System Permissions

Pay attention: beginning with Android 6.0 (API level 23), users grant permissions to apps while the app is running, not when they install the app.

You can grant permissions to your app via the device system settings or request them at runtime from your code, as shown in the sketch below.

  • To manually grant permissions, open the "Settings" menu and tap "Apps". Choose your app and tap "Permissions". Enable the necessary permissions: camera, microphone, internet, and storage.
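
A minimal sketch of requesting the dangerous permissions (camera, microphone, storage) at runtime is shown below. The request code and the enclosing Activity are hypothetical, the support library's ActivityCompat is assumed to be available, and only the permission names come from the manifest above:

// Sketch only: request the runtime (dangerous) permissions on Android 6.0+.
// PERMISSIONS_REQUEST_CODE and the enclosing Activity are hypothetical.
private static final int PERMISSIONS_REQUEST_CODE = 100;
 
private void requestCallPermissions() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
        ActivityCompat.requestPermissions(this, new String[]{
                Manifest.permission.CAMERA,
                Manifest.permission.RECORD_AUDIO,
                Manifest.permission.WRITE_EXTERNAL_STORAGE
        }, PERMISSIONS_REQUEST_CODE);
    }
}
 
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == PERMISSIONS_REQUEST_CODE) {
        // check grantResults before starting or accepting a call
    }
}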

Integrate video calls to your application

Transition guide from 2.3 to 2.4 version

  • QBRTCClient:
    • getInstance() renamed to getInstance(Context) to create and initialize QBRTCClient.
    • prepareToProcessCalls(Context) deprecated and replaced with prepareToProcessCalls()
    • methods init(Context), isInitiated() are deprecated as unnecessary
    • methods addVideoTrackCallbacksListener(QBRTCClientVideoTracksCallbacks), removeVideoTrackCallbacksListener(QBRTCClientVideoTracksCallbacks), addConnectionCallbacksListener(QBRTCSessionConnectionCallbacks), removeConnectionCallbacksListener(QBRTCSessionConnectionCallbacks) are deprecated and moved to QBRTCSession (see the sketch after this list)
    • method close() replaced by destroy().
  • QBRTCSession:
    • added methods addVideoTrackCallbacksListener(QBRTCClientVideoTracksCallbacks), removeVideoTrackCallbacksListener(QBRTCClientVideoTracksCallbacks), addSessionCallbacksListener(QBRTCSessionConnectionCallbacks), removeSessionnCallbacksListener(QBRTCSessionConnectionCallbacks), addSignalingCallback(QBRTCSignalingCallback), removeSignalingCallback(QBRTCSignalingCallback)
    • added method getMediaStreamManager() to get QBMediaStreamManager
    • added method getPeerChannel(Integer) to get QBPeerChannel
    • methods setAudioEnabled(boolean), setVideoEnabled(boolean), getAudioEnability(), getVideoEnability() deprecated and moved to QBMediaStreamManager
  • Interfaces:
    • QBRTCClientConnectionCallbacks renamed to QBRTCSessionConnectionCallbacks
    • QBRTCClientSessionCallbacks - added methods: void onCallAcceptByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo); and void onUserNoActions(QBRTCSession session, Integer userID);
    • added QBRTCSignalingCallback to indicate whether signaling packet was sent
  • Classes:
    • added QBRTCMediaConfig to set audio/video settings/quality for session
    • added QBMediaStreamManager - to manage audio/video tracks
    • added QBPeerChannel to get information about peer connection
    • added RTCGLVideoView for drawing local & remote video frames and deprecated QBGLVideoView as unsupported
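
For illustration, here is a minimal before/after sketch of the listener registration changes listed above (videoTracksCallbacks and connectionCallbacks are hypothetical listener instances, and session is an assumed active QBRTCSession):

// 2.3 and earlier (deprecated): listeners were registered on the client
// QBRTCClient.getInstance().addVideoTrackCallbacksListener(videoTracksCallbacks);
// QBRTCClient.getInstance().addConnectionCallbacksListener(connectionCallbacks);
 
// 2.4 and later: register them on the session instead
session.addVideoTrackCallbacksListener(videoTracksCallbacks);
session.addSessionCallbacksListener(connectionCallbacks);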


Initialize framework with application credentials

To quickly apply your application credentials, use the code below:

static final String APP_ID = "961";
static final String AUTH_KEY = "PBZxXW3WgGZtFZv";
static final String AUTH_SECRET = "vvHjRbVFF6mmeyJ";
static final String ACCOUNT_KEY = "961";
//
QBSettings.getInstance().init(getApplicationContext(), APP_ID, AUTH_KEY, AUTH_SECRET);
QBSettings.getInstance().setAccountKey(ACCOUNT_KEY);


Create session, sign in user and set QBChatService up

In order to use QuickBlox Chat APIs you must:

  • Create a session with user
  • Log in chat service


Please follow the code below:

String login = "login";
String password = "password"; 
 
final QBUser user = new QBUser(login, password);
 
// CREATE SESSION WITH USER
// If you create the session with user data,
// then the user will be logged in automatically
QBAuth.createSession(login, password, new QBEntityCallback<QBSession>() {
   @Override
   public void onSuccess(QBSession session, Bundle bundle) {
 
      user.setId(session.getUserId());                
 
      // INIT CHAT SERVICE
      chatService = QBChatService.getInstance();
 
      // LOG IN CHAT SERVICE
      chatService.login(user, new QBEntityCallback<QBUser>() {
 
          @Override
          public void onSuccess(QBUser result, Bundle params) {
            // success
         }
 
         @Override
         public void onError(QBResponseException errors) {
            //error
         }
      });
   }
 
   @Override
   public void onError(QBResponseException errors) {
      //error
   }
});


Set QBRTCClient instance up

To use QuickBlox WebRTC video calls, follow these steps:

  • Add signalling manager
  • Prepare your activity class to audio/video calls
  • Set video view for remote video track
  • Set video view for local video track
  • Notify RTCClient that you are ready to receive calls


Add signalling manager

To be able to receive incoming WebRTC calls, you need to add WebRTC signaling to QBRTCClient:

QBChatService.getInstance().getVideoChatWebRTCSignalingManager()
        .addSignalingManagerListener(new QBVideoChatSignalingManagerListener() {
            @Override
            public void signalingCreated(QBSignaling qbSignaling, boolean createdLocally) {
                if (!createdLocally) {
                    QBRTCClient.getInstance(getApplicationContext()).addSignaling((QBWebRTCSignaling) qbSignaling);
                }
            }
        });

Prepare your activity class to audio/video calls

To receive callbacks about the current QBRTCSession instance state, about video tracks (local and remote), and about the session's peer connection states, you must implement the appropriate interfaces and register them by calling the following methods on the QBRTCSession instance:

public void addSessionCallbacksListener(QBRTCSessionConnectionCallbacks callback) 
public void addVideoTrackCallbacksListener(QBRTCClientVideoTracksCallbacks callback)

and the following method on the QBRTCClient instance:

public void addSessionCallbacksListener(QBRTCClientSessionCallbacks callback)
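
For example, a minimal sketch of an activity wiring these listeners together (the class name and fields are hypothetical; only the registration calls and interfaces come from the SDK):

public class CallActivity extends Activity implements QBRTCClientSessionCallbacks,
        QBRTCSessionConnectionCallbacks, QBRTCClientVideoTracksCallbacks {
 
    private QBRTCSession currentSession;
 
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // receive session-level events (new session, accept, reject, hang up, close)
        QBRTCClient.getInstance(this).addSessionCallbacksListener(this);
    }
 
    @Override
    public void onReceiveNewSession(QBRTCSession session) {
        currentSession = session;
        // receive per-session connection state and video track events
        currentSession.addSessionCallbacksListener(this);
        currentSession.addVideoTrackCallbacksListener(this);
    }
 
    // ... the remaining interface methods are omitted for brevity
}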

Setup views

Starting from SDK 2.6.1, RTCGLVideoView is deprecated. Use QBRTCSurfaceView to render frames.

Add views for the remote and local video tracks to your layout (the example below uses RTCGLVideoView; with QBRTCSurfaceView the declaration differs only in the class name):

<com.quickblox.videochat.webrtc.view.RTCGLVideoView
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:custom="http://schemas.android.com/apk/res-auto"
    android:id="@+id/localView"
    android:layout_width="100dp"
    android:layout_height="100dp" />
<com.quickblox.videochat.webrtc.view.RTCGLVideoView
    android:id="@+id/opponentView"
    android:layout_width="100dp"
    android:layout_height="100dp" />
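
In code, the views declared above can then be bound as usual. A sketch assuming the IDs from the layout; on SDK 2.6.1+ declare and cast QBRTCSurfaceView instead of the deprecated RTCGLVideoView:

// Sketch: bind the video views from the layout above, e.g. in onCreate()
RTCGLVideoView localVideoView = (RTCGLVideoView) findViewById(R.id.localView);
RTCGLVideoView opponentVideoView = (RTCGLVideoView) findViewById(R.id.opponentView);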

QBRTCSurfaceView

QBRTCSurfaceView was introduced in SDK 2.6.1.

QBRTCSurfaceView is a SurfaceView that renders a video track.
It has a rendering lifecycle: the init() method prepares it for rendering, and release() frees its resources when the video track no longer exists.
QBRTCSurfaceView is initialized automatically after the surface is created, in the surfaceCreated() callback.
You can also initialize QBRTCSurfaceView manually using the EGL context obtained from QBRTCClient. Do this only while the Activity is alive and the GL resources still exist.

QBRTCSurfaceView surfaceView = ...;
EglBase eglContext = QBRTCClient.getInstance(getContext()).getEglContext();
surfaceView.init(eglContext.getEglBaseContext(), null);


The release() method should be called when the video track is no longer valid, for example when you receive the onConnectionClosedForUser() callback from QBRTCSession or when the QBRTCSession is about to close. Call release() before the Activity is destroyed, while the EGLContext is still valid. If you don't call this method, the GL resources might leak.
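
For example, a sketch of releasing an opponent's view from that callback (remoteVideoView is a hypothetical QBRTCSurfaceView field bound to the opponent's view):

@Override
public void onConnectionClosedForUser(QBRTCSession session, Integer userID) {
    // release the GL resources for this opponent's view while the EGL context is still valid
    if (remoteVideoView != null) {
        remoteVideoView.release();
    }
}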

QBRTCSurfaceView allows you to place several views in a screen layout and overlap them.

Here is the QBRTCSurfaceView interface:

QBRTCSurfaceView.init(EglBase.Context, RendererCommon.RendererEvents); // Initialize this view using the WebRTC EGL context. It is allowed to call init() to reinitialize the view after a previous init()/release() cycle.
QBRTCSurfaceView.release(); // releases all related GL resources 
QBRTCSurfaceView.setScalingType(scalingType); //Set how the video will fill the allowed layout area
QBRTCSurfaceView.setMirror(mirror); //Set if the video stream should be mirrored or not.
QBRTCSurfaceView.requestLayout(); // Request to invalidate view when something has changed

To render the video track received from an opponent, use:

private void fillVideoView(int userId, QBRTCSurfaceView videoView, QBRTCVideoTrack videoTrack) {
    videoTrack.addRenderer(new VideoRenderer(videoView));
}

Notify RTCClient that you are ready to receive calls

As soon as your app is ready to process calls and the activity exists, use the code below in your activity class:

QBRTCClient.getInstance(this).prepareToProcessCalls();

Pay attention: if you forget to add the signaling manager, you will not be able to process calls.


Track session states via QBRTCClientSessionCallbacks interface

To manage all session states, you must implement the QBRTCClientSessionCallbacks interface.

Once you have called QBRTCClient.getInstance(this).prepareToProcessCalls() and added an instance of a class that implements QBRTCClientSessionCallbacks to QBRTCClient via QBRTCClient.getInstance(this).addSessionCallbacksListener(QBRTCClientSessionCallbacks listener), you will start receiving session callbacks.

/**
 * Called each time when new session request is received.
 */
void onReceiveNewSession(QBRTCSession session);
 
/**
 * Called in case when user didn't answer in timer expiration period
 */
void onUserNotAnswer(QBRTCSession session, Integer userID);
 
/**
 * Called in case when opponent has rejected your call
 */
void onCallRejectByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo);
 
/**
 * Called in case when opponent has accepted your call
 */
void onCallAcceptByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo);
 
/**
 * Called in case when opponent hung up
 */
void onReceiveHangUpFromUser(QBRTCSession session, Integer userID);
 
/**
 * Called in case when user didn't take any action on the received session
 */
void onUserNoActions(QBRTCSession session, Integer userID);
 
/**
 * Called in case when session will close
 */
void onSessionStartClose(QBRTCSession session);
 
/**
 * Called when session is closed.
 */
void onSessionClosed(QBRTCSession session);

To listen for the callbacks use the following methods:

QBRTCClient.getInstance(this).addSessionCallbacksListener(this);
QBRTCClient.getInstance(this).removeSessionCallbacksListener(this);


To track only the main session events you can use QBRTCSessionEventsCallback.

/**
 * Called in case when user didn't answer in timer expiration period
 */
void onUserNotAnswer(QBRTCSession session, Integer userID);
 
/**
 * Called in case when opponent has rejected your call
 */
void onCallRejectByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo);
 
/**
 * Called in case when opponent has accepted your call
 */
void onCallAcceptByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo);
 
/**
 * Called in case when opponent hung up
 */
void onReceiveHangUpFromUser(QBRTCSession session, Integer userID);
 
 
/**
 * Called when session is closed.
 */
void onSessionClosed(QBRTCSession session);


To subscribe for events use the same methods:

QBRTCClient.getInstance(this).addSessionCallbacksListener(this);
QBRTCClient.getInstance(this).removeSessionCallbacksListener(this);

Track connection state

To manage the connection with a user, you should implement the QBRTCSessionStateCallback interface.

/**
 * Called in case when connection with the opponent is established
 */
void onConnectedToUser(QBRTCSession session, Integer userID);
 
/**
 * Called in case when connection is closed
 */
void onConnectionClosedForUser(QBRTCSession session, Integer userID);
 
/**
 * Called in case when the opponent is disconnected
 */
void onDisconnectedFromUser(QBRTCSession session, Integer userID);


To track extended connection states, use QBRTCSessionConnectionCallbacks, which has additional events:

/**
 * Called in case when connection establishment process is started
 */
void onStartConnectToUser(QBRTCSession session, Integer userID);
 
/**
 * Called in case when the opponent is disconnected by timeout
 */
void onDisconnectedTimeoutFromUser(QBRTCSession session, Integer userID);
 
/**
 * Called in case when connection has failed with the opponent
 */
void onConnectionFailedWithUser(QBRTCSession session, Integer userID);
 
/**
 * Called in case some errors occurred during the connection establishment process
 */
void onError(QBRTCSession session, QBRTCException exception);

To listen for the callbacks use the following methods:

rtcSession.addSessionCallbacksListener(this);
rtcSession.removeSessionnCallbacksListener(this);


Obtain video tracks via QBRTCClientVideoTracksCallbacks interface

To manage video tracks, you must implement the QBRTCClientVideoTracksCallbacks interface.

/**
 * Called when local video track was received
 */
void onLocalVideoTrackReceive(QBRTCSession session, QBRTCVideoTrack localVideoTrack);
 
/**
 * Called when remote video track was received
 */
void onRemoteVideoTrackReceive(QBRTCSession session, QBRTCVideoTrack remoteVideoTrack, Integer userID);

To listen for the callbacks use the following methods:

rtcSession.addVideoTrackCallbacksListener(this);
rtcSession.removeVideoTrackCallbacksListener(this);


Obtain audio tracks

To manage audio tracks, you must implement the QBRTCClientAudioTracksCallback interface.

/**
 * Called when local audio track was received
 */
 void onLocalAudioTrackReceive(QBRTCSession session, QBRTCAudioTrack audioTrack);
 
/**
 * Called when remote audio track was received
 */
void onRemoteAudioTrackReceive(QBRTCSession session, QBRTCAudioTrack audioTrack, Integer userID);

To listen for the callbacks use the following methods:

rtcSession.addAudioTrackCallbacksListener(this);
rtcSession.removeAudioTrackCallbacksListener(this);


Render video stream to view

To set the view for a video track, you can use the following helper methods:

private void fillVideoView(int userId, QBRTCSurfaceView videoView, QBRTCVideoTrack videoTrack,
                               boolean remoteRenderer) {
        videoTrack.addRenderer(new VideoRenderer(videoView));
        updateVideoView(videoView, !remoteRenderer, RendererCommon.ScalingType.SCALE_ASPECT_FILL);
}
 
private void updateVideoView(QBRTCSurfaceView surfaceView, boolean mirror, RendererCommon.ScalingType scalingType) {
        surfaceView.setScalingType(scalingType);
        surfaceView.setMirror(mirror);
        surfaceView.requestLayout();
    }


Refer to https://quickblox.com/developers/Sample-webrtc-android#QBRTCSurfaceView for QBRTCSurfaceView interface.
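
For example, these helpers can be called from the video track callbacks described above (localVideoView, opponentVideoView, and currentUserId are hypothetical fields):

@Override
public void onLocalVideoTrackReceive(QBRTCSession session, QBRTCVideoTrack localVideoTrack) {
    // the local preview is usually mirrored
    fillVideoView(currentUserId, localVideoView, localVideoTrack, false);
}
 
@Override
public void onRemoteVideoTrackReceive(QBRTCSession session, QBRTCVideoTrack remoteVideoTrack, Integer userID) {
    fillVideoView(userID, opponentVideoView, remoteVideoTrack, true);
}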

Tracking signalling messages' state

You can implement the QBRTCSignalingCallback interface to track the states of signaling messages, for example:

QBRTCSignalingCallback signalingCallback = new QBRTCSignalingCallback(){
    @Override
    public void onSuccessSendingPacket(QBSignalingSpec.QBSignalCMD packetType, Integer opponentId){
 
    }
 
    @Override
    public void onErrorSendingPacket(QBSignalingSpec.QBSignalCMD packetType, Integer opponentId,
                              QBRTCSignalException e){
    }
};
 
currentSession.addSignalingCallback(signalingCallback);
currentSession.removeSignalingCallback(signalingCallback);


Session management

Start Call

To call users, you should create a session and start the call:

//Set conference type 
//There are two types of calls:
// - QB_CONFERENCE_TYPE_VIDEO - for video call;
// - QB_CONFERENCE_TYPE_AUDIO - for audio call;
QBRTCTypes.QBConferenceType qbConferenceType = QB_CONFERENCE_TYPE_VIDEO;
 
 
//Initiate opponents list
List<Integer> opponents = new ArrayList<Integer>();
opponents.add(12345); //12345 - QBUser ID
 
//Set user information 
// User can set any string key and value in user info
// Then retrieve this data from the session returned in callbacks
// and parse it as needed
Map<String, String> userInfo = new HashMap<>();
userInfo.put("key", "value");
 
//Init session
QBRTCSession session = 
      QBRTCClient.getInstance(this).createNewSessionWithOpponents(opponents, qbConferenceType);
 
//Start call
session.startCall(userInfo);

Accept call

You should process the session received in the QBRTCClientSessionCallbacks.onReceiveNewSession(QBRTCSession) callback. There are a few ways to process it:

  • accept incoming call;
  • reject incoming call.

Both ways are shown below.


To accept the call request, just use this method:

public void onReceiveNewSession(QBRTCSession session){
 
   // obtain the received user info
   Map<String, String> receivedUserInfo = session.getUserInfo();
 
   // ..... 
   // ..... your code 
   // .....
 
 
   // Set userInfo
   // User can set any string key and value in user info
   Map<String,String> userInfo = new HashMap<String,String>();
   userInfo.put("Key", "Value");   
 
   // Accept incoming call
   session.acceptCall(userInfo);
}

After accepting the call, your opponent will receive an accept signal in the appropriate callback method:

public void onCallAcceptByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo){
   // ..... 
   // ..... your code 
   // .....
 
}

Reject call

To reject the call request, just use this method:

public void onReceiveNewSession(QBRTCSession session){
   // obtain the received user info
   Map<String, String> receivedUserInfo = session.getUserInfo();
 
   // ..... 
   // ..... your code 
   // .....
 
   // Set userInfo
   // User can set any string key and value in user info
   Map<String,String> userInfo = new HashMap<String,String>();
   userInfo.put("Key", "Value");   
 
   // Rejecting call
   session.rejectCall(userInfo);
}


After rejecting the call, your opponent will receive a reject signal in the appropriate callback method:

public void onCallRejectByUser(QBRTCSession session, Integer userID, Map<String, String> userInfo){
   // ..... 
   // ..... your code 
   // .....
 
}

Hang up

To hang up a call:

// Set userInfo
// User can set any string key and value in user info
Map<String,String> userInfo = new HashMap<String,String>();
userInfo.put("Key", "Value");   
 
session.hangUp(userInfo);


After this, your opponent will receive a hangUp signal:

public void onReceiveHangUpFromUser(QBRTCSession session, Integer userID){
 
}


Stop receiving calls and release resource

If you don't want to receive and process sessions anymore, just call:

QBRTCClient.getInstance(this).destroy();

This method unregisters QBRTCClient from receiving any video chat events, clears the session callbacks, and closes existing signaling channels.
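
For example, a sketch of releasing the client from a hypothetical call activity when it is destroyed (you may also do this on logout instead):

@Override
protected void onDestroy() {
    super.onDestroy();
    // stop receiving call events once the calling UI is gone
    QBRTCClient.getInstance(getApplicationContext()).destroy();
}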

Session Media management

Manage streams

To manage audio and video streams, QBRTCSession provides the QBMediaStreamManager class.
QBMediaStreamManager holds the user's local audio and video tracks and provides a way to change the current video capturer.
Pay attention: QBMediaStreamManager is tied to the QBRTCSession lifecycle, so you should use it only while the QBRTCSession is active (has been started).

Disable / enable audio stream

You can disable / enable and check state of the audio stream during a call:

QBMediaStreamManager mediaStreamManager = currentSession.getMediaStreamManager();
 
mediaStreamManager.setAudioEnabled(false); // disable audio stream
 
mediaStreamManager.setAudioEnabled(true);  // enable audio stream 
 
mediaStreamManager.isAudioEnabled();    // returns true if audio track enabled


Starting from SDK 3.3.0, you can manage tracks directly:

QBRTCAudioTrack localAudioTrack = currentSession.getMediaStreamManager().getLocalAudioTrack();
localAudioTrack.setEnabled(isAudioEnabled); // enable or disable audio stream 
 
localAudioTrack.enabled();//checks whether track is enabled

Disable / enable video stream

You can disable / enable and check state of the video stream during a call:

QBMediaStreamManager mediaStreamManager = currentSession.getMediaStreamManager();
 
mediaStreamManager.setVideoEnabled(false); // disable video stream
 
mediaStreamManager.setVideoEnabled(true);  // enable video stream
 
mediaStreamManager.isVideoEnabled();    // returns true if video track enabled


Starting from SDK 3.3.0, you can manage tracks directly:

QBRTCVideoTrack localVideoTrack = currentSession.getMediaStreamManager().getLocalVideoTrack();
localVideoTrack.setEnabled(isVideoEnabled); // enable or disable video stream
 
localVideoTrack.enabled();//checks whether track is enabled

Set video capturer

Starting from SDK 3.3.0, you can set the current video source to be captured.
There are 2 possible ways:

  1. capture frames from the camera - use QBRTCCameraVideoCapturer
  2. capture the device screen - use QBRTCScreenCapturer


To define the current capturer, use the QBMediaStreamManager API:

currentSession.getMediaStreamManager().setVideoCapturer(new QBRTCCameraVideoCapturer(this, null));

By default, the camera capturer is used when the session is started.

To define the default capturer strategy that will be used when the session starts, you can use QBRTCMediaCapturerCallback on QBRTCSession:

currentSession.setMediaCapturerCallback(new QBRTCMediaCapturerCallback() {
    @Override
    public void onInitLocalMediaStream(QBRTCMediaStream qbrtcMediaStream) {
        currentSession.getMediaStreamManager().setVideoCapturer(new QBRTCScreenCapturer(data, null));
    }
});


QBRTCMediaCapturerCallback will be invoked when the QBRTCSession is instantiated and establishes a connection, after startCall or acceptCall is called.


To manage the received local and remote video tracks, just define QBRTCClientVideoTracksCallbacks: https://quickblox.com/developers/Sample-webrtc-android#Obtain_video_tracks_via_QBRTCClientVideoTracksCallbacks_interface

Switch camera

You can switch the video camera position during a call (Default: front camera):

QBMediaStreamManager mediaStreamManager = currentSession.getMediaStreamManager();
 
boolean done = mediaStreamManager.switchCameraInput(new Runnable() {
    @Override
    public void run() {
       // switch done
    }
});
 
int currentCameraId = mediaStreamManager.getCurrentCameraId();


In SDK 3.3.0, QBRTCCameraVideoCapturer was introduced to manage camera settings. To switch the camera, just use:

QBRTCCameraVideoCapturer videoCapturer = (QBRTCCameraVideoCapturer) (currentSession.getMediaStreamManager().getVideoCapturer());
 
videoCapturer.switchCamera(cameraSwitchHandler);


Change capture format

Starting from SDK 2.6.1, you can change the capture format at any time while the session is active.

QBMediaStreamManager mediaStreamManager = currentSession.getMediaStreamManager();
 
mediaStreamManager.changeCaptureFormat(width, height, framerate);


In SDK 3.3.0, QBRTCCameraVideoCapturer was introduced to manage camera settings. To change the capture format, just use:

QBRTCCameraVideoCapturer videoCapturer = (QBRTCCameraVideoCapturer) (currentSession.getMediaStreamManager().getVideoCapturer());
 
videoCapturer.changeCaptureFormat(width, height, framerate);


Screen sharing

Starting from SDK version 3.3.0, the QuickBlox WebRTC SDK includes a new feature: screen sharing.



Screen sharing allows you to share your device screen with all of your opponents.

To integrate this feature, just use the QBRTCScreenCapturer class and follow these steps:

  1. Request projection permission from the user:
     if (Build.VERSION.SDK_INT > Build.VERSION_CODES.LOLLIPOP) {
         QBRTCScreenCapturer.requestPermissions(CallActivity.this); // Request permission to share the device screen
     }

  2. Listen to granted permission inside Activity:
    public class CallActivity extends BaseActivity {
        ...
     
        @Override
        protected void onActivityResult(int requestCode, int resultCode, Intent data) {
            if (requestCode == QBRTCScreenCapturer.REQUEST_MEDIA_PROJECTION) {
                if (resultCode == Activity.RESULT_OK) {
                    startScreenSharing(data);
                }
            }
        }
    }

  3. Set QBRTCScreenCapturer as the current video capturer to start screen sharing:
     private void startScreenSharing(Intent data) { // pass data from the permission request
         QBRTCSession currentSession = ..;
         currentSession.getMediaStreamManager().setVideoCapturer(new QBRTCScreenCapturer(data, null));
     }


    Opponents will receive your video track in the same way as camera frames, via the onRemoteVideoTrackReceive callback method: http://quickblox.com/developers/Sample-webrtc-android#Obtain_video_tracks_via_QBRTCClientVideoTracksCallbacks_interface


Here is the QBRTCScreenCapturer interface:

QBRTCScreenCapturer.requestPermissions - requests permission to share device screen
QBRTCScreenCapturer(Intent, MediaProjection.Callback) - constructor
 
QBRTCScreenCapturer.changeCaptureFormat(width, height, framerate); - change capture format 
QBRTCScreenCapturer.startCapture(width, height, framerate) - start capture with defined format
QBRTCScreenCapturer.stopCapture - stop capturing

Manage Audio settings

Starting from release 2.5.2, QBRTCSession doesn't manage audio settings anymore. You can use AppRTCAudioManager to manage audio settings manually.

Here is the AppRTCAudioManager interface:

AppRTCAudioManager.create - constructor
AppRTCAudioManager.init - initialize audio settings to "communication" mode, sets default audio device
AppRTCAudioManager.setDefaultAudioDevice - sets audio device by default
AppRTCAudioManager.setAudioDevice - sets current audio device
AppRTCAudioManager.getAudioDevices - Returns current set of available/selectable audio devices.
AppRTCAudioManager.setManageHeadsetByDefault - defines whether AppRTCAudioManager handles the headset state. If "true", AppRTCAudioManager handles the headset state and the selected audio channel: when a headset is plugged in it becomes the current audio device, and when it is unplugged the audio device is reset to the default one. If "false", the headset state is handled by the default Android behaviour and AppRTCAudioManager only notifies about the headset's state.


AppRTCAudioManager.setOnWiredHeadsetStateListener - sets Listener to be invoked when headset plugged/unplugged
AppRTCAudioManager.setOnAudioManagerStateListener - sets Listener to be invoked when audio device changed
AppRTCAudioManager.close - close audio manager, audio settings will return to default.
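
A usage sketch based on the methods listed above; the exact signatures and the AudioDevice constants are assumptions and may differ between SDK versions:

// Sketch only: method names come from the list above, signatures/constants are assumed
AppRTCAudioManager audioManager = AppRTCAudioManager.create(this);
audioManager.setDefaultAudioDevice(AppRTCAudioManager.AudioDevice.SPEAKER_PHONE); // assumed constant
audioManager.setManageHeadsetByDefault(true); // let the manager react to headset plug/unplug
audioManager.init(); // switch to "communication" mode and select the default device
 
// ... during the call you can switch the output manually, e.g.:
// audioManager.setAudioDevice(AppRTCAudioManager.AudioDevice.EARPIECE);
 
audioManager.close(); // restore the default audio settings when the call ends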


Media configuration

You can use the methods of the QBRTCMediaConfig class to configure various media settings such as video/audio codecs, bitrate, FPS, etc.

You can find more examples of how to use it in the SettingsUtil.java class.

public static void setAudioCodec(AudioCodec audioCodec);
 
public static void setVideoCodec(VideoCodec videoCodec);
 
public static void setVideoWidth(int videoWidth);
 
public static void setVideoHeight(int videoHeight);
 
public static void setVideoFps(int videoFps);
 
public static void setVideoStartBitrate(int videoStartBitrate);
 
public static void setAudioStartBitrate(int audioStartBitrate);
 
public static void setVideoHWAcceleration(boolean videoHWAcceleration);
 
public static void setUseBuildInAEC(boolean useBuildInAEC); // Enable built-in AEC if device supports it
 
public static void setUseOpenSLES(boolean useOpenSLES); //Allow OpenSL ES audio if device supports it
 
public static void setAudioProcessingEnabled(boolean audioProcessingEnabled); //Enabling/Disabling audio processing - added for audio performance.
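
For example, a sketch of tuning video quality before a session is created (the values are illustrative, not recommendations):

// Sketch only: illustrative values, set these before creating the session
QBRTCMediaConfig.setVideoWidth(640);
QBRTCMediaConfig.setVideoHeight(480);
QBRTCMediaConfig.setVideoFps(30);
QBRTCMediaConfig.setVideoStartBitrate(500);
QBRTCMediaConfig.setVideoHWAcceleration(true);
QBRTCMediaConfig.setAudioProcessingEnabled(true);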

Configuration

You can use the methods of the QBRTCConfig class to set your application up according to your needs. All configuration fields and examples of how to use them are shown below:

/**
 * Set dialing time interval
 * Default value is 5 sec
 */
public static void setDialingTimeInterval(long dialingTimeInterval);
 
/**
 * Set answer time interval
 * Default value is 60 sec
 */
public static void setAnswerTimeInterval(long answerTimeInterval);
 
/**
 * Set max connections in a conference
 * Default value is 10
 */
public static void setMaxOpponentsCount(Integer maxOpponentsCount);
 
/**
 * Set max allowed time to repair a connection after it was lost.
 * Default value is 10 sec
 */
public static void setDisconnectTime(Integer disconnectTime);
 
/**
 * Set list of ice servers.
 * Default value is QuickBlox servers
 */
public static void setIceServerList(List<PeerConnection.IceServer> iceServerList);


An example of modifying QBRTCConfig fields is shown below:

// Set answer time interval
QBRTCConfig.setAnswerTimeInterval(160);
 
// Set dialing interval time
QBRTCConfig.setDialingTimeInterval(10);

Set custom ICE servers

You can customize the list of ICE servers.

By default, the turn.quickblox.com server in North Virginia is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or only some:

// Set custom ICE servers up. Use this in case you want to set YOUR OWN servers instead of the defaults
List<PeerConnection.IceServer> iceServerList = new LinkedList<>();
iceServerList.add(new PeerConnection.IceServer("turn:numb.default.com", "default@default.com", "default@default.com"));
iceServerList.add(new PeerConnection.IceServer("turn:numb.default.com:1234?transport=udp", "default@default.com", "petrbubnov@default.com"));
iceServerList.add(new PeerConnection.IceServer("turn:numb.default.com:1234?transport=tcp", "default@default.com", "default@default.com"));
QBRTCConfig.setIceServerList(iceServerList);

WebRTC Stats reporting

From v2.6 you are able to observe stats provided by WebRTC.

To start collecting report information do the following:

QBRTCConfig.setStatsReportInterval(60); // 60 seconds - update statistics interval

Then add a callback listener to the QBRTCSession that implements QBRTCStatsReportCallback; it will be called when WebRTC stats are fetched for a particular peer.

QBRTCSession currentSession = ...;
currentSession.addStatsReportCallback(new QBRTCStatsReportCallback() {
                @Override
                public void onStatsReportUpdate(QBRTCStatsReport statsReport, Integer userId) {
                    String audiobitrate = statsReport.getAudioReceivedBitrate();
                    String videobitrate = statsReport.getVideoReceivedBitrate();
                    String audioSendInputLevel = statsReport.getAudioSendInputLevel();
                }
            });

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.
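
A rough sketch of using that value to drive a simple "is speaking" indicator (the threshold is arbitrary and the numeric parsing of the reported string is an assumption):

@Override
public void onStatsReportUpdate(QBRTCStatsReport statsReport, Integer userId) {
    String inputLevel = statsReport.getAudioSendInputLevel();
    try {
        // arbitrary threshold; tune it for your microphone levels
        boolean isSpeaking = Double.parseDouble(inputLevel) > 0;
        // update a "speaking" indicator for userId here
    } catch (NumberFormatException ignored) {
        // the report may not contain a numeric value yet
    }
}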