Sample-webrtc-ios

Sources

Project homepage on GitHub — https://github.com/QuickBlox/quickblox-ios-sdk/tree/master/sample-videochat-webrtc

Download ZIP - https://github.com/QuickBlox/quickblox-ios-sdk/archive/master.zip

Overview

The VideoChat code sample allows you to easily add video calling and audio calling features into your iOS app. Enable a video call function similar to FaceTime or Skype using this code sample as a basis.

It is built on top of WebRTC technology.

Check out screen sharing, a new feature of the QuickbloxWebRTC SDK.



System requirements

  • The QuickbloxWebRTC.framework supports the following:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad 2+.
    • iPod Touch 5+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Started with Video Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a Ruby gem and is installed by running the following commands in Terminal.app:

$ sudo gem install cocoapods
$ pod setup

Step 2: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

$ touch Podfile
$ open -e Podfile

TextEdit should open, showing the empty Podfile you just created. Now add some content to it.

Copy and paste the following lines into the TextEdit window:

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '8.0'
pod 'Quickblox-WebRTC', '~> 2.4.1'
pod 'QuickBlox'

Step 3: Installing Dependencies

Now you can install the dependencies in your project:

$ pod install

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

$ open ProjectName.xcworkspace

Step 4: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:

#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

Add the Framework to your Xcode Project

Please note that the Quickblox iOS SDK is required for apps using QuickbloxWebRTC.

Step 1: Download & unzip the Framework

Install Quickblox iOS SDK

QuickbloxWebRTC.framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination's group folder" checkbox is checked.


Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework



Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary because it is a dynamic framework.


Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:

#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

bash "${BUILT_PRODUCTS_DIR}/${FRAMEWORKS_FOLDER_PATH}/QuickbloxWebRTC.framework/strip-framework.sh"

This works around a known Apple bug that prevents publishing archives to the App Store when they contain dynamic frameworks with simulator slices. The script only runs when archiving.

Life cycle

// Initialize QuickbloxWebRTC and configure signaling
// You should call this method before any interaction with QuickbloxWebRTC
[QBRTCClient initializeRTC];
 
// Call this method when you finish your work with QuickbloxWebRTC
[QBRTCClient deinitializeRTC];
// Initialize QuickbloxWebRTC and configure signaling
// You should call this method before any interaction with QuickbloxWebRTC
QBRTCClient.initializeRTC()
 
// Call this method when you finish your work with QuickbloxWebRTC
QBRTCClient.deinitializeRTC()
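
For example, a typical place to initialize is your application delegate (a minimal sketch; where you call deinitializeRTC depends on your app's structure):

// AppDelegate.m (sketch)
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {

    [QBRTCClient initializeRTC];
    return YES;
}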


Call users

To call users just use this method:

[[QBRTCClient instance] addDelegate:self]; // self class must conform to QBRTCClientDelegate protocol
 
// 3245, 2123, 3122 - your opponents' user IDs
NSArray *opponentsIDs = @[@3245, @2123, @3122];
QBRTCSession *newSession = [[QBRTCClient instance] createNewSessionWithOpponents:opponentsIDs
                                                              withConferenceType:QBRTCConferenceTypeVideo];
// userInfo - the custom user information dictionary for the call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[newSession startCall:userInfo];
QBRTCClient.instance().addDelegate(self) // self class must conform to QBRTCClientDelegate protocol
 
// 3245, 2123, 3122 - your opponents' user IDs
let opponentsIDs = [3245, 2123, 3122]
let newSession = QBRTCClient.instance().createNewSessionWithOpponents(opponentsIDs, withConferenceType: QBRTCConferenceType.Video)
// userInfo - the custom user information dictionary for the call. May be nil.
let userInfo :[String:String] = ["key":"value"]
newSession.startCall(userInfo)


After this your opponents (users with IDs 3245, 2123 and 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)didReceiveNewSession:(QBRTCSession *)session userInfo:(NSDictionary *)userInfo {
 
    if (self.session) {
        // we already have a video/audio call session, so we reject another one
        // userInfo - the custom user information dictionary for the call from caller. May be nil.
        NSDictionary *userInfo = @{ @"key" : @"value" };
        [session rejectCall:userInfo];
        return;
    }
    self.session = session;
}
// MARK: QBRTCClientDelegate
 
func didReceiveNewSession(session: QBRTCSession!, userInfo: [NSObject : AnyObject]!) {
 
    if self.session != nil {
        // we already have a video/audio call session, so we reject another one
        // userInfo - the custom user information dictionary for the call from caller. May be nil.
        let userInfo :[String:String] = ["key":"value"]
        session.rejectCall(userInfo)
    }
    else {
        self.session = session
    }
}

self.session refers to the current session. Each audio/video call has a unique sessionID, which allows you to run more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:

[QBRTCConfig setAnswerTimeInterval:60];
 QBRTCConfig.setAnswerTimeInterval(60)

Accept a call

To accept a call request just use this method:

// userInfo - the custom user information dictionary for the accept call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session acceptCall:userInfo];
// userInfo - the custom user information dictionary for the accept call. May be nil.
let userInfo :[String:String] = ["key":"value"]
self.session?.acceptCall(userInfo)

After this your opponent will receive an accept signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session acceptedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
 
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, acceptedByUser userID: NSNumber!, userInfo: [NSObject : AnyObject]!) {
 
}

Reject a call

To reject a call request just use this method:

// userInfo - the custom user information dictionary for the reject call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session rejectCall:userInfo];
 
// and release session instance
self.session = nil;
// userInfo - the custom user information dictionary for the reject call. May be nil.
let userInfo :[String:String] = ["key":"value"]
self.session?.rejectCall(userInfo)
 
// and release session instance
self.session = nil;

After this your opponent will receive a reject signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session rejectedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo  {
    NSLog(@"Rejected by user %@", userID);
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, rejectedByUser userID: NSNumber!, userInfo: [NSObject : AnyObject]!) {
    print("Rejected by user \(userID)")
}

Connection life-cycle

Called when a connection with a user is initiated:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
 
    NSLog(@"Started connecting to user %@", userID);
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, startedConnectingToUser userID: NSNumber!) {
    print("Started connecting to user \(userID)")
}

Called when a connection with a user is closed:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
 
    NSLog(@"Connection is closed for user %@", userID);
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, connectionClosedForUser userID: NSNumber!) {
    print("Connection is closed for user \(userID)")
}

Called when a connection with a user is established:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
 
    NSLog(@"Connection is established with user %@", userID);
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, connectedToUser userID: NSNumber!) {
    print("Connection is established with user \(userID)")
}

Called when a user is disconnected:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    NSLog(@"Disconnected from user %@", userID);
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, disconnectedFromUser userID: NSNumber!) {
    print("Disconnected from user \(userID)");
}

Called when a user does not respond to your call within the timeout.
Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval.

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond to your call within timeout", userID);
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, userDidNotRespond userID: NSNumber!) {
    print("User \(userID) did not respond to your call within timeout")
}

Called when the connection with a user fails.

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    NSLog(@"Connection has failed with user %@", userID);
}
// MARK: QBRTCClientDelegate
func session(session: QBRTCSession!, connectionFailedForUser userID: NSNumber!) {
    print("Connection has failed with user \(userID)")
}

States

Called when the QBRTCSession state changes. The session state can be new, pending, connecting, connected, or closed.

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session didChangeState:(QBRTCSessionState)state {
    NSLog(@"Session did change state to %tu", state);
}
// MARK: QBRTCClientDelegate
func session(session: QBRTCSession!, didChangeState state: QBRTCSessionState) {
    print("Session did change state to \(state)")
}

Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup, or failed.

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session didChangeConnectionState:(QBRTCConnectionState)state forUser:(NSNumber *)userID {
    NSLog(@"Session did change state to %tu for userID %@", state, userID);
}
// MARK: QBRTCClientDelegate
func session(session: QBRTCSession!, didChangeConnectionState state: QBRTCConnectionState, forUser userID: NSNumber!) {
    print("Session did change state to \(state) for userID \(userID)")
}

Manage remote media tracks

To show video views with the streams you receive from your opponents, create QBRTCRemoteVideoView views in your storyboard and then use the following code:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
//Called when a remote video track is received from an opponent
- (void)session:(QBRTCSession *)session receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack fromUser:(NSNumber *)userID {
 
    // we assume you have created a UIView and set its class to QBRTCRemoteVideoView
    // we also suggest setting its content mode to UIViewContentModeScaleAspectFit or
    // UIViewContentModeScaleAspectFill
    [self.opponentVideoView setVideoTrack:videoTrack];
}
// MARK: QBRTCClientDelegate
 
//Called when a remote video track is received from an opponent
func session(session: QBRTCSession!, receivedRemoteVideoTrack videoTrack: QBRTCVideoTrack!, fromUser userID: NSNumber!) {
    // we assume you have created a UIView and set its class to QBRTCRemoteVideoView
    // we also suggest setting its content mode to UIViewContentModeScaleAspectFit or
    // UIViewContentModeScaleAspectFill
    self.opponentVideoView?.setVideoTrack(videoTrack) 
}

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in the call):

#pragma mark -
#pragma mark QBRTCClientDelegate
 
//Called when a remote audio track is received from an opponent
- (void)session:(QBRTCSession *)session receivedRemoteAudioTrack:(QBRTCAudioTrack *)audioTrack fromUser:(NSNumber *)userID {
 
    // mute specific user audio track here (for example)
    // you can also always do it later by using '[QBRTCSession remoteAudioTrackWithUserID:]' method
    audioTrack.enabled = NO;
}
// MARK: QBRTCClientDelegate
 
//Called when a remote audio track is received from an opponent
func session(session: QBRTCSession!, receivedRemoteAudioTrack audioTrack: QBRTCAudioTrack!, fromUser userID: NSNumber!) {
    // mute specific user audio track here (for example)
    // you can also always do it later by using '[QBRTCSession remoteAudioTrackWithUserID:]' method
    audioTrack.enabled = false
}

You can always get both remote video and audio tracks for a specific user ID in call using these QBRTCSession methods:

/**
 *  Remote audio track with opponent user ID.
 *
 *  @param userID opponent user ID
 *
 *  @return QBRTCAudioTrack audio track instance
 */
- (QBRTCAudioTrack *)remoteAudioTrackWithUserID:(NSNumber *)userID;
 
/**
 *  Remote video track with opponent user ID.
 *
 *  @param userID opponent user ID
 *
 *  @return QBRTCVideoTrack video track instance
 */
- (QBRTCVideoTrack *)remoteVideoTrackWithUserID:(NSNumber *)userID;
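
For example, a minimal sketch combining these methods with the setVideoTrack: call shown above (@3245 is one of the sample opponent IDs used earlier):

// mute the audio of a specific opponent mid-call
QBRTCAudioTrack *audioTrack = [self.session remoteAudioTrackWithUserID:@3245];
audioTrack.enabled = NO;
 
// render that opponent's video into an existing QBRTCRemoteVideoView
QBRTCVideoTrack *videoTrack = [self.session remoteVideoTrackWithUserID:@3245];
[self.opponentVideoView setVideoTrack:videoTrack];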

Manage local video track

To show your local video track from the camera, create a UIView in your storyboard and then use the following code:

// your view controller interface code
 
@interface CallController()
@property (weak, nonatomic) IBOutlet UIView *localVideoView; // your video view to render local camera video stream
@property (strong, nonatomic) QBRTCCameraCapture *videoCapture;
@property (strong, nonatomic) QBRTCSession *session;
@end
 
@implementation CallController
 
- (void)viewDidLoad {
    [super viewDidLoad];
    [[QBRTCClient instance] addDelegate:self];
 
    QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
    videoFormat.frameRate = 30;
    videoFormat.pixelFormat = QBRTCPixelFormat420f;
    videoFormat.width = 640;
    videoFormat.height = 480;
 
    // QBRTCCameraCapture class used to capture frames using AVFoundation APIs
    self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront]; // or AVCaptureDevicePositionBack
 
    // add video capture to session's local media stream
    // from version 2.3 you no longer need to wait for 'initializedLocalMediaStream:' delegate to do it
    self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
 
    self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
    [self.videoCapture startSession];
 
    [self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];
 
   // start call
}
 
...
 
@end
// your view controller interface code
 
import Foundation
 
class CallController: UIViewController,QBRTCClientDelegate {
 
    @IBOutlet weak var localVideoView : UIView! // your video view to render local camera video stream
 
    var videoCapture: QBRTCCameraCapture?
    var session: QBRTCSession?    
 
    override func viewDidLoad() {
        QBRTCClient.instance().addDelegate(self)
 
        let videoFormat = QBRTCVideoFormat.init()
        videoFormat.frameRate = 30
        videoFormat.pixelFormat = QBRTCPixelFormat.Format420f
        videoFormat.width = 640
        videoFormat.height = 480
 
        // QBRTCCameraCapture class used to capture frames using AVFoundation APIs
        self.videoCapture = QBRTCCameraCapture.init(videoFormat: videoFormat, position: AVCaptureDevicePosition.Front)
 
        // add video capture to session's local media stream
        // from version 2.3 you no longer need to wait for 'initializedLocalMediaStream:' delegate to do it
        self.session!.localMediaStream.videoTrack.videoCapture = self.videoCapture
 
        self.videoCapture!.previewLayer.frame = self.localVideoView.bounds
        self.videoCapture!.startSession()
 
        self.localVideoView.layer.insertSublayer(self.videoCapture!.previewLayer, atIndex: 0)
 
        // start call
    }
 
    //...
}

Hang up

To hang up a call:

NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session hangUp:userInfo];
let userInfo :[String:String] = ["key":"value"]
self.session?.hangUp(userInfo)

After this your opponents will receive a hangUp signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session hungUpByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    //For example: update the GUI
    //
    // Or
    /**
        HangUp when initiator ended a call
    */
    if ([session.initiatorID isEqualToNumber:userID]) {
        [session hangUp:@{}];
    }
}
// MARK: QBRTCClientDelegate
 
func session(session: QBRTCSession!, hungUpByUser userID: NSNumber!, userInfo: [NSObject : AnyObject]!) {
    //For example: update the GUI
    //
    // Or
    /**
     HangUp when initiator ended a call
     */
    if session.initiatorID.isEqualToNumber(userID) {
        session.hangUp(userInfo)
    }
}

Then, if all opponents are inactive, QBRTCClient delegates will be notified with:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)sessionDidClose:(QBRTCSession *)session {
 
    // release session instance
    self.session = nil;
}
// MARK: QBRTCClientDelegate
 
func sessionDidClose(session: QBRTCSession!) {
 
    // release session instance
    self.session = nil;
}

Disable / enable audio stream

You can disable / enable the audio stream during a call:

self.session.localMediaStream.audioTrack.enabled ^= 1;
self.session!.localMediaStream.audioTrack.enabled = !self.session!.localMediaStream.audioTrack.enabled

Please note: due to WebRTC restrictions, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:

self.session.localMediaStream.videoTrack.enabled ^= 1;
 self.session!.localMediaStream.videoTrack.enabled = !self.session!.localMediaStream.videoTrack.enabled

Please note: due to WebRTC restrictions, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (Default: front camera):

'videoCapture' below is the QBRTCCameraCapture instance described in CallController above.

// to set default (preferred) camera position
- (void)viewDidLoad {
    [super viewDidLoad];
    QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
    videoFormat.frameRate = 30;
    videoFormat.pixelFormat = QBRTCPixelFormat420f;
    videoFormat.width = 640;
    videoFormat.height = 480;
 
    self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront]; // or AVCaptureDevicePositionBack
}
 
// to change some time after, for example, at the moment of call
 
AVCaptureDevicePosition position = self.videoCapture.position;
AVCaptureDevicePosition newPosition = position == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
 
// check whether videoCapture has or has not camera position
// for example, some iPods do not have front camera 
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
// to set default (preferred) camera position
override func viewDidLoad() {
 
    super.viewDidLoad()
 
    let videoFormat = QBRTCVideoFormat.init()
    videoFormat.frameRate = 30
    videoFormat.pixelFormat = QBRTCPixelFormat.Format420f
    videoFormat.width = 640
    videoFormat.height = 480
 
    // QBRTCCameraCapture class used to capture frames using AVFoundation APIs
    self.videoCapture = QBRTCCameraCapture.init(videoFormat: videoFormat, position: AVCaptureDevicePosition.Front) // or AVCaptureDevicePosition.Back
}
 
// to change some time after, for example, at the moment of call
let position = self.videoCapture?.position
let newPosition = position == AVCaptureDevicePosition.Front ? AVCaptureDevicePosition.Back : AVCaptureDevicePosition.Front
 
// check whether videoCapture has or has not camera position
// for example, some iPods do not have front camera
if self.videoCapture!.hasCameraForPosition(newPosition) {
    self.videoCapture!.position = newPosition
}

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated as of version 2.3; from now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, but are more customizable and conform to more requirements.

//Save current audio configuration before start call or accept call
[[QBRTCAudioSession instance] initialize];
//OR you can initialize audio session with a specific configuration
[[QBRTCAudioSession instance] initializeWithConfigurationBlock:^(QBRTCAudioSessionConfiguration *configuration) {
    // adding Bluetooth support
     configuration.categoryOptions |= AVAudioSessionCategoryOptionAllowBluetooth;
     configuration.categoryOptions |= AVAudioSessionCategoryOptionAllowBluetoothA2DP;
 
    // adding AirPlay support
     configuration.categoryOptions |= AVAudioSessionCategoryOptionAllowAirPlay;
 
     if (_session.conferenceType == QBRTCConferenceTypeVideo) {
        // setting mode to video chat to enable airplay audio and speaker only for video call
          configuration.mode = AVAudioSessionModeVideoChat;
    }
}];
//Set headphone or phone receiver
[QBRTCAudioSession instance].currentAudioDevice = QBRTCAudioDeviceReceiver;
//or set speaker
[QBRTCAudioSession instance].currentAudioDevice = QBRTCAudioDeviceSpeaker;
//deinitialize after session close 
[[QBRTCAudioSession instance] deinitialize];
//Save current audio configuration before start call or accept call
QBRTCAudioSession.instance().initialize()
//OR you can initialize audio session with a specific configuration
QBRTCAudioSession.instance().initializeWithConfigurationBlock { (configuration) -> () in
    // adding Bluetooth support
    configuration.categoryOptions.insert(.AllowBluetooth)
    configuration.categoryOptions.insert(.AllowBluetoothA2DP)
 
    // adding AirPlay support
    configuration.categoryOptions.insert(.AllowAirPlay)
 
    if self.session!.conferenceType == QBRTCConferenceType.Video {
        // setting mode to video chat to enable AirPlay audio and speaker only for video calls
        configuration.mode = AVAudioSessionModeVideoChat
    }
}
//Set headphone or phone receiver
QBRTCAudioSession.instance().currentAudioDevice = QBRTCAudioDeviceReceiver
//or set speaker
QBRTCAudioSession.instance().currentAudioDevice = QBRTCAudioDeviceSpeaker
//deinitialize after session close 
QBRTCAudioSession.instance().deinitialize()

QBRTCAudioSession also has a delegate protocol with helpful methods:

/**
 *  Notifying about current audio device being updated by QBRTCAudioSession.
 *
 *  @param audioSession        QBRTCAudioSession instance
 *  @param updatedAudioDevice  new audio device
 *
 *  @discussion Called, for example, when headphones plugged in. In that case audio will automatically be updated from speaker/receiver to headphones. Headphones are considered to be receiver. You can use this delegate to keep your current audio device state up-to-date in your UI.
 *
 *  @note Only called if audio device was changed by QBRTCAudioSession itself, and not on user request.
 */
- (void)audioSession:(QBRTCAudioSession *)audioSession didChangeCurrentAudioDevice:(QBRTCAudioDevice)updatedAudioDevice;
 
/**
 *  Notifying when audio device change on user request was failed.
 *
 *  @param audioSession QBRTCAudioSession instance
 *  @param error        error
 *
 *  @discussion Called when audio device change is not possible. For example, when audio session options set to speaker only, you cannot update device to receiver, etc.
 */
- (void)audioSession:(QBRTCAudioSession *)audioSession didFailToChangeAudioDeviceWithError:(NSError *)error;
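
A minimal sketch of implementing this delegate; the audioDeviceLabel outlet is hypothetical, and we assume your class has been registered as an audio session delegate:

- (void)audioSession:(QBRTCAudioSession *)audioSession didChangeCurrentAudioDevice:(QBRTCAudioDevice)updatedAudioDevice {
 
    // keep the UI in sync with the device chosen by QBRTCAudioSession
    BOOL speaker = updatedAudioDevice == QBRTCAudioDeviceSpeaker;
    self.audioDeviceLabel.text = speaker ? @"Speaker" : @"Receiver / headphones";
}
 
- (void)audioSession:(QBRTCAudioSession *)audioSession didFailToChangeAudioDeviceWithError:(NSError *)error {
 
    // the requested device is not available with the current session options
    NSLog(@"Failed to change audio device: %@", error);
}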

QBRTCAudioSession also introduces some new properties that might be helpful:

/**
 *  Determines whether QBRTCAudioSession is initialized and have saved previous active audio session settings.
 */
@property (nonatomic, readonly, getter=isInitialized) BOOL initialized;
 
/**
 *  Represents permission for WebRTC to initialize the VoIP audio unit.
 *  When set to NO, if the VoIP audio unit used by WebRTC is active, it will be
 *  stopped and uninitialized. This will stop incoming and outgoing audio.
 *  When set to YES, WebRTC will initialize and start the audio unit when it is
 *  needed (e.g. due to establishing an audio connection).
 *  This property was introduced to work around an issue where if an AVPlayer is
 *  playing audio while the VoIP audio unit is initialized, its audio would be
 *  either cut off completely or played at a reduced volume. By preventing
 *  the audio unit from being initialized until after the audio has completed,
 *  we are able to prevent the abrupt cutoff.
 *
 *  @remark As an issue is only affecting AVPlayer, default value is always YES.
 */
@property (assign, nonatomic, getter=isAudioEnabled) BOOL audioEnabled;
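
For example, a hedged sketch of the AVPlayer workaround described above, toggling audioEnabled around playback (self.player is a hypothetical AVPlayer property):

// stop and uninitialize the VoIP audio unit before AVPlayer playback
[QBRTCAudioSession instance].audioEnabled = NO;
[self.player play];
 
// once playback finishes, let WebRTC start the audio unit again when needed
[QBRTCAudioSession instance].audioEnabled = YES;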

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state.

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this array. Do not add voip to this array. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.
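
For reference, the resulting Info.plist entry looks like this:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>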

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.


When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session — in this case, your app.


Screen sharing

We are happy to introduce a new feature of the QuickbloxWebRTC SDK: screen sharing.


Screen sharing allows you to share information from your application to all of your opponents.

It gives you the ability to promote your product, share a screen with formulas to students, distribute podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application we give you the ability to create custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting from this class you can provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

/**
 *  By default, screen sharing sends frames using the BiPlanarFullRange pixel format type.
 *  You can also send them as ARGB by setting this constant to NO.
 */
static const BOOL kQBRTCUseBiPlanarFormatTypeForShare = YES;
 
@interface QBRTCScreenCapture()
 
@property (weak, nonatomic) UIView * view;
@property (strong, nonatomic) CADisplayLink *displayLink;
 
@end
 
@implementation QBRTCScreenCapture
 
- (instancetype)initWithView:(UIView *)view {
 
    self = [super init];
    if (self) {
 
        _view = view;
    }
 
    return self;
}
 
#pragma mark - Enter BG / FG notifications
 
- (void)willEnterForeground:(NSNotification *)note {
 
    self.displayLink.paused = NO;
}
 
- (void)didEnterBackground:(NSNotification *)note {
 
    self.displayLink.paused = YES;
}
 
#pragma mark -
 
- (UIImage *)screenshot {
 
    UIGraphicsBeginImageContextWithOptions(_view.frame.size, YES, 1);
    [_view drawViewHierarchyInRect:_view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
 
    return image;
}
 
- (CIContext *)qb_sharedGPUContext {
    static CIContext *sharedContext;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        NSDictionary *options = @{
                                  kCIContextPriorityRequestLow: @YES
                                  };
        sharedContext = [CIContext contextWithOptions:options];
    });
    return sharedContext;
}
 
- (void)sendPixelBuffer:(CADisplayLink *)sender {
 
    dispatch_async(self.videoQueue, ^{
 
        @autoreleasepool {
 
            UIImage *image = [self screenshot];
 
            int renderWidth = image.size.width;
            int renderHeight = image.size.height;
 
            CVPixelBufferRef buffer = NULL;
 
            OSType pixelFormatType;
            CFDictionaryRef pixelBufferAttributes = NULL;
            if (kQBRTCUseBiPlanarFormatTypeForShare) {
 
                pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange;
                pixelBufferAttributes = (__bridge CFDictionaryRef) @
                {
                    (__bridge NSString *)kCVPixelBufferIOSurfacePropertiesKey: @{},
                };
            }
            else {
 
                pixelFormatType = kCVPixelFormatType_32ARGB;
                pixelBufferAttributes = (__bridge CFDictionaryRef) @
                {
                    (NSString *)kCVPixelBufferCGImageCompatibilityKey : @NO,
                    (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey : @NO
                };
 
            }
 
            CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                                  renderWidth,
                                                  renderHeight,
                                                  pixelFormatType,
                                                  pixelBufferAttributes,
                                                  &buffer);
 
            if (status == kCVReturnSuccess && buffer != NULL) {
 
                CVPixelBufferLockBaseAddress(buffer, 0);
 
                if (kQBRTCUseBiPlanarFormatTypeForShare) {
 
                    CIImage *rImage = [[CIImage alloc] initWithImage:image];
                    [self.qb_sharedGPUContext render:rImage toCVPixelBuffer:buffer];
                }
                else {
 
                    void *pxdata = CVPixelBufferGetBaseAddress(buffer);
 
                    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
 
                    uint32_t bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst;
 
                    CGContextRef context =
                    CGBitmapContextCreate(pxdata, renderWidth, renderHeight, 8, renderWidth * 4, rgbColorSpace, bitmapInfo);
                    CGContextDrawImage(context, CGRectMake(0, 0, renderWidth, renderHeight), [image CGImage]);
                    CGColorSpaceRelease(rgbColorSpace);
                    CGContextRelease(context);
                }
 
                CVPixelBufferUnlockBaseAddress(buffer, 0);
 
                QBRTCVideoFrame *videoFrame = [[QBRTCVideoFrame alloc] initWithPixelBuffer:buffer
                                                                             videoRotation:QBRTCVideoRotation_0];
 
                [super sendVideoFrame:videoFrame];
            }
 
            CVPixelBufferRelease(buffer);
        }
    });
}
 
#pragma mark - <QBRTCVideoCapture>
 
- (void)didSetToVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didSetToVideoTrack:videoTrack];
 
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(sendPixelBuffer:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    self.displayLink.frameInterval = 12; //5 fps
 
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(willEnterForeground:)
                                                 name:UIApplicationWillEnterForegroundNotification
                                               object:nil];
 
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(didEnterBackground:)
                                                 name:UIApplicationDidEnterBackgroundNotification
                                               object:nil];
}
 
- (void)didRemoveFromVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didRemoveFromVideoTrack:videoTrack];
 
    self.displayLink.paused = YES;
    [self.displayLink removeFromRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    self.displayLink = nil;
 
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:UIApplicationWillEnterForegroundNotification
                                                  object:nil];
 
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:UIApplicationDidEnterBackgroundNotification
                                                  object:nil];
}
 
@end

To link this capture to your local video track simply use:

//Save previous video capture
self.capture = self.session.localMediaStream.videoTrack.videoCapture;
self.screenCapture = [[QBRTCScreenCapture alloc] initWithView:self.view];
//Switch to sharing
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture; // here videoTrack calls didSetToVideoTrack:
//Save previous video capture
self.capture = self.session!.localMediaStream.videoTrack.videoCapture
self.screenCapture = QBRTCScreenCapture.init(view: self.view)
//Switch to sharing
self.session!.localMediaStream.videoTrack.videoCapture = self.screenCapture // here videoTrack calls didSetToVideoTrack:
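
When sharing ends, you can switch back to the camera capture you saved:

// restore the previous camera capture after screen sharing ends
self.session.localMediaStream.videoTrack.videoCapture = self.capture;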

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

- (void)sendPushToOpponentsAboutNewCall {
    NSString *currentUserLogin = [[[QBSession currentSession] currentUser] login];
    [QBRequest sendPushWithText:[NSString stringWithFormat:@"%@ is calling you", currentUserLogin]
               toUsers:[self.session.opponentsIDs componentsJoinedByString:@","]
               successBlock:^(QBResponse * _Nonnull response, NSArray<QBMEvent *> * _Nullable events) {
        NSLog(@"Push sent!");
    } errorBlock:^(QBError * _Nullable error) {
        NSLog(@"Can not send push: %@", error);
    }];
}
func sendPushToOpponentsAboutNewCall() {
 
    let opponentsIDs = (self.session!.opponentsIDs as NSArray).componentsJoinedByString(",")
    let currentUserLogin = QBSession.currentSession().currentUser!.login
 
    QBRequest.sendPushWithText("\(currentUserLogin) is calling you", toUsers: opponentsIDs, successBlock: { (response: QBResponse, event:[QBMEvent]?) in
        print("Push sent!")
    }) { (error:QBError?) in
        print("Can not send push: \(error)")
    }
}

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.
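
For example, right after starting a call (a sketch reusing newSession and userInfo from the Call users section):

[newSession startCall:userInfo];
[self sendPushToOpponentsAboutNewCall];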

If the application is in the background, the opponent will see a push notification.


If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:

[QBRTCConfig setStatsReportTimeInterval:5]; // 5 seconds
QBRTCConfig.setStatsReportTimeInterval(5) // 5 seconds

Classes that adopt the QBRTCClientDelegate protocol will then be notified with:

- (void)session:(QBRTCSession *)session updatedStatsReport:(QBRTCStatsReport *)report forUserID:(NSNumber *)userID {
    double audioReceivedBitrate = report.audioReceivedBitrateTracker.bitrate;
    double videoReceivedBitrate = report.videoReceivedBitrateTracker.bitrate;
 
    // works even if the mic is disabled (audioTrack.isEnabled == NO)
    NSNumber *audioSendInputLevel = @(report.audioSendInputLevel); 
}
    func session(session: QBRTCSession!, updatedStatsReport report: QBRTCStatsReport!, forUserID userID: NSNumber!) {
 
        let audioReceivedBitrate = report.audioReceivedBitrateTracker.bitrate
        let videoReceivedBitrate = report.videoReceivedBitrateTracker.bitrate
 
        // works even if the mic is disabled (audioTrack.isEnabled == false)
        let audioSendInputLevel = report.audioSendInputLevel
    }

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also get an already parsed, readable string with the most important stats for the current report; just use this method:

/**
 *  Parsing all reasonable stats into readable string.
 *
 *  @code
 (cpu)61%
 CN 565ms | local->local/udp | (s)248Kbps | (r)869Kbps
 VS (input) 640x480@30fps | (sent) 640x480@30fps
 VS (enc) 279Kbps/260Kbps | (sent) 200Kbps/292Kbps | 8ms | H264
 AvgQP (past 30 encoded frames) = 36
 VR (recv) 640x480@26fps | (decoded)27 | (output)27fps | 827Kbps/0bps | 4ms
 AS 38Kbps | opus
 AR 37Kbps | opus | 168ms | (expandrate)0.190002
 Packets lost: VS 17 | VR 0 | AS 3 | AR 0
 *  @endcode
 *
 *  @return Readable stats
 */
- (NSString *)statsString;
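
For example, inside the stats delegate shown earlier you can simply log it (a minimal sketch):

- (void)session:(QBRTCSession *)session updatedStatsReport:(QBRTCStatsReport *)report forUserID:(NSNumber *)userID {
 
    // print the parsed, human-readable stats for this report
    NSLog(@"%@", [report statsString]);
}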

Settings

You can change different settings for a session.

Set answer time interval

If an opponent does not answer within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

[QBRTCConfig setAnswerTimeInterval:45];
QBRTCConfig.setAnswerTimeInterval(45)

If the user does not respond within the given interval, the following delegate method will be called:

- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID;
func session(session: QBRTCSession!, userDidNotRespond userID: NSNumber!)

Set dialing time interval

Indicates how often call notifications are sent to your opponents.

Default value: 5 seconds

Minimum value: 3 seconds

[QBRTCConfig setDialingTimeInterval:5];
QBRTCConfig.setDialingTimeInterval(5)

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

[QBRTCConfig setDTLSEnabled:YES];
QBRTCConfig.setDTLSEnabled(true)

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity-checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here are the default settings we use; you can customize all of them or only particular ones:

NSString *userName = @"quickblox";
NSString *password = @"baccb97ba2d92d71e26eb9886da5f1e0";
 
NSArray *urls = @[
    @"stun:turn.quickblox.com",
    @"turn:turn.quickblox.com:3478?transport=udp",
    @"turn:turn.quickblox.com:3478?transport=tcp"
];
 
QBRTCICEServer *server = [QBRTCICEServer serverWithURLs:urls username:userName password:password];
[QBRTCConfig setICEServers:@[server]];
let username = "quickblox"
let password = "baccb97ba2d92d71e26eb9886da5f1e0"
 
let urls = [
    "stun:turn.quickblox.com",
    "turn:turn.quickblox.com:3478?transport=udp",
    "turn:turn.quickblox.com:3478?transport=tcp"
]
 
let server = QBRTCICEServer.init(urls: urls, username: username, password: password)
QBRTCConfig.setICEServers([server])
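
A hedged sketch of registering the regional TURN servers mentioned above alongside the default one (we assume the same credentials and URL format apply; verify against your account settings):

QBRTCICEServer *singapore = [QBRTCICEServer serverWithURLs:@[@"turn:turnsingapore.quickblox.com:3478?transport=udp"]
                                                  username:userName
                                                  password:password];
QBRTCICEServer *ireland = [QBRTCICEServer serverWithURLs:@[@"turn:turnireland.quickblox.com:3478?transport=udp"]
                                                 username:userName
                                                 password:password];
[QBRTCConfig setICEServers:@[server, singapore, ireland]];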

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in version 50.

H264 is the only video codec on iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle Full HD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-user calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method to get all supported formats for the current device:

/**
 *  Retrieve available array of QBRTCVideoFormat instances for given camera position.
 *
 *  @param position requested camera position
 *
 *  @return Array of possible QBRTCVideoFormat video formats for requested position
 */
+ (NSArray <QBRTCVideoFormat *> *)formatsWithPosition:(AVCaptureDevicePosition)position;
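
For example, a minimal sketch that logs every supported front-camera format (width, height and frameRate are the same QBRTCVideoFormat properties used earlier in this guide):

NSArray<QBRTCVideoFormat *> *formats = [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
 
for (QBRTCVideoFormat *format in formats) {
    // pick the format that matches your quality/performance trade-off
    NSLog(@"Supported: %tux%tu @ %tu fps", format.width, format.height, format.frameRate);
}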

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent device in the call does not support H264, VP8 will be used automatically.

2. If both the caller and callee support H264, then H264 will be used (see the sketch below for setting the preferred codec).
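
A hedged sketch of setting the preferred video codec explicitly, assuming QBRTCMediaStreamConfiguration exposes a videoCodec property analogous to the audioCodec property shown in the next section:

QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.videoCodec = QBRTCVideoCodecH264; // assumed enum value; the SDK falls back to VP8 when H264 is unsupported
[QBRTCConfig setMediaStreamConfiguration:conf];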

Audio codecs: OPUS vs iSAC vs iLBC

Opus
In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrates: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC
This codec was developed specifically for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it is designed to be robust in such cases.

Supported bitrates: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

Conclusion
When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec

QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodeciLBC;
[QBRTCConfig setMediaStreamConfiguration:conf];
let conf = QBRTCMediaStreamConfiguration.defaultConfiguration()
conf.audioCodec = QBRTCAudioCodec.CodeciLBC
QBRTCConfig.setMediaStreamConfiguration(conf)

Framework changelog

v2.4.1 - Feb 22, 2017

  • Multiple renderers support


v2.4 - Jan 27, 2017

  • WebRTC r 16262
  • [Framework]
    • Built as dynamic framework.
      • Run Script Phase: Don't forget to add a Run Script Phase to your project in order for the framework to compile successfully when archiving.
      • Embedded binary: If you are not using CocoaPods and have QuickbloxWebRTC installed manually, don't forget to add it as an embedded binary.
        • Developer comment: From now on the framework is built and distributed as a dynamic framework. You can read more about it here. This also allows the use of the use_frameworks! flag in a Podfile that uses the QuickbloxWebRTC framework as one of its pods.
    • Added full bitcode support.
      • Developer comment: The QuickbloxWebRTC framework now fully supports bitcode. You can enable bitcode in your project (as long as every other framework you use also supports it), and we highly encourage you to do so. You can read more about bitcode here.
    • Reduced size of our framework from 84.3 mb to 67.4 mb.
      • Developer comment: Bitcode support and the dynamic framework itself have allowed us to reduce the framework size by ~21%.
  • [Core]
    • Fixed - (void)session:(QBRTCSession *)session hungUpByUser:(NSNumber *)userID userInfo:(NSDictionary<NSString *,NSString *> *)userInfo delegate method of QBRTCClientDelegate protocol not being called (thanks to neshyurik for reporting the issue).
    • Fixed crash after accepting the incoming call that was cancelled by the caller.
  • Sample video chat.
    • Improved screen sharing codebase in QBRTCScreenCapture class. Now using kCVPixelFormatType_420YpCbCr8BiPlanarFullRange pixel format (better performance).
    • Reduced size of icons in sample.


v2.3.1 - Jan 4, 2017
  • Fixed TURN/STUN Servers setup. Thanks GiacomoSilli for reporting the issue.
  • Added missing RTCVideoFrame.h file


v2.3 - Dec 30, 2016
  • WebRTC r 15791
  • Dropped iOS 7 support.
  • Added generics and nullability support to the project (Xcode 7+).
  • Deprecated QBRTCSoundRouter class. You should now use QBRTCAudioSession class instead.
    • Developer comment: The interface remained mostly the same, but with new features and lots of bugs fixed. Check out the Audio Session section for more information and guidelines. QBRTCAudioSession fixes lots of bugs that QBRTCSoundRouter had, including Bluetooth-specific bugs, bad sound, and headphone transitions.
  • Deprecated setDisconnectTimeInterval: and disconnectTimeInterval in QBRTCConfig class. No longer in use due to updated WebRTC specification.
  • defaultConfiguration instance of QBRTCMediaStreamConfiguration will now set H264 codec for video as default (was VP8 previously).
    • Developer comment: H264 is a hardware-supported video codec on iOS, but it only worked on iOS 8+. As we have dropped iOS 7, there is no need to set VP8 as the default anymore. VP8 will still be set automatically if any of the opponent devices does not support H264 and, of course, you can still set it as preferred if needed.
  • QBRTCCameraCapture class:
    • Deprecated startSession method. Use startSession: instead (with completion block).
    • Deprecated stopSession method. Use stopSession: instead (with completion block).
    • Deprecated stopSessionAndTeardownOutputs: method. Use stopSession: instead (with completion block).
    • Deprecated selectCameraPosition:. Use setPosition: instead.
    • Deprecated currentPosition. Use position instead.
    • Added hasStarted boolean property. Determines whether camera capture has started, but is not running yet (in set-up state).
    • Added isRunning boolean property. Determines whether capture session is running.
  • Deprecated initWithPixelBuffer: in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.
  • QBRTCSession class:
    • Added currentUserID NSNumber property. Returns current user ID, mapped into NSNumber class.
    • Added state QBRTCSessionState property. Represents current state of session. Might be: new, pending, connecting, connected and closed. Check out QBRTCSessionState for more information.
    • Added generics for all collections used in current class.
    • Added remoteAudioTrackWithUserID: method. Use it to get remote audio track for a specific opponent user ID.
      • Developer comment: You can use this method to get a specific opponent audio track by his user ID. Example: you can mute a specific user in session.
  • QBRTCClientDelegate protocol:
    • Added generics for every collection class that is used.
    • Added session:didChangeState: delegate. Called when session state has been changed. Use this to track a session state. As SDK 2.3 introduced states for session, you can now manage your own states based on this.
    • Deprecated session:initializedLocalMediaStream: delegate. The local media stream is initialized together with the session and can now be configured from the beginning without waiting for this delegate.
      • Developer comment: Previously we were forced to wait for this delegate in order to do any manipulations with our local media stream (for example to mute our audio or video, or add video capture to video track). From now on it is possible to do right after the session initialization.
    • Deprecated session:disconnectedByTimeoutFromUser: delegate as disconnectTimeInterval is no longer in use due to WebRTC specification.
    • Added session:receivedRemoteAudioTrack delegate. Called when received remote audio track from user.
    • Added session:didChangeConnectionState: delegate. Called when session connection state changed for a specific user.
  • Added statsString method in QBRTCStatsReport class. Use it to get a readable string of call stats.
    • Developer comment: Previously you would need to parse all stats returned by session:updatedStatsReport:forUserID: delegate by yourself. It was not the easiest job to do. From now on you can just call statsString method to get all important stats with description as NSString string.
  • Introducing the UIDevice+QBPerformance category. This category was designed to help you understand what performance your device can handle by our standards. Use it to check whether your device is in the low or medium performance category. This should help you determine whether you need to hardcode low/medium resolutions and a low/medium audio codec in order to let the device perform well without problems.
    • qbrtc_lowPerformance boolean property. Determines whether device is in low performance category and should be treated like that.
    • qbrtc_mediumPerformance boolean property. Determines whether device is in medium performance category and should be treated like that.
    • Full list of such devices available here.
  • QBRTCRemoteVideoView class:
    • Remote video view is now based on OpenGL only.
    • Added videoGravity property. Options are AVLayerVideoGravityResizeAspect, AVLayerVideoGravityResizeAspectFill and AVLayerVideoGravityResize. AVLayerVideoGravityResizeAspect is default. See <AVFoundation/AVAnimation.h> for a description of these options.
    • Implemented QBRTCRemoteVideoViewDelegate protocol with videoView:didChangeVideoSize: method. Called when video view size was changed.
  • Deprecated the old QBRTCConnectionState enum names (which were missing State in their names) and added new ones that conform to the QBRTCConnectionState name.


v2.2 — March 15, 2016
  • WebRTC r 11951
  • Fixed Video capture session restoring after background if H264 was used.
  • Added Bluetooth support in QBRTCSoundRouter.
  • QBRTCSoundRouter has been rearchitectured.
  • We improved the speed of switching between Speaker/Receiver/Headset/Bluetooth devices.
  • QBRTCSoundRouter automatically switches to a headset or Bluetooth device if available.
  • Headset has the highest priority, then Bluetooth, then speaker/receiver.

To check if bluetooth device is available use

[[QBRTCSoundRouter instance] isBluetoothPluggedIn]
QBRTCSoundRouter.instance().isBluetoothPluggedIn


v2.1.1 — November 19, 2015

WebRTC r 10677
Fixed Xcode build warnings.
Fixed compiler warnings.
Small bug fixes.


v2.1 — November 17, 2015

WebRTC r 10677
Added experimental WebRTC stats reporting (see QBRTCStatsReport); will be refactored next release.
Fixed compiler warnings.
Small bug fixes.


v2.0 — November 4, 2015

WebRTC r 10505

1. Fixed performance issues on iPhone 4s, improved stability on multi-calls
2. Improved stability at low internet speed connection
3. Added support for H264 hardware video codec on iOS 8+
4. Added custom renderers and custom capture to send your custom frames
5. From this version you are able to configure:

Video

  • Quality, pixel format, frames per second (FPS) and bandwidth
  • Choose whether to use VP8 or H264 video codec

Audio

  • Quality and bandwidth
  • Choose Opus, ISAC or iLBC audio codec

6. Sample-video-chat rearchitecture
7. Removed local video track
8. Added remote video track (see QBRTCRemoteVideoView)
9. Full support of AVCaptureSession
10. Improved performance in rendering the local video track


v1.0.6 — June 17, 2015
  • WebRTC r 9446
  • Support Quickblox.framework v 2.3


v1.0.5 — June 15, 2015
  • Added iOS simulator 64 bit
  • Fixed crash (ISAC for armv7 devices)
  • WebRTC r 9437
  • Added #import <AVFoundation/AVFoundation.h> to QBRTCSession


v1.0.4 — May 20, 2015
  • Remove deprecated methods
  • Updated Background mode
  • WebRTC r 9234
  • added captureSession field to QBRTCSession


v1.0.3 — April 15, 2015
  • Stability improvement
  • WebRTC r 9004
  • Added captureSession field to QBRTCSession
  • Decreased SDK binary size


v1.0.2 — March 17, 2015
  • Stability improvement
  • WebRTC r 8729
  • added audioCategoryOptions field to QBRTCSession
  • added skipBlackFrames field to QBGLVideoView (experimental, deprecated since 1.0.6)
  • Fixes for camera switching


v1.0.1 — Feb 27, 2015
  • WebRTC r8442
  • Enable / Disable Datagram Transport Layer Security +[QBRTCConfig setDTLSEnabled:]