From QuickBlox Developers (API docs, code samples, SDK)



Project homepage on GIT

Download ZIP

The VideoChat code sample allows you to easily add video calling and audio calling features into your iOS app. Enable a video call function similar to FaceTime or Skype using this code sample as a basis.

It is built on top of WebRTC technology.

Check out the new feature of the QuickbloxWebRTC SDK: Screen sharing.


System requirements

  • The QuickbloxWebRTC.framework supports the following:
    • Quickblox.framework v2.3 (pod QuickBlox)
    • iPhone 4S+.
    • iPad 2+.
    • iPod Touch 5+.
    • iOS 7+.
    • iOS simulator 32/64 bit
    • Wi-Fi and 4G/LTE connections.

Getting Started with Video Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using third-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following commands in Terminal:

$ sudo gem install cocoapods
$ pod setup

Step 2: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

$ touch Podfile
$ open -e Podfile

TextEdit should open showing an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

source ''
platform :ios, '7.0'
pod 'Quickblox-WebRTC', '~> 2.2'
pod 'QuickBlox'

Step 3: Installing Dependencies

Now you can install the dependencies in your project:

$ pod install

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

$ open ProjectName.xcworkspace

Step 4: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:

#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

Add the Framework to your Xcode Project

Please note that the Quickblox iOS SDK is required for apps using QuickbloxWebRTC.

Step 1: Download & unzip the Framework

Install Quickblox iOS SDK


Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination's group folder" checkbox is checked.


Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework


Step 4: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:

#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

Life cycle

// Initialize QuickbloxWebRTC and configure signaling
// You should call this method before any interaction with QuickbloxWebRTC
[QBRTCClient initializeRTC];
// Call this method when you finish your work with QuickbloxWebRTC
[QBRTCClient deinitializeRTC];

Call users

To call users just use this method:

[QBRTCClient.instance addDelegate:self]; // self class must conform to QBRTCClientDelegate protocol
// 3245, 2123, 3122 - opponents' user IDs
NSArray *opponentsIDs = @[@3245, @2123, @3122];
QBRTCSession *newSession = [QBRTCClient.instance createNewSessionWithOpponents:opponentsIDs
                                                            withConferenceType:QBRTCConferenceTypeVideo];
// userInfo - the custom user information dictionary for the call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[newSession startCall:userInfo];

After this your opponents (users with IDs 3245, 2123 and 3122) will receive one call request every 5 seconds for up to 45 seconds (you can configure these settings with QBRTCConfig):

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)didReceiveNewSession:(QBRTCSession *)session userInfo:(NSDictionary *)userInfo {
    if (self.session) {
        // we already have a video/audio call session, so we reject another one
        // userInfo - the custom user information dictionary for the call from caller. May be nil.
        NSDictionary *userInfo = @{ @"key" : @"value" };
        [session rejectCall:userInfo];
        return;
    }
    self.session = session;
}

self.session refers to the current session. Each audio/video call has a unique session ID, which allows you to run more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set to 60 seconds:

[QBRTCConfig setAnswerTimeInterval:60];

Accept a call

To accept a call request just use this method:

// userInfo - the custom user information dictionary for the accept call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session acceptCall:userInfo];

After this your opponent will receive an accept signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session acceptedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    NSLog(@"Call accepted by user %@", userID);
}

Reject a call

To reject a call request just use this method:

// userInfo - the custom user information dictionary for the reject call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session rejectCall:userInfo];
// and release session instance
self.session = nil;

After this your opponent will receive a reject signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session rejectedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    NSLog(@"Rejected by user %@", userID);
}

Connection life-cycle

Called when the local media stream has successfully initialized itself and configured its tracks; this happens after the startCall: or acceptCall: methods.

After initialization you are able to set a video capture:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session initializedLocalMediaStream:(QBRTCMediaStream *)mediaStream {
    NSLog(@"Initialized local media stream %@", mediaStream);
    mediaStream.videoTrack.videoCapture = self.capture;
}

Called when a connection with a user is initiated:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
    NSLog(@"Started connecting to user %@", userID);
}

Called when the connection with a user is closed:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
    NSLog(@"Connection is closed for user %@", userID);
}

Called when the connection is established with a user:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
    NSLog(@"Connection is established with user %@", userID);
}

Called when a user is disconnected:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    NSLog(@"Disconnected from user %@", userID);
}

- (void)session:(QBRTCSession *)session disconnectedByTimeoutFromUser:(NSNumber *)userID {
    // use [QBRTCConfig setDisconnectTimeInterval:value] to set the disconnect time interval
    NSLog(@"Disconnected from user %@ by timeout", userID);
}

Called when a user did not respond to your call within the timeout.
Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval.

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond to your call within timeout", userID);
}

Called when the connection with a user has failed:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    NSLog(@"Connection has failed with user %@", userID);
}

Manage remote video tracks

To show the video streams you receive from your opponents, create QBRTCRemoteVideoView views in your storyboard and then use the following code:

#pragma mark -
#pragma mark QBRTCClientDelegate
// Called when a remote video track is received from an opponent
- (void)session:(QBRTCSession *)session receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack fromUser:(NSNumber *)userID {
    // we suppose you have created a UIView and set its class to QBRTCRemoteVideoView
    // we also suggest setting the view's content mode to UIViewContentModeScaleAspectFit or
    // UIViewContentModeScaleAspectFill
    [self.opponentVideoView setVideoTrack:videoTrack];
}

Manage local video track

To show the local video track from the camera, create a UIView in your storyboard and then use the following code:

// your view controller interface code
@interface CallController()

@property (weak, nonatomic) IBOutlet UIView *localVideoView; // your video view to render local camera video stream
@property (strong, nonatomic) QBRTCCameraCapture *videoCapture;

@end

@implementation CallController

- (void)viewDidLoad {
    [super viewDidLoad];
    [QBRTCClient.instance addDelegate:self];
    QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
    videoFormat.frameRate = 30;
    videoFormat.pixelFormat = QBRTCPixelFormat420f;
    videoFormat.width = 640;
    videoFormat.height = 480;
    // QBRTCCameraCapture class is used to capture frames using AVFoundation APIs
    self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront]; // or AVCaptureDevicePositionBack
    self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
    [self.videoCapture startSession];
    [self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];
    // start call
}

@end

Hang up

To hang up a call:

NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session hangUp:userInfo];

After this your opponents will receive a hangUp signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session hungUpByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    // for example: update the GUI,
    // or hang up when the initiator has ended the call
    if ([session.initiatorID isEqualToNumber:userID]) {
        [session hangUp:@{}];
    }
}

After that, once all opponents are inactive, QBRTCClient delegates will be notified:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)sessionDidClose:(QBRTCSession *)session {
    // release session instance
    self.session = nil;
}

Disable / enable audio stream

You can disable / enable the audio stream during a call:

self.session.localMediaStream.audioTrack.enabled = !self.session.localMediaStream.audioTrack.isEnabled;

Please note: due to WebRTC restrictions, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:

self.session.localMediaStream.videoTrack.enabled = !self.session.localMediaStream.videoTrack.isEnabled;

Please note: due to WebRTC restrictions, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (Default: front camera):

The videoCapture below is the QBRTCCameraCapture described in CallController above.

// to set the default (preferred) camera position
- (void)viewDidLoad {
    [super viewDidLoad];
    QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
    videoFormat.frameRate = 30;
    videoFormat.pixelFormat = QBRTCPixelFormat420f;
    videoFormat.width = 640;
    videoFormat.height = 480;
    self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront]; // or AVCaptureDevicePositionBack
}

// to change it some time later, for example, at the moment of a call
AVCaptureDevicePosition position = [self.videoCapture currentPosition];
AVCaptureDevicePosition newPosition = position == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
// check whether videoCapture has a camera for the new position
// for example, some iPods do not have a front camera
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    [self.videoCapture selectCameraPosition:newPosition];
}

Sound router

// Save the current audio configuration before starting or accepting a call
[QBRTCSoundRouter.instance initialize];
// Set receiver (earpiece)
QBRTCSoundRouter.instance.currentSoundRoute = QBRTCSoundRouteReceiver;
// or set speaker
QBRTCSoundRouter.instance.currentSoundRoute = QBRTCSoundRouteSpeaker;
// deinitialize after the session closes
[QBRTCSoundRouter.instance deinitialize];

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state.

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.
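For reference, the Info.plist fragment described above looks like this (key and value names are from Apple's documentation; note that only audio is added, not voip):

```xml
<!-- Info.plist: declare the audio background mode only; do NOT add "voip" -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```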


When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session — in this case, your app.


Screen sharing

We are happy to introduce a new feature of the QuickbloxWebRTC SDK — Screen sharing.


Screen sharing allows you to share information from your application to all of your opponents.

It gives you an ability to promote your product, share a screen with formulas to students, distribute podcasts, share video/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the ability to create custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

@interface QBRTCScreenCapture()

@property (nonatomic, weak) UIView *view; // screenshots are formed by grabbing content of this view
@property (strong, nonatomic) CADisplayLink *displayLink;

@end

@implementation QBRTCScreenCapture

- (instancetype)initWithView:(UIView *)view {
    self = [super init]; // super inits serial videoQueue
    if (self) {
        _view = view;
    }
    return self;
}

// Grab content of the view and return the formed screenshot
- (UIImage *)screenshot {
    UIGraphicsBeginImageContextWithOptions(_view.frame.size, NO, 1);
    [_view drawViewHierarchyInRect:_view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

// QBRTCVideoSource calls this method of our video capture when set
- (void)didSetToVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didSetToVideoTrack:videoTrack];
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(sendPixelBuffer:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    self.displayLink.frameInterval = 12; // 5 fps
}

- (void)sendPixelBuffer:(CADisplayLink *)sender {
    // Convert to unix nanoseconds
    int64_t timeStamp = sender.timestamp * NSEC_PER_SEC;
    dispatch_async(self.videoQueue, ^{
        @autoreleasepool {
            UIImage *image = [self screenshot];
            int w = image.size.width;
            int h = image.size.height;
            NSDictionary *options = @{
                                      (NSString *)kCVPixelBufferCGImageCompatibilityKey : @NO,
                                      (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey : @NO
                                      };
            CVPixelBufferRef pixelBuffer = nil;
            // pixel format chosen to match the BGRA little-endian bitmap info below
            CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, w, h,
                                                  kCVPixelFormatType_32BGRA,
                                                  (__bridge CFDictionaryRef)(options),
                                                  &pixelBuffer);
            if (status == kCVReturnSuccess && pixelBuffer != NULL) {
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                void *pxdata = CVPixelBufferGetBaseAddress(pixelBuffer);
                CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
                uint32_t bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst;
                CGContextRef context =
                CGBitmapContextCreate(pxdata, w, h, 8, w * 4, rgbColorSpace, bitmapInfo);
                CGContextDrawImage(context, CGRectMake(0, 0, w, h), [image CGImage]);
                CGContextRelease(context);
                CGColorSpaceRelease(rgbColorSpace);
                QBRTCVideoFrame *videoFrame = [[QBRTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer];
                videoFrame.timestamp = timeStamp;
                [super sendVideoFrame:videoFrame];
                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                CVPixelBufferRelease(pixelBuffer);
            }
        }
    });
}

@end

To link this capture to your local video track simply use:

//Save previous video capture
self.capture = self.session.localMediaStream.videoTrack.videoCapture;
self.screenCapture = [[QBRTCScreenCapture alloc] initWithView:self.view];
//Switch to sharing
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture; // here videoTrack calls didSetToVideoTrack:
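To stop sharing and return to the camera, restore the capture you saved before switching (a short sketch using the same properties as above):

```objectivec
// switch back from screen sharing to the previously saved camera capture
self.session.localMediaStream.videoTrack.videoCapture = self.capture;
// release the screen capture instance
self.screenCapture = nil;
```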

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

- (void)sendPushToOpponentsAboutNewCall {
    NSString *currentUserLogin = [[[QBSession currentSession] currentUser] login];
    [QBRequest sendPushWithText:[NSString stringWithFormat:@"%@ is calling you", currentUserLogin]
                        toUsers:[self.session.opponentsIDs componentsJoinedByString:@","]
                   successBlock:^(QBResponse * _Nonnull response, NSArray<QBMEvent *> * _Nullable events) {
                       NSLog(@"Push sent!");
                   } errorBlock:^(QBError * _Nullable error) {
                       NSLog(@"Can not send push: %@", error);
                   }];
}

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.
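On the receiving side, a minimal sketch of handling such a push in the app delegate (the standard UIKit callback; the payload contents depend on your push settings and are not specified here):

```objectivec
// AppDelegate.m — minimal sketch of receiving the call push
- (void)application:(UIApplication *)application
didReceiveRemoteNotification:(NSDictionary *)userInfo {
    // the alert text is the string sent via sendPushWithText: above;
    // inspect the payload and present your incoming-call UI as needed
    NSLog(@"Received call push: %@", userInfo);
}
```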

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information do the following:

[QBRTCConfig setStatsReportTimeInterval:5]; // 5 seconds

Classes that adopt the QBRTCClientDelegate protocol will then be notified via:

- (void)session:(QBRTCSession *)session updatedStatsReport:(QBRTCStatsReport *)report forUserID:(NSNumber *)userID {
    double audioReceivedBitrate = report.audioReceivedBitrateTracker.bitrate;
    double videoReceivedBitrate = report.videoReceivedBitrateTracker.bitrate;
    // works even if the mic is disabled (audioTrack.isEnabled == NO)
    NSNumber *audioSendInputLevel = @(report.audioSendInputLevel);
}

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.


You can change various settings for a session.

Set answer time interval

If an opponent does not answer within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

[QBRTCConfig setAnswerTimeInterval:45];

If the user did not respond within the given interval, the following delegate method will be called:

- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID;

Set dialing time interval

Indicates how often call notifications are sent to your opponents.

Default value: 5 seconds

Minimum value: 3 seconds

[QBRTCConfig setDialingTimeInterval:5];

Set disconnect time interval

After a disconnect from an opponent, the SDK starts a timer and waits for the given time interval for the connection to be re-established.

Default value: 30 seconds

[QBRTCConfig setDisconnectTimeInterval:30];

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

[QBRTCConfig setDTLSEnabled:YES];

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia is used, but you can add or set up more, for example servers located in Asia or in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity-checking phase, WebRTC chooses the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only some:

- (NSArray *)quickbloxICE {
    NSString *password = @"baccb97ba2d92d71e26eb9886da5f1e0";
    NSString *userName = @"quickblox";
    NSArray *urls = @[
                      @"",       //USA
                      @"",   //Singapore
                      @""      //Ireland
                      ];
    NSMutableArray *result = [NSMutableArray arrayWithCapacity:urls.count];
    for (NSString *url in urls) {
        // credentials are not needed for STUN
        QBRTCICEServer *stunServer = [QBRTCICEServer serverWithURL:[NSString stringWithFormat:@"stun:%@", url]
                                                          username:@""
                                                          password:@""];
        QBRTCICEServer *turnUDPServer = [QBRTCICEServer serverWithURL:[NSString stringWithFormat:@"turn:%@:3478?transport=udp", url]
                                                             username:userName
                                                             password:password];
        QBRTCICEServer *turnTCPServer = [QBRTCICEServer serverWithURL:[NSString stringWithFormat:@"turn:%@:3478?transport=tcp", url]
                                                             username:userName
                                                             password:password];
        [result addObjectsFromArray:@[stunServer, turnTCPServer, turnUDPServer]];
    }
    return result;
}

[QBRTCConfig setICEServers:[self quickbloxICE]];

Video codecs: VP8 vs VP9 vs H264

H264 is the most preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

VP9 for iOS exists only in development and WebRTC doesn't have a stable version of it yet, so we are waiting for a stable one.

VP8 should be used if you support iOS 7, because iOS 7 does not have H264 hardware support.

H264 is the only video codec for iOS that has hardware support.
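To select the preferred video codec explicitly, you can use the same media stream configuration that the audio codec section below uses; the QBRTCVideoCodec constant names here are an assumption based on that API:

```objectivec
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.videoCodec = QBRTCVideoCodecH264; // or QBRTCVideoCodecVP8; constant names are an assumption
[QBRTCConfig setMediaStreamConfiguration:conf];
```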

Video quality

1. Video quality depends on hardware you use. iPhone 4s will not handle FullHD rendering. But iPhone 6+ will.

2. Video quality depends on network you use and how many connections you have.

For multi-calls set lower video quality. For peer-to-peer calls you can set higher quality.
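For example, a lower-quality capture format for a multi-call could look like this, using the QBRTCVideoFormat API shown earlier (the exact values are illustrative):

```objectivec
// a smaller capture format for group calls; values are illustrative
QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
videoFormat.frameRate = 15;
videoFormat.pixelFormat = QBRTCPixelFormat420f;
videoFormat.width = 320;
videoFormat.height = 240;
self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat
                                                           position:AVCaptureDevicePositionFront];
```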

WebRTC has auto scaling of video resolution and quality to keep network connection active.

To get the best quality and performance you should use H264.

1. If you support iOS 7, WebRTC automatically switches to VP8 even if you set H264.

2. If some Android devices do not support H264, VP8 will be used automatically.

3. If both caller and callee have H264 support, H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome, this codec is used by default for encoding audio streams. It is relatively new (released in 2012) and implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrates: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specifically for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.


iLBC

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011, when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it is strong in such cases.

Supported bitrates: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec

QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodeciLBC;
[QBRTCConfig setMediaStreamConfiguration:conf];

Framework changelog

v2.2 — March 15, 2016

  • WebRTC r 11951
  • Fixed Video capture session restoring after background if H264 was used.
  • Added Bluetooth support in QBRTCSoundRouter.
  • QBRTCSoundRouter has been re-architected.
  • We improved the speed of switching between Speaker/Receiver/Headset/Bluetooth devices.
  • QBRTCSoundRouter automatically switches to a headset or Bluetooth device if available.
  • Headset has the highest priority, then Bluetooth, then speaker/receiver.

To check whether a Bluetooth device is available, use:

[[QBRTCSoundRouter instance] isBluetoothPluggedIn];

v2.1.1 — November 19, 2015

  • WebRTC r 10677
  • Fixed Xcode build warnings
  • Fixed compiler warnings
  • Small bug fixes

v2.1 — November 17, 2015

  • WebRTC r 10677
  • Added experimental WebRTC stats reporting (see QBRTCStatsReport); will be refactored in the next release
  • Fixed compiler warnings
  • Small bug fixes

v2.0 — November 4, 2015

WebRTC r 10505

1. Fixed performance issues on iPhone 4s, improved stability on multi-calls
2. Improved stability at low internet speed connection
3. Added support for H264 hardware video codec on iOS 8+
4. Added custom renderers and custom capture to send your custom frames
5. From this version you are able to configure:

Video:

  • Quality, pixel format, frames per second (FPS) and bandwidth
  • Whether to use the VP8 or H264 video codec

Audio:

  • Quality and bandwidth
  • Whether to use the Opus, iSAC or iLBC audio codec

6. Sample-video-chat re-architecture
7. Removed local video track
8. Added remote video track (see QBRTCRemoteVideoView)
9. Full support of AVCaptureSession
10. Improved performance in rendering the local video track

v1.0.6 — June 17, 2015
  • WebRTC r 9446
  • Support Quickblox.framework v 2.3

v1.0.5 — June 15, 2015
  • Added iOS simulator 64 bit
  • Fixed crash (ISAC for armv7 devices)
  • WebRTC r 9437
  • Added #import <AVFoundation/AVFoundation.h> to QBRTCSession

v1.0.4 — May 20, 2015
  • Removed deprecated methods
  • Updated background mode
  • WebRTC r 9234
  • Added captureSession field to QBRTCSession

v1.0.3 — April 15, 2015
  • Stability improvement
  • WebRTC r 9004
  • Added captureSession field to QBRTCSession
  • Decreased SDK binary size

v1.0.2 — March 17, 2015
  • Stability improvements
  • WebRTC r 8729
  • Added audioCategoryOptions field to QBRTCSession
  • Added skipBlackFrames field to QBGLVideoView (experimental, deprecated since 1.0.6)
  • Fixes for camera switching

v1.0.1 — Feb 27, 2015
  • WebRTC r8442
  • Enable / Disable Datagram Transport Layer Security +[QBRTCConfig setDTLSEnabled:]