Sample-webrtc-ios: WebRTC iOS guide

From QuickBlox Developers (API docs, code samples, SDK)



Sources

Project homepage on GitHub — https://github.com/QuickBlox/quickblox-ios-sdk/tree/master/sample-videochat-webrtc

Download ZIP - https://github.com/QuickBlox/quickblox-ios-sdk/archive/master.zip

Download iOS WebRTC SDK

QuickbloxWebRTC iOS SDK, version 1.0.6


Overview

The VideoChat code sample allows you to easily add video calling and audio calling features into your iOS app. Enable a video call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

(Screenshots: User List, Video Call, Incoming Call)

System requirements

  • The QuickbloxWebRTC.framework supports the following:
    • Quickblox.framework v2.3
    • iPhone 4S+.
    • iPad 2+.
    • iPod Touch 5+.
    • iOS 7+.
    • iOS simulator 32/64 bit
    • Wi-Fi and 4G/LTE connections.

Getting Started

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following commands in Terminal.app:

$ sudo gem install cocoapods
$ pod setup

Step 2: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

$ touch Podfile
$ open -e Podfile

TextEdit should open, showing an empty file. You have just created the Podfile and opened it; now add some content to it.

Copy and paste the following lines into the TextEdit window:

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '7.0'
pod 'Quickblox-WebRTC', '~> 1.0.6'

Step 3: Installing Dependencies

Now you can install the dependencies in your project:

$ pod install

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

$ open ProjectName.xcworkspace

Step 4: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

Add the Framework to your Xcode Project

Step 1: Download & unzip the Framework

QuickbloxWebRTC.framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items into destination group's folder (if needed)" checkbox is checked.


Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework



Step 4: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

Life cycle

// Initialize QuickbloxWebRTC and configure signaling
// You should call this method before any interaction with QuickbloxWebRTC
[QBRTCClient initializeRTC];
 
// Call this method when you finish your work with QuickbloxWebRTC
[QBRTCClient deinitializeRTC];

Settings

You can set presets for a session:

// Set answer time interval
[QBRTCConfig setAnswerTimeInterval:60];
// Set dialing time interval
[QBRTCConfig setDialingTimeInterval:5];
// Set disconnect time interval
[QBRTCConfig setDisconnectTimeInterval:15];
// Enable DTLS (Datagram Transport Layer Security)
[QBRTCConfig setDTLSEnabled:YES];
 
// Set custom ICE servers
NSURL *stunUrl = [NSURL URLWithString:@"stun:turn.quickblox.com"];
QBRTCICEServer *stunServer =
[QBRTCICEServer serverWithURL:stunUrl username:@"quickblox" password:@"baccb97ba2d92d71e26eb9886da5f1e0"];
 
NSURL *turnUDPUrl = [NSURL URLWithString:@"turn:turn.quickblox.com:3478?transport=udp"];
QBRTCICEServer *turnUDPServer =
[QBRTCICEServer serverWithURL:turnUDPUrl username:@"quickblox" password:@"baccb97ba2d92d71e26eb9886da5f1e0"];
 
NSURL *turnTCPUrl = [NSURL URLWithString:@"turn:turn.quickblox.com:3478?transport=tcp"];
QBRTCICEServer *turnTCPServer =
[QBRTCICEServer serverWithURL:turnTCPUrl username:@"quickblox" password:@"baccb97ba2d92d71e26eb9886da5f1e0"];
 
[QBRTCConfig setICEServers:@[stunServer, turnUDPServer, turnTCPServer]];

Call users

To call users just use this method:

[QBRTCClient.instance addDelegate:self]; // self class must conform to QBRTCClientDelegate protocol
 
// 3245, 2123, 3122 - opponents' user IDs
NSArray *opponentsIDs = @[@3245, @2123, @3122];
QBRTCSession *newSession = [QBRTCClient.instance createNewSessionWithOpponents:opponentsIDs
                                                            withConferenceType:QBRTCConferenceTypeVideo];
// userInfo - the custom user information dictionary for the call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[newSession startCall:userInfo];
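The addDelegate: call above assumes the receiving class declares conformance to QBRTCClientDelegate. A minimal interface sketch, where the class name and session property are illustrative, not part of the SDK:

@interface CallViewController : UIViewController <QBRTCClientDelegate>
 
// keep a reference to the active session so later delegate callbacks can use it
@property (strong, nonatomic) QBRTCSession *session;
 
@end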

After this your opponents (users with IDs 3245, 2123 and 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings via QBRTCConfig):

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)didReceiveNewSession:(QBRTCSession *)session userInfo:(NSDictionary *)userInfo {
 
    if (self.session) {
        // we already have a video/audio call session, so we reject the new one
        // rejectInfo - the custom user information dictionary for the reject. May be nil.
        NSDictionary *rejectInfo = @{ @"key" : @"value" };
        [session rejectCall:rejectInfo];
        return;
    }
    self.session = session;
}

self.session refers to the current session. Each audio/video call has a unique sessionID, which allows you to run more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set to 60 seconds:

[QBRTCConfig setAnswerTimeInterval:60];

Accept call

To accept a call request just use this method:

// userInfo - the custom user information dictionary for the accept call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session acceptCall:userInfo];

After this your opponent will receive an accept signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session acceptedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
 
}

Reject call

To reject a call request just use this method:

// userInfo - the custom user information dictionary for the reject call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session rejectCall:userInfo];
 
// and release session instance
self.session = nil;

After this your opponent will receive a reject signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session rejectedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo  {
    NSLog(@"Rejected by user %@", userID);
}

Connection life-cycle

Called when the local media stream has successfully initialized itself and configured its tracks (after the startCall: or acceptCall: method):

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session initializedLocalMediaStream:(QBRTCMediaStream *)mediaStream {
    NSLog(@"Initialized local media stream %@", mediaStream);
}

Called when a connection is initiated with a user:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
 
    NSLog(@"Started connecting to user %@", userID);
}

Called when a connection is closed for a user:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
 
    NSLog(@"Connection is closed for user %@", userID);
}

Called when a connection is established with a user:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
 
    NSLog(@"Connection is established with user %@", userID);
}

Called when a user is disconnected:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    NSLog(@"Disconnected from user %@", userID);
}
 
- (void)session:(QBRTCSession *)session disconnectedByTimeoutFromUser:(NSNumber *)userID {
    // use [QBRTCConfig setDisconnectTimeInterval:value] to set disconnect time interval
    NSLog(@"Disconnected from user %@ by timeout", userID);
}


Called when a user did not respond to your call within the timeout.
Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond to your call within timeout", userID);
}

Called when a connection with a user fails:

#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    NSLog(@"Connection has failed with user %@", userID);
}

Manage remote video tracks

In order to show video views with the streams you receive from your opponents, create QBRTCRemoteVideoView views in your storyboard and then use the following code:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
//Called when a remote video track is received from an opponent
- (void)session:(QBRTCSession *)session receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack fromUser:(NSNumber *)userID {
 
    // we assume you have created a UIView and set its class to QBRTCRemoteVideoView
    // we also suggest setting the view's content mode to UIViewContentModeScaleAspectFit or
    // UIViewContentModeScaleAspectFill
    [self.opponentVideoView setVideoTrack:videoTrack];
}

Available pixel formats

We have implemented support for the following pixel formats:

QBRTCPixelFormat

/**
 *   Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct 
 */
QBRTCPixelFormat420f = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
/**
 *  Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct
 */
QBRTCPixelFormat420v = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
 
/**
 *  32 bit BGRA 
 */
QBRTCPixelFormatBGRA = kCVPixelFormatType_32BGRA,
/**
 *  32 bit ARGB
 */
QBRTCPixelFormatARGB = kCVPixelFormatType_32ARGB,

Manage local video track

In order to show your local video track from the camera, create a UIView in your storyboard and then use the following code:

// your view controller interface code
 
@interface CallController()
@property (weak, nonatomic) IBOutlet UIView *localVideoView; // your video view to render local camera video stream
@property (strong, nonatomic) QBRTCCameraCapture *videoCapture;
@end
 
@implementation CallController
 
- (void)viewDidLoad {
    [super viewDidLoad];
    [QBRTCClient.instance addDelegate:self];
 
    QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
    videoFormat.frameRate = 30;
    videoFormat.pixelFormat = QBRTCPixelFormat420f;
    videoFormat.width = 640;
    videoFormat.height = 480;
 
    // QBRTCCameraCapture class used to capture frames using AVFoundation APIs
    self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront]; // or AVCaptureDevicePositionBack
 
    self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
    [self.videoCapture startSession];
 
    [self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];
    self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
 
   // start call
}
 
...
 
@end

Hang up

To hang up a call:

NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session hangUp:userInfo];

After this your opponents will receive a hangUp signal:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)session:(QBRTCSession *)session hungUpByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    //For example:Update GUI
}

Next, if all opponents are inactive, the QBRTCClient delegates will be notified:

#pragma mark -
#pragma mark QBRTCClientDelegate
 
- (void)sessionDidClose:(QBRTCSession *)session {
 
    // release session instance
    self.session = nil;
}

Disable / enable audio stream

You can disable / enable the audio stream during a call:

self.session.localMediaStream.audioTrack.enabled = !self.session.localMediaStream.audioTrack.isEnabled;

Please note: due to WebRTC restrictions, silence is placed into the stream content when audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:

self.session.localMediaStream.videoTrack.enabled = !self.session.localMediaStream.videoTrack.isEnabled;

Please note: due to WebRTC restrictions, black frames are placed into the stream content when video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera).

The videoCapture property below is the QBRTCCameraCapture instance described in CallController above:

// to set default (preferred) camera position
- (void)viewDidLoad {
    [super viewDidLoad];
    QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
    videoFormat.frameRate = 30;
    videoFormat.pixelFormat = QBRTCPixelFormat420f;
    videoFormat.width = 640;
    videoFormat.height = 480;
 
    self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront]; // or AVCaptureDevicePositionBack
}
 
// to switch the camera later, for example, at the moment of the call
 
AVCaptureDevicePosition position = [self.videoCapture currentPosition];
AVCaptureDevicePosition newPosition = position == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
 
// check whether videoCapture has a camera at the new position
// for example, some iPods do not have a front camera
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    [self.videoCapture selectCameraPosition:newPosition];
}

Sound router

//Save the current audio configuration before starting or accepting a call
[QBRTCSoundRouter.instance initialize];
//Route audio to the receiver (earpiece)
QBRTCSoundRouter.instance.currentSoundRoute = QBRTCSoundRouteReceiver;
//or to the speaker
QBRTCSoundRouter.instance.currentSoundRoute = QBRTCSoundRouteSpeaker;
//deinitialize after the session closes
[QBRTCSoundRouter.instance deinitialize];

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state.

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this array. Do not add voip to this array. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and check the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness we describe both methods, but the results are identical; you only need to use one of them.
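If you edit Info.plist directly, the resulting entry is a one-element array under the UIBackgroundModes key. A sketch of the relevant plist fragment:

```xml
<!-- Info.plist fragment: enables the audio background mode only -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
    <!-- deliberately no "voip" entry, per the note above -->
</array>
```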


When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session — in this case, your app.


Screen sharing

We are happy to introduce a new feature of the QuickbloxWebRTC SDK: screen sharing.


Implementing screen sharing allows you to share information from your application with all of your opponents. It gives you the ability to promote your product, share a screen with formulas to students, distribute podcasts, or share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you can provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

@interface QBRTCScreenCapture()
 
@property (nonatomic, weak) UIView *view; // screenshots are formed by grabbing content of this view
@property (strong, nonatomic) CADisplayLink *displayLink;
 
@end
 
 
@implementation QBRTCScreenCapture
 
- (instancetype)initWithView:(UIView *)view {    
    self = [super init]; // super inits serial videoQueue
    if (self) {
        _view = view;
    }
    return self;
}
 
// Grab content of the view and return formed screenshot
- (UIImage *)screenshot {
 
    UIGraphicsBeginImageContextWithOptions(_view.frame.size, NO, 1);
    [_view drawViewHierarchyInRect:_view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
 
    return image;
}
 
// QBRTCVideoSource calls this method of our video capture when set
- (void)didSetToVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didSetToVideoTrack:videoTrack];
 
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(sendPixelBuffer:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    self.displayLink.frameInterval = 12; //5 fps
}
 
- (void)sendPixelBuffer:(CADisplayLink *)sender {
    //Convert seconds to nanoseconds
    int64_t timeStamp = sender.timestamp * NSEC_PER_SEC;
 
    dispatch_async(self.videoQueue, ^{
 
        @autoreleasepool {
 
            UIImage *image = [self screenshot];
 
            int w = image.size.width;
            int h = image.size.height;
 
            NSDictionary *options = @{
                                      (NSString *)kCVPixelBufferCGImageCompatibilityKey : @NO,
                                      (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey : @NO
                                      };
 
            CVPixelBufferRef pixelBuffer = nil;
 
            // allocate space needed by pixel buffer
            CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                                  w,
                                                  h,
                                                  kCVPixelFormatType_32ARGB,
                                                  (__bridge CFDictionaryRef)(options),
                                                  &pixelBuffer);
 
            if (status != kCVReturnSuccess || pixelBuffer == NULL) {
                return;
            }
 
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            void *pxdata = CVPixelBufferGetBaseAddress(pixelBuffer);
 
            CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
 
            uint32_t bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst;
 
            CGContextRef context =
            CGBitmapContextCreate(pxdata, w, h, 8, w * 4, rgbColorSpace, bitmapInfo);
 
            CGContextDrawImage(context, CGRectMake(0, 0, w, h), [image CGImage]);
            CGColorSpaceRelease(rgbColorSpace);
            CGContextRelease(context);
 
            QBRTCVideoFrame *videoFrame = [[QBRTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer];
            videoFrame.timestamp = timeStamp;
 
            // capture videoFrame and send it to your opponents
            [super sendVideoFrame:videoFrame];
 
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
 
            CVPixelBufferRelease(pixelBuffer);
        }
    });
}

To link this capture to your local video track simply use:

//Save previous video capture
self.capture = self.session.localMediaStream.videoTrack.videoCapture;
self.screenCapture = [[QBRTCScreenCapture alloc] initWithView:self.view];
//Switch to sharing
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture; // here videoTrack calls didSetToVideoTrack:
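To stop sharing and return to the camera, you can restore the capture saved above. A minimal sketch, assuming self.capture still holds the original QBRTCCameraCapture:

// switch the local video track back to the saved camera capture
self.session.localMediaStream.videoTrack.videoCapture = self.capture;
// drop the screen capture instance (the sample's full source shows any additional teardown)
self.screenCapture = nil;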

Framework changelog

v2.0 — November 4, 2015
WebRTC r 10505

1. Fixed performance issues on iPhone 4s
2. Improved stability at low internet speed connection
3. Added support for H264 hardware video codec on iOS
4. Added custom renderers and custom capture to send your custom frames
5. From this version you are able to configure:

Video

  • Quality, pixel format, frames per second (FPS) and bandwidth
  • Choose whether to use VP8 or H264 video codec

Audio

  • Quality and bandwidth
  • Choose Opus, ISAC or iLBC audio codec

6. Sample-video-chat rearchitecture
7. Removed local video track
8. Added remote video track (see QBRTCRemoteVideoView)
9. Full support of AVCaptureSession
10. Improved performance when rendering the local video track


v1.0.6 — June 17, 2015
  • WebRTC r 9446
  • Support Quickblox.framework v 2.3


v1.0.5 — June 15, 2015
  • Added iOS simulator 64 bit
  • Fixed crash (ISAC for armv7 devices)
  • WebRTC r 9437
  • Added #import <AVFoundation/AVFoundation.h> to QBRTCSession


v1.0.4 — May 20, 2015
  • Removed deprecated methods
  • Updated Background mode
  • WebRTC r 9234
  • Added captureSession field to QBRTCSession


v1.0.3 — April 15, 2015
  • Stability improvement
  • WebRTC r 9004
  • Added captureSession field to QBRTCSession
  • Decreased SDK binary size


v1.0.2 — March 17, 2015
  • Stability improvement
  • WebRTC r 8729
  • Added audioCategoryOptions field to QBRTCSession
  • Added skipBlackFrames field to QBGLVideoView (experimental, deprecated since 1.0.6)
  • Fixes for camera switching


v1.0.1 — Feb 27, 2015
  • WebRTC r8442
  • Enable / Disable Datagram Transport Layer Security +[QBRTCConfig setDTLSEnabled:]