Implementation of RTCDataChannel of WebRTC in iOS

I am able to send data through RTCDataChannel. Before sending the offer, I created the RTCDataChannelInit with the configuration below.

RTCDataChannelInit *datainit = [[RTCDataChannelInit alloc] init];
datainit.isNegotiated = YES;
datainit.isOrdered = YES;
datainit.maxRetransmits = 30;
datainit.maxRetransmitTimeMs = 30000;
datainit.streamId = 1;
self.dataChannel = [_peerConnection createDataChannelWithLabel:@"commands" config:datainit];
self.dataChannel.delegate = self;
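
Note that with isNegotiated set to YES the channel is negotiated out of band: WebRTC will not announce it to the remote side (no didOpenDataChannel callback), so the answering peer has to create its own channel with the same streamId. A minimal sketch of the remote side, reusing the same calls as above:

// On the answering peer: recreate the negotiated channel with the
// same streamId (1) and label, since no in-band open message is sent.
RTCDataChannelInit *remoteInit = [[RTCDataChannelInit alloc] init];
remoteInit.isNegotiated = YES;
remoteInit.isOrdered = YES;
remoteInit.streamId = 1;  // must match the offerer's streamId
self.dataChannel = [_peerConnection createDataChannelWithLabel:@"commands" config:remoteInit];
self.dataChannel.delegate = self;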

Once both devices are connected, I check the state in the delegate callback. The state of the channel is open.

- (void)channelDidChangeState:(RTCDataChannel *)channel
{
    NSLog(@"channel.state %u", channel.state);
}
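
To receive the data on the other device, the same RTCDataChannelDelegate also carries a message callback. A sketch, assuming the libjingle Objective-C wrapper used above, where incoming messages arrive as an RTCDataBuffer:

// Called when the remote peer sends data over this channel.
- (void)channel:(RTCDataChannel *)channel didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer
{
    if (!buffer.isBinary) {
        NSString *message = [[NSString alloc] initWithData:buffer.data
                                                  encoding:NSUTF8StringEncoding];
        NSLog(@"received: %@", message);
    }
}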

Then I send the data with the code below:

RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:[str dataUsingEncoding:NSUTF8StringEncoding] isBinary:NO];
BOOL x = [self.dataChannel sendData:buffer];
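
sendData: returns NO when the message cannot be queued, for example when the channel is not yet open, so it is worth guarding the send with a state check. A small sketch (kRTCDataChannelStateOpen is the open value in the old wrapper's RTCDataChannelState enum; check your header if it is named differently):

// Only attempt the send once the channel has reached the open state.
if (self.dataChannel.state == kRTCDataChannelStateOpen) {
    if (![self.dataChannel sendData:buffer]) {
        NSLog(@"sendData: failed, message was dropped");
    }
}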

The configuration I used was given here:
https://groups.google.com/forum/#!searchin/discuss-webrtc/RTCDataChannel/discuss-webrtc/9NObqxnItCg/mRvXBIwkA7wJ

Building iOS Native App using WebRTC

I am not an expert in WebRTC, but I will try to answer some of your questions.

1. ICE servers -- NATs and firewalls pose a significant problem for setting up IP endpoints, so the IETF standards STUN, TURN and ICE were developed to address NAT traversal.
STUN helps IP endpoints:

  • discover whether they are behind a NAT/firewall, and if so,
  • determine the public IP address and the type of the firewall. STUN then uses this information to assist in establishing peer-to-peer IP connectivity.

TURN, which stands for Traversal Using Relays around NAT, provides a fallback NAT traversal technique that uses a media relay server to carry media between endpoints.

ICE is a framework that leverages both STUN and TURN to provide reliable IP setup and media transport, using an offer/answer model (as in SIP) in which endpoints exchange multiple candidate IP addresses and ports (such as private addresses and TURN server addresses).
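
As a concrete illustration, in the older libjingle Objective-C wrapper used elsewhere in this post, the STUN/TURN servers are handed to the factory when the peer connection is created. A sketch; the TURN URL and credentials are placeholders for your own servers:

RTCICEServer *stun =
    [[RTCICEServer alloc] initWithURI:[NSURL URLWithString:@"stun:stun.l.google.com:19302"]
                             username:@""
                             password:@""];
// Placeholder TURN server; substitute your own relay and credentials.
RTCICEServer *turn =
    [[RTCICEServer alloc] initWithURI:[NSURL URLWithString:@"turn:turn.example.com:3478"]
                             username:@"user"
                             password:@"secret"];
RTCPeerConnectionFactory *factory = [[RTCPeerConnectionFactory alloc] init];
RTCMediaConstraints *constraints =
    [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil optionalConstraints:nil];
RTCPeerConnection *peerConnection =
    [factory peerConnectionWithICEServers:@[ stun, turn ]
                              constraints:constraints
                                 delegate:self];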

2. Signaling is the process of coordinating communication. You have to implement the signaling part yourself, according to your needs (for example, if you already have a SIP infrastructure in place, you will have to implement SIP signaling). For a WebRTC application to set up a 'call', its clients need to exchange information:

  • Session control messages used to open or close communication.
  • Error messages.
  • Media metadata such as codecs and codec settings, bandwidth and media types.
  • Key data, used to establish secure connections.
  • Network data, such as a host's IP address and port as seen by the outside world.
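
WebRTC deliberately leaves the transport for these messages up to you (WebSocket, XMPP, SIP, plain HTTP polling, ...). Purely as an illustration, a hypothetical helper that packs a local session description into JSON before handing it to your transport; sendSignalingMessage: is a placeholder for your own code, not a WebRTC API:

// Hypothetical helper: serialize a session description for signaling.
- (void)sendLocalDescription:(RTCSessionDescription *)sdp
{
    NSDictionary *payload = @{ @"type" : sdp.type,         // "offer" or "answer"
                               @"sdp"  : sdp.description };  // the SDP body
    NSData *json = [NSJSONSerialization dataWithJSONObject:payload
                                                   options:0
                                                     error:nil];
    [self sendSignalingMessage:json];  // your WebSocket/SIP/HTTP transport
}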


3. Steps

For the offerer:

  • first create the peer connection and pass the ICE servers to it as a
    parameter.

  • set event handlers for two events:

    • onicecandidate -- returns locally generated ICE candidates so you can pass them to the other peer(s), i.e. the candidates gathered via your STUN/TURN servers; these candidates contain your public IPv4/IPv6 addresses as well as random UDP addresses.
    • onaddstream -- returns the remote stream (your friend's microphone and camera!).

  • addStream attaches your local microphone and camera stream for the other peer.

Now create the SDP offer by calling createOffer, set the result as your local description with setLocalDescription, and send it to the other peer over your signaling channel; when the answer comes back, apply it with setRemoteDescription.
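
A condensed sketch of that offerer flow, assuming the older delegate-based libjingle Objective-C wrapper (newer WebRTC.framework builds use completion-handler variants instead):

// Kick off offer creation; the SDP arrives in the delegate callback below.
[peerConnection createOfferWithDelegate:self constraints:constraints];

// RTCSessionDescriptionDelegate: set the offer as the local description
// and forward it to the other peer over your signaling channel.
- (void)peerConnection:(RTCPeerConnection *)pc
    didCreateSessionDescription:(RTCSessionDescription *)sdp
                          error:(NSError *)error
{
    if (error) { return; }
    [pc setLocalDescriptionWithDelegate:self sessionDescription:sdp];
    [self sendLocalDescription:sdp];  // placeholder signaling transport
}

// When the remote answer arrives over signaling, apply it:
[peerConnection setRemoteDescriptionWithDelegate:self sessionDescription:remoteAnswer];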

For the answerer:

  • setRemoteDescription
  • createAnswer
  • setLocalDescription
  • onicecandidate -- fires for locally generated ICE candidates
  • addIceCandidate -- call this with ICE candidates sent by the other peer
  • onaddstream -- fires when the remote stream is added
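
The mirrored answerer flow in the same wrapper might look like this; remoteOffer, sdpMid, sdpMLineIndex and candidateSdp are assumed to arrive over your signaling channel:

// Apply the received offer, then generate the answer; the result comes
// back through didCreateSessionDescription: exactly as on the offerer side.
[peerConnection setRemoteDescriptionWithDelegate:self sessionDescription:remoteOffer];
[peerConnection createAnswerWithDelegate:self constraints:constraints];

// Feed each ICE candidate received from the other peer into the connection.
RTCICECandidate *candidate = [[RTCICECandidate alloc] initWithMid:sdpMid
                                                            index:sdpMLineIndex
                                                              sdp:candidateSdp];
[peerConnection addICECandidate:candidate];

// Locally gathered candidates surface in the peer connection delegate;
// forward candidate.sdpMid, candidate.sdpMLineIndex and candidate.sdp
// to the other peer through your signaling channel.
- (void)peerConnection:(RTCPeerConnection *)pc
       gotICECandidate:(RTCICECandidate *)candidate
{
    [self sendCandidateMessage:candidate];  // placeholder signaling transport
}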

I hope this will make some of your doubts clear.

How to get frame data in AppRTC iOS app for video modifications?

Just after posting the question I had some luck in finding the sneaky data!

You have to add the following property to the RTCEAGLVideoView.h file:

@property(atomic, strong) RTCI420Frame* i420Frame;

In the original implementation file there is an i420Frame property, but it isn't exposed in the iOS project's header file for the class. Adding the property lets you get the view's current frame.
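
With the property exposed you can sample the view's current frame. A rough sketch, assuming the width/height/yPlane/yPitch accessors from the old libjingle RTCI420Frame header; check the names against your revision:

RTCI420Frame *frame = self.remoteView.i420Frame;  // remoteView: your RTCEAGLVideoView
if (frame) {
    // Y is the full-resolution luma plane; U/V hold quarter-resolution chroma.
    const uint8_t *yPlane = frame.yPlane;
    NSLog(@"frame %lux%lu, yPitch %ld",
          (unsigned long)frame.width,
          (unsigned long)frame.height,
          (long)frame.yPitch);
    if (yPlane) {
        // inspect or copy the luma bytes here
    }
}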

I'm still searching for a more elegant way to get the stream data directly, without having to look into the remoteView contents; I will update the answer once I find it.

One to many webrtc

I would advise against a one-to-many architecture in which a single device has to send its media to all the others. This breaks down very quickly (after as few as 2-3 connected devices).

The reason is that uplinks are usually limited in capacity, and even when they aren't, devices aren't really geared to streaming that much data to many other devices.

To do what you want at "scale", use a server component that routes the media to the other devices. Look at https://jitsi.org/ and http://www.kurento.org/ as starting points.


