How to Convert CMSampleBuffer to Data in Swift

Why does converting CMSampleBuffer to Data, and Data to UIImage, not work?

Well, I found a way, though it is somewhat crude.

I created the data with

  let data = testImage.pngData()

streamed this data, and converted it back to a UIImage with

  let image = UIImage(data: data)

But to do this, I had to convert the CMSampleBuffer to a UIImage first.
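Putting those steps together, the round trip looks roughly like this (a sketch, assuming the sample buffer carries a video frame; the CIContext rendering may need tuning for your pixel format):

```swift
import CoreMedia
import CoreImage
import UIKit

// Sketch: CMSampleBuffer -> UIImage -> PNG Data, suitable for streaming.
func pngData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    // Encode to PNG so the bytes can be sent as Data.
    return UIImage(cgImage: cgImage).pngData()
}

// On the receiving side, the same bytes decode back into a UIImage.
func image(from data: Data) -> UIImage? {
    UIImage(data: data)
}
```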

How to convert CMSampleBuffer to OpenCV's Mat instance in Swift

First, convert CMSampleBuffer To UIImage.

extension CMSampleBuffer {
    func asUIImage() -> UIImage? {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(self) else {
            return nil
        }
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        return convertToUiImage(ciImage: ciImage)
    }

    func convertToUiImage(ciImage: CIImage) -> UIImage? {
        let context = CIContext(options: nil)
        context.clearCaches()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }
}

Then you can easily convert the UIImage to a Mat, process it (or not), and return a UIImage.

OpenCVWrapper.h file

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

NS_ASSUME_NONNULL_BEGIN

@interface OpenCVWrapper : NSObject

+ (UIImage *)someOperation:(UIImage *)uiImage;

@end

NS_ASSUME_NONNULL_END

OpenCVWrapper.mm file

#import <opencv2/opencv.hpp>
#import <opencv2/imgcodecs/ios.h>
#import "OpenCVWrapper.h"

@implementation OpenCVWrapper

+ (UIImage *)someOperation:(UIImage *)uiImage {
    cv::Mat sourceImage;
    UIImageToMat(uiImage, sourceImage);
    // Do whatever you need with sourceImage here.
    return MatToUIImage(sourceImage);
}

@end
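For completeness, calling the wrapper from Swift (assuming OpenCVWrapper.h is exposed through the project's bridging header) would look something like:

```swift
// Assumes the CMSampleBuffer extension above and the bridging header setup.
guard let uiImage = sampleBuffer.asUIImage() else { return }
let processed = OpenCVWrapper.someOperation(uiImage)
```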

How to create Data from CMSampleBuffer with Swift 4

I just used

       let data = Data(bytes: src_buff!, count: bytesPerRow * height)

instead of

     let nsdata = NSData(bytes: src_buff, length: bytesPerRow * height)

The key was the `!` after `src_buff` in the Data initializer.
Because Xcode was showing errors unrelated to the `!`, I could not see that the force unwrap was needed.
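For context, `src_buff` comes from the pixel buffer's base address. A fuller sketch, including the base-address locking that CoreVideo requires before reading pixel memory (an assumption about the surrounding code, which the answer does not show):

```swift
import CoreMedia
import CoreVideo
import Foundation

// Sketch: copy the raw pixel bytes of a CMSampleBuffer into Data.
// Assumes a single-plane pixel format such as BGRA.
func rawData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let src_buff = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    // Unwrapping the base address with guard makes the `!` from the
    // original answer unnecessary.
    return Data(bytes: src_buff, count: bytesPerRow * height)
}
```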

How to send CMSampleBuffer to WebRTC?

Hi Sam, WebRTC has a function that can process frames into video frames, but it works with CVPixelBuffer. So you first have to convert your CMSampleBuffer to a CVPixelBuffer, and then add those frames to your localVideoSource with an RTCVideoCapturer. I solved a similar problem using AVCaptureVideoDataOutputSampleBufferDelegate, which produces CMSampleBuffer just as ReplayKit does. I hope the code below helps you solve your problem.

private var videoCapturer: RTCVideoCapturer?
private var localVideoSource = RTCClient.factory.videoSource()
private var localVideoTrack: RTCVideoTrack?
private var remoteVideoTrack: RTCVideoTrack?
private var peerConnection: RTCPeerConnection?

public static let factory: RTCPeerConnectionFactory = {
    RTCInitializeSSL()
    let videoEncoderFactory = RTCDefaultVideoEncoderFactory()
    let videoDecoderFactory = RTCDefaultVideoDecoderFactory()
    return RTCPeerConnectionFactory(encoderFactory: videoEncoderFactory, decoderFactory: videoDecoderFactory)
}()

extension RTCClient: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("didOutput: \(sampleBuffer)")

        guard let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)

        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: imageBuffer)
        let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._90, timeStampNs: timeStampNs)
        self.localVideoSource.capturer(videoCapturer!, didCapture: rtcVideoFrame)
    }
}

You also need a configuration like this for the media senders:

func createMediaSenders() {
    let streamId = "stream"

    let videoTrack = self.createVideoTrack()
    self.localVideoTrack = videoTrack
    self.peerConnection!.add(videoTrack, streamIds: [streamId])
    self.remoteVideoTrack = self.peerConnection!.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack
}

private func createVideoTrack() -> RTCVideoTrack {
    // The property declared above is localVideoSource.
    return RTCClient.factory.videoTrack(with: self.localVideoSource, trackId: "video0")
}

