How to Record Video from the Front and Back Camera at the Same Time in iOS

Capture video simultaneously from both the front and rear cameras

I'm fairly sure that this is not possible in the publicly available APIs.

You are correct in your assertion that UIImagePickerController only allows a single camera to be specified.
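
To illustrate that limitation, here's a minimal sketch using the stock UIImagePickerController API (the presentVideoPicker(from:) wrapper is just for illustration):

import UIKit
import MobileCoreServices

// UIImagePickerController exposes a single cameraDevice property,
// so a picker records from the front *or* the rear camera, never both at once.
func presentVideoPicker(from viewController: UIViewController) {
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }

    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.mediaTypes = [kUTTypeMovie as String]  // capture video, not stills
    picker.cameraCaptureMode = .video
    picker.cameraDevice = .rear                   // or .front, but only one of them

    viewController.present(picker, animated: true)
}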

FaceTime, which can do the PiP you're looking for, is a low-level part of the iPhone and, being crafted by Apple, probably has a massive amount of optimisation and low-level code to make this happen.

EDIT: After the comment from Nestor, I had a more detailed look into how FaceTime works (not having an iPhone myself). Nestor is correct: it doesn't show streams from both of your cameras, but rather the other phone's video with a PiP of your own camera.

Switch between front and back camera while recording a video

Since the answer that I think will help with your question is in Objective-C and you would prefer Swift, I've "translated" all of that code below.

Be warned, I did not compile this, and I know several things won't compile as-is. Enum values like AVMediaTypeVideo are usually just .video in Swift. Also, I'm pretty sure that answer has some incorrect code, mainly around setting the isFrontRecording and isBackRecording booleans back to false. I think those resets should happen within the completionHandler (see the sketch after the Swift translation below), but as mentioned I did not compile this, so take that with a grain of salt. I included all the code from that answer (the Objective-C) along with my quick-and-dirty translation to Swift.

I hope this helps, though :)

Objective-C:

/* Back camera settings */
@property bool isFrontRecording;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInputBack;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputBack;
@property (strong, nonatomic) AVCaptureSession *sessionBack;

/* Front camera settings */
@property bool isBackRecording;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInputFront;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputFront;
@property (strong, nonatomic) AVCaptureSession *sessionFront;

Swift:

var isFrontRecording = false
var videoInputBack: AVCaptureDeviceInput!
var imageOutputBack: AVCaptureStillImageOutput!
var sessionBack: AVCaptureSession!

var isBackRecording = false
var videoInputFront: AVCaptureDeviceInput!
var imageOutputFront: AVCaptureStillImageOutput!
var sessionFront: AVCaptureSession!

// Image views the completion handlers write into (assumed to be storyboard outlets).
@IBOutlet weak var imageView: UIImageView!
@IBOutlet weak var imageViewBack: UIImageView!

Objective-C:

- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupBackAVCapture];

    self.isFrontRecording = NO;
    self.isBackRecording = NO;
}

- (void)setupBackAVCapture
{
    NSError *error = nil;

    self.sessionBack = [[AVCaptureSession alloc] init];
    self.sessionBack.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    self.videoInputBack = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    [self.sessionBack addInput:self.videoInputBack];

    self.imageOutputBack = [[AVCaptureStillImageOutput alloc] init];
    [self.sessionBack addOutput:self.imageOutputBack];
}

Swift:

override func viewDidLoad() {
    super.viewDidLoad()
    setupBackAVCapture()

    isFrontRecording = false
    isBackRecording = false
}

func setupBackAVCapture() {
    sessionBack = AVCaptureSession()
    sessionBack.sessionPreset = AVCaptureSessionPresetPhoto

    guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
        let input = try? AVCaptureDeviceInput(device: camera) else {
            return
    }
    videoInputBack = input
    sessionBack.addInput(videoInputBack)

    imageOutputBack = AVCaptureStillImageOutput()
    sessionBack.addOutput(imageOutputBack)
}

Objective-C:

- (IBAction)buttonCapture:(id)sender {
    [self takeBackPhoto];
}

- (void)takeBackPhoto
{
    [self.sessionBack startRunning];
    if (!self.isFrontRecording) {

        self.isFrontRecording = YES;

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
        AVCaptureConnection *videoConnection = [self.imageOutputBack connectionWithMediaType:AVMediaTypeVideo];

        if (videoConnection == nil) {
            return;
        }

        [self.imageOutputBack
         captureStillImageAsynchronouslyFromConnection:videoConnection
         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

             if (imageDataSampleBuffer == NULL) {
                 return;
             }

             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

             UIImage *image = [[UIImage alloc] initWithData:imageData];

             UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);

             [self.imageView setImage:image];

             [self.sessionBack stopRunning];

             // Set up front camera setting and capture photo.
             [self setupFrontAVCapture];
             [self takeFrontPhoto];

         }];

        self.isFrontRecording = NO;
    }
}

Swift:

@IBAction func buttonCapture(_ sender: Any) {
    takeBackPhoto()
}

func takeBackPhoto() {
    sessionBack.startRunning()
    if !isFrontRecording {
        isFrontRecording = true

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

        guard let videoConnection = imageOutputBack.connection(withMediaType: AVMediaTypeVideo) else {
            return
        }

        imageOutputBack.captureStillImageAsynchronously(from: videoConnection) { imageDataSampleBuffer, error in
            guard let imageDataSampleBuffer = imageDataSampleBuffer,
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
                let image = UIImage(data: imageData) else {
                    return
            }

            UIImageWriteToSavedPhotosAlbum(image, self, nil, nil)
            self.imageView.image = image
            self.sessionBack.stopRunning()

            // Set up the front camera and capture a photo.
            self.setupFrontAVCapture()
            self.takeFrontPhoto()
        }

        isFrontRecording = false
    }
}

Objective-C:

- (void)setupFrontAVCapture
{
    NSError *error = nil;

    self.sessionFront = [[AVCaptureSession alloc] init];
    self.sessionFront.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    camera = [self cameraWithPosition:AVCaptureDevicePositionFront];

    self.videoInputFront = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    [self.sessionFront addInput:self.videoInputFront];

    self.imageOutputFront = [[AVCaptureStillImageOutput alloc] init];
    [self.sessionFront addOutput:self.imageOutputFront];
}

- (void)takeFrontPhoto
{
    [self.sessionFront startRunning];
    if (!self.isBackRecording) {

        self.isBackRecording = YES;

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
        AVCaptureConnection *videoConnection = [self.imageOutputFront connectionWithMediaType:AVMediaTypeVideo];

        if (videoConnection == nil) {
            return;
        }

        [self.imageOutputFront
         captureStillImageAsynchronouslyFromConnection:videoConnection
         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

             if (imageDataSampleBuffer == NULL) {
                 return;
             }

             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

             UIImage *image = [[UIImage alloc] initWithData:imageData];

             UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
             [self.imageViewBack setImage:image];

             [self.sessionFront stopRunning];

         }];

        self.isBackRecording = NO;
    }
}

Swift:

func setupFrontAVCapture() {
    sessionFront = AVCaptureSession()
    sessionFront.sessionPreset = AVCaptureSessionPresetPhoto

    // cameraWithPosition(_:) is the helper used in the Objective-C answer
    // to look up the AVCaptureDevice at a given position.
    guard let camera = cameraWithPosition(AVCaptureDevicePosition.front),
        let input = try? AVCaptureDeviceInput(device: camera) else {
            return
    }
    videoInputFront = input
    sessionFront.addInput(videoInputFront)

    imageOutputFront = AVCaptureStillImageOutput()
    sessionFront.addOutput(imageOutputFront)
}

func takeFrontPhoto() {
    sessionFront.startRunning()

    if !isBackRecording {
        isBackRecording = true

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

        guard let videoConnection = imageOutputFront.connection(withMediaType: AVMediaTypeVideo) else {
            return
        }

        imageOutputFront.captureStillImageAsynchronously(from: videoConnection) { imageDataSampleBuffer, error in
            guard let imageDataSampleBuffer = imageDataSampleBuffer,
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
                let image = UIImage(data: imageData) else {
                    return
            }

            UIImageWriteToSavedPhotosAlbum(image, self, nil, nil)
            self.imageViewBack.image = image
            self.sessionFront.stopRunning()
        }

        isBackRecording = false
    }
}
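
And to make the earlier point about the booleans concrete: if the resets are meant to mark the end of a capture, I'd expect them inside the completion handler rather than right after the asynchronous call. A rough, untested sketch of what I mean for the back-camera path:

imageOutputBack.captureStillImageAsynchronously(from: videoConnection) { imageDataSampleBuffer, error in
    // Reset the flag once the asynchronous capture has actually finished,
    // instead of immediately after kicking it off.
    defer { self.isFrontRecording = false }

    guard let imageDataSampleBuffer = imageDataSampleBuffer,
        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
        let image = UIImage(data: imageData) else {
            return
    }

    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil)
    self.imageView.image = image
    self.sessionBack.stopRunning()

    self.setupFrontAVCapture()
    self.takeFrontPhoto()
}
// ...and no isFrontRecording = false out here.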

Good luck getting the switching to work for your project!

iPhone 4 AVFoundation: Capture from front and rear cameras simultaneously

Answering my own question:

  1. This is not possible.
  2. Switching between the front and rear camera to emulate similar behavior is too slow
    (takes about 500 ms per switch, according to my tests).

Source: https://devforums.apple.com/message/369748#369748
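
For context on point 2, the switch is usually done by reconfiguring a single AVCaptureSession between beginConfiguration and commitConfiguration. A minimal sketch of how one might measure that cost (session, backInput, and frontInput are assumed to be already-configured objects; this is illustrative, not the code from the linked thread):

import AVFoundation

// Swaps the session's video input and returns roughly how long the
// reconfiguration took, in seconds.
func switchInput(on session: AVCaptureSession,
                 from currentInput: AVCaptureDeviceInput,
                 to newInput: AVCaptureDeviceInput) -> CFAbsoluteTime {
    let start = CFAbsoluteTimeGetCurrent()

    session.beginConfiguration()
    session.removeInput(currentInput)
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    } else {
        // Put the old input back if the new one cannot be attached.
        session.addInput(currentInput)
    }
    session.commitConfiguration()

    return CFAbsoluteTimeGetCurrent() - start
}

Called as switchInput(on: session, from: backInput, to: frontInput), the returned time is the kind of per-switch latency the answer above is referring to.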


