Square video using AVFoundation

I have updated the GitHub repository with the code.

These were the errors that I rectified (a sketch of both fixes follows the list):

  • I was making a mistake while creating the URL for the documents directory, so I was not able to use the video.
  • I was not creating the videoComposition properly; there were issues while creating the CMTimeRange.
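
For reference, a minimal sketch of what both fixes look like (Swift 3 style; the file name and variable names here are placeholders, the actual code is in the repo):

    // Build the output URL through FileManager instead of a hand-built path
    let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                                in: .userDomainMask).first!
    let videoURL = documentsURL.appendingPathComponent("output.mp4") // placeholder name

    // Derive the CMTimeRange from the asset itself instead of hard-coding it
    let asset = AVAsset(url: videoURL)
    let fullRange = CMTimeRangeMake(kCMTimeZero, asset.duration)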

Please visit the repository and check for yourself.
Also upvote if the code helped you in any way.
Thanks!!

Repo link - https://github.com/ankit-betterbutter/CustomCamera

Record square video using AVFoundation and add watermark

A few things:

As far as audio goes, you're adding a video (camera) input but no audio input, so add one to get sound:

    let audioInputDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)

    do {
        let input = try AVCaptureDeviceInput(device: audioInputDevice)

        if sourceAVFoundation.captureSession.canAddInput(input) {
            sourceAVFoundation.captureSession.addInput(input)
        } else {
            NSLog("ERROR: Can't add audio input")
        }
    } catch let error {
        NSLog("ERROR: Getting input device: \(error)")
    }

To make the video square, you're going to have to look at using AVAssetWriter instead of AVCaptureFileOutput. This is more complex, but you get more "power". You've already created an AVCaptureSession, which is great; to hook up the AssetWriter, you'll need to do something like this:

    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory: NSURL = urls.first else {
        print("Video Controller: getAssetWriter: documentDir Error")
        return nil
    }

    let local_video_name = NSUUID().UUIDString + ".mp4"
    self.videoOutputURL = documentDirectory.URLByAppendingPathComponent(local_video_name)

    guard let url = self.videoOutputURL else {
        return nil
    }

    self.assetWriter = try? AVAssetWriter(URL: url, fileType: AVFileTypeMPEG4)

    guard let writer = self.assetWriter else {
        return nil
    }

    //TODO: Set your desired video size here!
    let videoSettings: [String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoWidthKey : captureSize.width,
        AVVideoHeightKey : captureSize.height,
        AVVideoCompressionPropertiesKey : [
            AVVideoAverageBitRateKey : 200000,
            AVVideoProfileLevelKey : AVVideoProfileLevelH264Baseline41,
            AVVideoMaxKeyFrameIntervalKey : 90,
        ],
    ]

    assetWriterInputCamera = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    assetWriterInputCamera?.expectsMediaDataInRealTime = true
    writer.addInput(assetWriterInputCamera!)

    let audioSettings: [String : AnyObject] = [
        AVFormatIDKey : NSInteger(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey : 2,
        AVSampleRateKey : NSNumber(double: 44100.0)
    ]

    assetWriterInputAudio = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: audioSettings)
    assetWriterInputAudio?.expectsMediaDataInRealTime = true
    writer.addInput(assetWriterInputAudio!)

Once you have the AssetWriter set up, hook up some outputs for the video and audio:

    let bufferAudioQueue = dispatch_queue_create("audio buffer delegate", DISPATCH_QUEUE_SERIAL)
    let audioOutput = AVCaptureAudioDataOutput()
    audioOutput.setSampleBufferDelegate(self, queue: bufferAudioQueue)
    captureSession.addOutput(audioOutput)

    // Always add video last...
    let bufferVideoQueue = dispatch_queue_create("video buffer delegate", DISPATCH_QUEUE_SERIAL)
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.setSampleBufferDelegate(self, queue: bufferVideoQueue)
    captureSession.addOutput(videoOutput)
    if let connection = videoOutput.connectionWithMediaType(AVMediaTypeVideo) {
        if connection.supportsVideoOrientation {
            // Force recording to portrait
            connection.videoOrientation = AVCaptureVideoOrientation.Portrait
        }

        self.outputConnection = connection
    }

    captureSession.startRunning()

Finally, you need to capture the buffers and process them. Make sure your class adopts both AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate:

    //MARK: Implementation for AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate
    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

        if !self.isRecordingStarted {
            return
        }

        if let audio = self.assetWriterInputAudio where connection.audioChannels.count > 0 && audio.readyForMoreMediaData {
            dispatch_async(audioQueue!) {
                audio.appendSampleBuffer(sampleBuffer)
            }
            return
        }

        if let camera = self.assetWriterInputCamera where camera.readyForMoreMediaData {
            dispatch_async(videoQueue!) {
                camera.appendSampleBuffer(sampleBuffer)
            }
        }
    }

There are a few missing bits and pieces, but hopefully this is enough for you to figure it out along with the documentation.
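
For instance, one of those missing pieces is the writer lifecycle: you have to start the session at the first buffer's timestamp and finish it when recording stops. A rough sketch in the same (Swift 2 era) style, where the property names match the code above but the function names are mine:

    // Call once when recording begins, e.g. on the first sample buffer
    func startWriting(firstSampleBuffer: CMSampleBuffer) {
        let startTime = CMSampleBufferGetPresentationTimeStamp(firstSampleBuffer)
        assetWriter?.startWriting()
        assetWriter?.startSessionAtSourceTime(startTime)
        isRecordingStarted = true
    }

    // Call when the user stops recording
    func stopWriting() {
        isRecordingStarted = false
        assetWriterInputCamera?.markAsFinished()
        assetWriterInputAudio?.markAsFinished()
        assetWriter?.finishWritingWithCompletionHandler {
            // self.videoOutputURL now points at the finished .mp4
            NSLog("Finished writing to \(self.videoOutputURL)")
        }
    }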

Finally, if you want to add the watermark, there are many ways this can be done in real time, but one possible way is to modify the sampleBuffer and write the watermark into the image there. You'll find other questions on Stack Overflow dealing with that.
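
As one concrete option (post-processing rather than real-time, so a different technique than modifying the sampleBuffer): if you export through an AVMutableVideoComposition anyway, AVVideoCompositionCoreAnimationTool can composite a CALayer watermark over the video. A sketch, assuming a videoComposition already exists:

    // Overlay a text watermark on an existing AVMutableVideoComposition
    let watermark = CATextLayer()
    watermark.string = "@mywatermark" // hypothetical watermark text
    watermark.fontSize = 24
    watermark.foregroundColor = UIColor.whiteColor().CGColor
    watermark.frame = CGRect(x: 16, y: 16, width: 300, height: 36)

    let renderSize = videoComposition.renderSize
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(x: 0, y: 0, width: renderSize.width, height: renderSize.height)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(watermark)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)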

Square video output in iOS

To rotate the CMSampleBuffer, you should refer to this Apple technote:

https://developer.apple.com/library/ios/qa/qa1744/_index.html

In particular, if you want to physically rotate the video (as opposed to just setting an orientation flag), you can.

For example, in the callback:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection

if you do this:

    [connection setVideoOrientation:AVCaptureVideoOrientationPortraitUpsideDown];

you will get an upside-down video.

To crop the video, you need to use an AVAssetWriterInput, in which you can set the crop using the videoSettings dictionary.

For example:

    NSDictionary *videoSettings = @{
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoWidthKey : @(100),
        AVVideoHeightKey : @(100)
    };

used here:

    self.assetWriterVideoInput = [[AVAssetWriterInput alloc]
        initWithMediaType:AVMediaTypeVideo
           outputSettings:videoSettings];

This will give you a video sized to 100 × 100 px: full width, with the height cropped to square.

Check out AVVideoSettings.h for the full list of keys.
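
One caveat, as far as I understand it: by default the writer scales the source to fit those dimensions, so if you see squashing rather than cropping, AVVideoScalingModeKey (also in AVVideoSettings.h) is the knob to try. A Swift sketch of the same settings:

    // AVVideoScalingModeResizeAspectFill center-crops the source
    // instead of squashing it into 100 x 100
    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: 100,
        AVVideoHeightKey: 100,
        AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill
    ]
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                         outputSettings: videoSettings)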

Crop video with AVFoundation

Notice that in the // rotate to portrait portion of the code they're applying affine transformations. Modify those transformations to align the video the way you want.
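
For example, the translate-plus-rotate pair used in the Swift answer further down can be adapted; a minimal sketch (clipVideoTrack and transformer are assumed to exist) for a centered square crop of footage that needs a 90° rotation to portrait:

    // Shift the frame so the square lands in the render area, then rotate
    let side = clipVideoTrack.naturalSize.height
    let t1 = CGAffineTransform(translationX: side,
                               y: -(clipVideoTrack.naturalSize.width - side) / 2)
    transformer.setTransform(t1.rotated(by: .pi / 2), at: kCMTimeZero)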

How do I use AVFoundation to crop a video

Something like this. 99% of this code just sets things up to apply a custom CGAffineTransform and then save out the result.

I'm assuming that you want the cropped video to take up the full size/width of the output, so that e.g. a scale affine transform is the correct solution (you zoom in on the video, giving the effect of having cropped and resized).

    AVAsset* asset = // your input

    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    AVMutableVideoComposition* videoComposition = [[AVMutableVideoComposition videoComposition] retain];
    videoComposition.renderSize = CGSizeMake(320, 240);
    videoComposition.frameDuration = CMTimeMake(1, 30);

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));

    AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
    CGAffineTransform finalTransform = // setup a transform that grows the video, effectively causing a crop
    [transformer setTransform:finalTransform atTime:kCMTimeZero];
    instruction.layerInstructions = [NSArray arrayWithObject:transformer];
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
    exporter.videoComposition = videoComposition;
    exporter.outputURL = url3; // your output URL
    exporter.outputFileType = AVFileTypeQuickTimeMovie;

    [exporter exportAsynchronouslyWithCompletionHandler:^(void){}];
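
As one concrete possibility for that elided finalTransform (a Swift sketch for brevity; the 2x zoom factor is an arbitrary assumption):

    // Zoom in 2x around the center so the 320 x 240 render size
    // only sees the middle of the video, cropping away the borders
    let scale: CGFloat = 2.0
    let naturalSize = clipVideoTrack.naturalSize
    let finalTransform = CGAffineTransform(
        translationX: -(naturalSize.width * scale - 320) / 2,
        y: -(naturalSize.height * scale - 240) / 2
    ).scaledBy(x: scale, y: scale)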

Record crop video square aspect ratio AVCaptureSession

You can record square video using this demo code: https://github.com/DarshanRlogical/DKCustomCamera

OR

You can crop a video using the method below:

    func manageCroppingToSquare(filePath: URL, completion: @escaping (_ outputURL: URL?) -> ()) {

        // output file
        let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first
        let outputPath = documentsURL?.appendingPathComponent("squareVideo.mov")
        if FileManager.default.fileExists(atPath: (outputPath?.path)!) {
            do {
                try FileManager.default.removeItem(atPath: (outputPath?.path)!)
            } catch {
                print("Error deleting file")
            }
        }

        // input file
        let asset = AVAsset(url: filePath)
        print(asset)
        let composition = AVMutableComposition()
        composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)

        // input clip
        let clipVideoTrack = asset.tracks(withMediaType: AVMediaTypeVideo)[0]

        // make it square
        let videoComposition = AVMutableVideoComposition()
        videoComposition.renderSize = CGSize(width: CGFloat(clipVideoTrack.naturalSize.height), height: CGFloat(clipVideoTrack.naturalSize.height))
        videoComposition.frameDuration = CMTimeMake(1, 30)
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30))

        // rotate to portrait
        let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: clipVideoTrack)
        let t1 = CGAffineTransform(translationX: clipVideoTrack.naturalSize.height, y: -(clipVideoTrack.naturalSize.width - clipVideoTrack.naturalSize.height) / 2)
        let t2: CGAffineTransform = t1.rotated(by: .pi / 2)
        let finalTransform: CGAffineTransform = t2
        transformer.setTransform(finalTransform, at: kCMTimeZero)
        instruction.layerInstructions = [transformer]
        videoComposition.instructions = [instruction]

        // exporter
        let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality)
        exporter?.outputFileType = AVFileTypeQuickTimeMovie
        exporter?.outputURL = outputPath
        exporter?.videoComposition = videoComposition

        exporter?.exportAsynchronously {
            if exporter?.status == .completed {
                print("Export complete")
                DispatchQueue.main.async {
                    completion(outputPath)
                }
                return
            } else if exporter?.status == .failed {
                print("Export failed - \(String(describing: exporter?.error))")
            }
            completion(nil)
            return
        }
    }

I hope this code will work for you.

Crop Video to Square in iOS [Swift 3]

You're not actually setting the video composition on the exporter.

So try

    exportSession.videoComposition = videoComposition

before starting the export.
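
In context, a minimal Swift 3 sketch of the corrected export sequence (asset, outputURL, and videoComposition stand in for the question's existing values):

    let exportSession = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality)!
    exportSession.outputURL = outputURL
    exportSession.outputFileType = AVFileTypeQuickTimeMovie
    exportSession.videoComposition = videoComposition // the missing line
    exportSession.exportAsynchronously {
        print("Export finished with status: \(exportSession.status.rawValue)")
    }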

Square cropping and fixing the video orientation in iOS

I suppose the source code comes from this link (project code included):

http://www.one-dreamer.com/cropping-video-square-like-vine-instagram-xcode/

First, you need to know the REAL video orientation:

    - (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
    {
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        CGSize size = [videoTrack naturalSize];
        CGAffineTransform txf = [videoTrack preferredTransform];

        if (size.width == txf.tx && size.height == txf.ty)
            return UIImageOrientationLeft; //return UIInterfaceOrientationLandscapeLeft;
        else if (txf.tx == 0 && txf.ty == 0)
            return UIImageOrientationRight; //return UIInterfaceOrientationLandscapeRight;
        else if (txf.tx == 0 && txf.ty == size.width)
            return UIImageOrientationDown; //return UIInterfaceOrientationPortraitUpsideDown;
        else
            return UIImageOrientationUp; //return UIInterfaceOrientationPortrait;
    }

I wrote that function so that it returns the right orientation, as if the asset were an image.

Then I modified the cropping function to fix the orientation, supporting any crop region, not just a square, like this:

    // apply the crop to the passed video asset (pass a nil outputUrl to skip exporting to disk). Returns the export session object
    - (AVAssetExportSession*)applyCropToVideoWithAsset:(AVAsset*)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL*)outputUrl ExistingExportSession:(AVAssetExportSession*)exporter WithCompletion:(void(^)(BOOL success, NSError* error, NSURL* videoUrl))completion
    {
        //create an AVAssetTrack with our asset
        AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        //create a video composition and preset some settings
        AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
        videoComposition.frameDuration = CMTimeMake(1, 30);

        CGFloat cropOffX = cropRect.origin.x;
        CGFloat cropOffY = cropRect.origin.y;
        CGFloat cropWidth = cropRect.size.width;
        CGFloat cropHeight = cropRect.size.height;

        videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);

        //create a video instruction
        AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instruction.timeRange = cropTimeRange;

        AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

        UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];

        CGAffineTransform t1 = CGAffineTransformIdentity;
        CGAffineTransform t2 = CGAffineTransformIdentity;

        switch (videoOrientation) {
            case UIImageOrientationUp:
                t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY);
                t2 = CGAffineTransformRotate(t1, M_PI_2);
                break;
            case UIImageOrientationDown:
                t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY); // note: naturalSize.width is the real height when upside down
                t2 = CGAffineTransformRotate(t1, -M_PI_2);
                break;
            case UIImageOrientationRight:
                t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY);
                t2 = CGAffineTransformRotate(t1, 0);
                break;
            case UIImageOrientationLeft:
                t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY);
                t2 = CGAffineTransformRotate(t1, M_PI);
                break;
            default:
                NSLog(@"no supported orientation has been found in this video");
                break;
        }

        CGAffineTransform finalTransform = t2;
        [transformer setTransform:finalTransform atTime:kCMTimeZero];

        //add the transformer layer instructions, then add to video composition
        instruction.layerInstructions = [NSArray arrayWithObject:transformer];
        videoComposition.instructions = [NSArray arrayWithObject:instruction];

        //Remove any previous videos at that path
        [[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil];

        if (!exporter) {
            exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        }

        // assign all instructions for the video processing (in this case the transformation for cropping the video)
        exporter.videoComposition = videoComposition;
        //exporter.outputFileType = AVFileTypeQuickTimeMovie;

        if (outputUrl) {
            exporter.outputURL = outputUrl;
            [exporter exportAsynchronouslyWithCompletionHandler:^{

                switch ([exporter status]) {
                    case AVAssetExportSessionStatusFailed:
                        NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]);
                        if (completion) {
                            dispatch_async(dispatch_get_main_queue(), ^{
                                completion(NO, [exporter error], nil);
                            });
                            return;
                        }
                        break;
                    case AVAssetExportSessionStatusCancelled:
                        NSLog(@"crop Export canceled");
                        if (completion) {
                            dispatch_async(dispatch_get_main_queue(), ^{
                                completion(NO, nil, nil);
                            });
                            return;
                        }
                        break;
                    default:
                        break;
                }

                if (completion) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        completion(YES, nil, outputUrl);
                    });
                }
            }];
        }

        return exporter;
    }

Tested with all recorded video orientations (Up, Down, Landscape R, Landscape L), with both the rear and front cameras. I tested it on an iPhone 5S (iOS 8.1) and an iPhone 6 Plus (iOS 8.1).

Hope it helps


