How do I export UIImage array as a movie?
Take a look at AVAssetWriter and the rest of the AVFoundation framework. The writer has an input of type AVAssetWriterInput, which in turn has a method called appendSampleBuffer: that lets you add individual frames to a video stream. Essentially you’ll have to:
1) Wire the writer:
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:somePath]
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];
NSParameterAssert(videoWriter);

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:640], AVVideoWidthKey,
                               [NSNumber numberWithInt:480], AVVideoHeightKey,
                               nil];
AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                      outputSettings:videoSettings] retain]; // remove the retain under ARC
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
2) Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:…]; // use kCMTimeZero if unsure
3) Write some samples:
// Or you can use AVAssetWriterInputPixelBufferAdaptor.
// That lets you feed the writer input data from a CVPixelBuffer
// that’s quite easy to create from a CGImage.
[writerInput appendSampleBuffer:sampleBuffer];
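If you take the adaptor route mentioned in the comment, the per-frame loop can be sketched in Swift roughly as follows. This is illustrative, not part of the original answer: `pixelBuffers` stands in for an array of `CVPixelBuffer`s you have already created (for example with a helper like the `newPixelBufferFromCGImage` method later in this answer), and the 30 fps timescale is an assumption.

```swift
import AVFoundation

// Sketch: feed frames through an AVAssetWriterInputPixelBufferAdaptor.
// `writerInput` is the AVAssetWriterInput configured above.
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: writerInput,
    sourcePixelBufferAttributes: nil)

for (index, buffer) in pixelBuffers.enumerated() {
    // Frame `index` at an assumed 30 fps.
    let time = CMTime(value: CMTimeValue(index), timescale: 30)

    // Crude backpressure; production code should use
    // requestMediaDataWhenReady(on:using:) instead of polling.
    while !writerInput.isReadyForMoreMediaData {
        Thread.sleep(forTimeInterval: 0.01)
    }
    if !adaptor.append(buffer, withPresentationTime: time) {
        print("failed to append frame \(index)")
    }
}
```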
4) Finish the session:
[writerInput markAsFinished];
[videoWriter endSessionAtSourceTime:…]; // optional; you can call finishWriting without specifying an end time
[videoWriter finishWriting]; // deprecated in iOS 6
/*
[videoWriter finishWritingWithCompletionHandler:...]; // iOS 6.0+
*/
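On iOS 6 and later, the asynchronous variant looks roughly like this in Swift (a sketch; the variable names mirror the Objective-C above):

```swift
import AVFoundation

writerInput.markAsFinished()
videoWriter.finishWriting {
    // Called on an arbitrary queue once the file is fully written.
    if videoWriter.status == .completed {
        print("movie written to \(videoWriter.outputURL)")
    } else {
        print("writing failed: \(String(describing: videoWriter.error))")
    }
}
```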
You’ll still have to fill in a lot of blanks, but I think that the only really hard remaining part is getting a pixel buffer from a CGImage:
- (CVPixelBufferRef)newPixelBufferFromCGImage:(CGImageRef)image
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB,
                                          (CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, frameSize.height,
                                                 8, 4 * frameSize.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    CGContextConcatCTM(context, frameTransform);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
frameSize is a CGSize describing your target frame size, and frameTransform is a CGAffineTransform that lets you transform the images when you draw them into frames.
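If you are working in Swift, the same conversion can be sketched as follows. This is a rough port of the Objective-C method above, not a definitive implementation: error handling is minimal, and the size is passed in explicitly rather than read from a `frameSize` ivar.

```swift
import AVFoundation
import CoreGraphics

func pixelBuffer(from image: CGImage, size: CGSize) -> CVPixelBuffer? {
    let options: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(size.width), Int(size.height),
                                     kCVPixelFormatType_32ARGB,
                                     options as CFDictionary, &buffer)
    guard status == kCVReturnSuccess, let pxbuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pxbuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pxbuffer, []) }

    // Note: using the buffer's own bytesPerRow is safer than assuming 4 * width,
    // because CoreVideo may pad rows for alignment.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pxbuffer),
                                  width: Int(size.width), height: Int(size.height),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pxbuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }

    // Draw the CGImage into the buffer; concatenate a CGAffineTransform
    // here first if you need the frameTransform behaviour.
    context.draw(image, in: CGRect(origin: .zero, size: size))
    return pxbuffer
}
```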
create movie from [UIImage], Swift
Constructing a Dictionary literal is straightforward:
import AVFoundation
// Annotate as [String: Any], since the values mix String and Int.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 640,
    AVVideoHeightKey: 480
]
As for everything else, I would encourage you to read through Apple's The Swift Programming Language to establish fundamentals first, rather than relying on SO or tutorials that happen to cover what you want to do. "Teach a man to fish", as they say.
Export UIImage array to video - bluish result
When calling CGBitmapContextCreate you should pass kCGImageAlphaNoneSkipFirst and not kCGImageAlphaNoneSkipLast. This is because your bitmaps are laid out as alpha then colour, not colour then alpha.
The reason this got you a bluish tint is that you were treating the A, R, G bytes as if they were R, G, B. A (alpha) was zero, so your image had no red in it, and removing all red from an image will, in general, give you a bluish-green tint.
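To make the byte-layout mix-up concrete, here is a small self-contained Swift illustration (the pixel values are made up for the example):

```swift
// One opaque red pixel laid out as ARGB, with the alpha byte skipped (A == 0).
let argbPixel: [UInt8] = [0, 255, 0, 0]   // A, R, G, B

// Correct read: skip the first byte, then take R, G, B.
let correct = (r: argbPixel[1], g: argbPixel[2], b: argbPixel[3])  // (255, 0, 0): red

// Buggy read: treat the first three bytes as R, G, B.
let buggy = (r: argbPixel[0], g: argbPixel[1], b: argbPixel[2])    // (0, 255, 0)
// The zero alpha lands in the red channel, so every pixel loses its red;
// that shifted-channel effect is what produced the colour cast.
```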
How do I export images as video in Swift?
After many attempts to make this work, I found out what the problem was: you can't give the video a size smaller than the pictures' size.
Once I did this, everything worked:
let size = CGSize(width: 1920, height: 1280)
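Rather than hard-coding the size, a defensive option is to derive it from the largest picture. This is an illustrative sketch; `images` stands in for your own `[UIImage]` array:

```swift
import UIKit

// Use the largest pixel dimensions so no picture exceeds the video size.
let maxWidth  = images.map { $0.size.width  * $0.scale }.max() ?? 1920
let maxHeight = images.map { $0.size.height * $0.scale }.max() ?? 1280

// H.264 encoders generally expect even dimensions, so round up to a multiple of 2.
let size = CGSize(width:  ceil(maxWidth  / 2) * 2,
                  height: ceil(maxHeight / 2) * 2)
```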