How to Change Orientation for AVCaptureMovieFileOutput in Swift

How to change orientation for AVCaptureMovieFileOutput in Swift

I changed my code and it works for me:

@IBAction func startStopSession(sender: UIBarButtonItem) {
    if movieOutput.recording {
        movieOutput.stopRecording()
    } else {
        print("start recording")

        // Set the connection's orientation just before each recording starts.
        movieOutput.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = returnedOrientation()

        if movieOutput.connectionWithMediaType(AVMediaTypeVideo).supportsVideoStabilization {
            movieOutput.connectionWithMediaType(AVMediaTypeVideo).preferredVideoStabilizationMode = .Cinematic
        }

        let digit = returnFileDigit()
        let path = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).last!.path!.stringByAppendingString("/movie-\(digit).mp4")

        let url = NSURL(fileURLWithPath: path)
        movieOutput.startRecordingToOutputFileURL(url, recordingDelegate: self)
    }
}
func returnedOrientation() -> AVCaptureVideoOrientation {
    var videoOrientation: AVCaptureVideoOrientation!
    let orientation = UIDevice.currentDevice().orientation

    switch orientation {
    case .Portrait:
        videoOrientation = .Portrait
    case .PortraitUpsideDown:
        videoOrientation = .PortraitUpsideDown
    case .LandscapeLeft:
        // Device landscape-left means the camera sees landscape-right, and vice versa.
        videoOrientation = .LandscapeRight
    case .LandscapeRight:
        videoOrientation = .LandscapeLeft
    case .FaceDown, .FaceUp, .Unknown:
        // No usable orientation: fall back to the last one we stored.
        let digit = userDefault.integerForKey("CaptureVideoOrientation")
        videoOrientation = AVCaptureVideoOrientation(rawValue: digit) ?? .Portrait
        return videoOrientation
    }

    // Store the enum's own rawValue so it round-trips through init(rawValue:) above.
    // (Storing arbitrary integers starting at 0 would not, since
    // AVCaptureVideoOrientation's raw values start at 1.)
    userDefault.setInteger(videoOrientation.rawValue, forKey: "CaptureVideoOrientation")
    return videoOrientation
}

How to change AVCaptureMovieFileOutput video orientation during running session?

Try adding this before you start your session:

[_movieFileOutput setRecordsVideoOrientationAndMirroringChanges:YES asMetadataTrackForConnection:movieFileOutputConnection];
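In Swift, the same call looks like this; a minimal sketch, assuming a configured movieFileOutput with a video connection:

if let connection = movieFileOutput.connection(with: .video) {
    // Record orientation/mirroring changes as a timed metadata track
    // instead of baking them into the frames.
    movieFileOutput.setRecordsVideoOrientationAndMirroringChanges(true, asMetadataTrackFor: connection)
}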

The header file documentation for this method makes it sound very much like what you're looking for:

Controls whether or not the movie file output will create a timed metadata track that records samples which
reflect changes made to the given connection's videoOrientation and videoMirrored properties during
recording.

There's more interesting information there, I'd read it all.

However, this method doesn't actually rotate your frames. It uses timed metadata to instruct players to rotate them at playback time, so it's possible that not all players will support this feature. If that's a deal breaker, you can abandon AVCaptureMovieFileOutput in favour of the lower-level AVCaptureVideoDataOutput + AVAssetWriter combination, where your videoOrientation changes actually rotate the frames, resulting in files that play back correctly in any player:

If an AVCaptureVideoDataOutput instance's connection's videoOrientation or videoMirrored properties are set to
non-default values, the output applies the desired mirroring and orientation by physically rotating and or flipping
sample buffers as they pass through it.
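As a rough sketch of that alternative (the captureSession name is an assumption, and the AVAssetWriter plumbing is omitted), you set the orientation on the data output's connection and the buffers come out already rotated:

let videoDataOutput = AVCaptureVideoDataOutput()
if captureSession.canAddOutput(videoDataOutput) {
    captureSession.addOutput(videoDataOutput)
}
if let connection = videoDataOutput.connection(with: .video),
   connection.isVideoOrientationSupported {
    // Unlike AVCaptureMovieFileOutput, a video data output physically rotates
    // and/or flips the sample buffers as they pass through it.
    connection.videoOrientation = .landscapeRight
}
// Feed the rotated buffers to an AVAssetWriterInput from your
// captureOutput(_:didOutput:from:) delegate callback to write the file.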

P.S. I don't think you need the beginConfiguration/commitConfiguration pair if you're only changing one property, as that's for batching multiple modifications into one atomic update.

Video saving in the wrong orientation with AVCaptureSession

It turns out you have to set the connection's orientation on the AVCaptureMovieFileOutput after the output is added to the session.

session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *cam in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo])
{
    if (cam.position == AVCaptureDevicePositionFront)
        device = cam;
}

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}

AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];

NSString *outputpathofmovie = [[documentsDirectoryPath stringByAppendingPathComponent:@"RecordedVideo"] stringByAppendingString:@".mp4"];
outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

[self deleteTempVideos];

[session addInput:input];
[session addInput:audioInput];
[session commitConfiguration];
[session startRunning];

// Add the movie output *before* configuring its connection's orientation.
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:movieFileOutput];

AVCaptureConnection *videoConnection = nil;

for ( AVCaptureConnection *connection in [movieFileOutput connections] )
{
    NSLog(@"%@", connection);
    for ( AVCaptureInputPort *port in [connection inputPorts] )
    {
        NSLog(@"%@", port);
        if ( [[port mediaType] isEqual:AVMediaTypeVideo] )
        {
            videoConnection = connection;
        }
    }
}

if ([videoConnection isVideoOrientationSupported]) // this returned NO when the connection was queried before the output was added
{
    // Note: UIDeviceOrientation and AVCaptureVideoOrientation are distinct
    // enums whose landscape cases are swapped; a direct cast like this only
    // matches for the portrait cases, so map them explicitly in real code.
    [videoConnection setVideoOrientation:(AVCaptureVideoOrientation)[[UIDevice currentDevice] orientation]];
}

[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

How to change video orientation for AVCaptureVideoDataOutput

I ran into the same problem and poked around the AVCamDemo from WWDC. I don't know why (yet), but if you query your videoConnection right after you create all the inputs/outputs/connections, both isVideoOrientationSupported and supportsVideoOrientation return NO.

However, if you query supportsVideoOrientation or isVideoOrientationSupported at some later point (after the GUI is set up, for instance), it will return YES. For instance, I query it right after the user clicks the record button, just before I call [[self movieFileOutput] startRecordingToOutputFileURL...]

Give it a try, works for me.
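As a minimal sketch of that ordering (the movieFileOutput and outputURL names are assumptions), the orientation check lives in the record action rather than in session setup:

@IBAction func recordTapped(_ sender: UIButton) {
    // Query the connection here, after the session and UI are fully set up;
    // querying it right after creating the outputs may report false.
    if let connection = movieFileOutput.connection(with: .video),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }
    movieFileOutput.startRecording(to: outputURL, recordingDelegate: self)
}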

Problems with AVCaptureSession in landscape mode on iPad

Solved it; now it works in all orientations. We need to set this:

_previewLayer.connection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];

where the method is:

- (AVCaptureVideoOrientation)videoOrientationFromCurrentDeviceOrientation {
    switch (self.interfaceOrientation) {
        case UIInterfaceOrientationPortrait:
            return AVCaptureVideoOrientationPortrait;
        case UIInterfaceOrientationLandscapeLeft:
            return AVCaptureVideoOrientationLandscapeLeft;
        case UIInterfaceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeRight;
        case UIInterfaceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        default:
            // UIInterfaceOrientationUnknown and any future cases.
            return AVCaptureVideoOrientationPortrait;
    }
}

Also, on the capture output, we need to set:

AVCaptureConnection *output2VideoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
output2VideoConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];

Correctly set the right picture orientation when shooting a photo

You may set your shot orientation to whatever you like by setting videoOrientation on the video connection of your AVCapturePhotoOutput.

To match it to the current device orientation, you may use UIDevice.current.orientation, manually converted to AVCaptureVideoOrientation.

let photoOutput = AVCapturePhotoOutput()

func takeShot() {

    // set whatever orientation you like
    let myShotOrientation = UIDevice.current.orientation.asCaptureVideoOrientation

    if let photoOutputConnection = self.photoOutput.connection(with: .video) {
        photoOutputConnection.videoOrientation = myShotOrientation
    }

    photoOutput.capturePhoto(...)
}

Conversion from UIDeviceOrientation to AVCaptureVideoOrientation:

extension UIDeviceOrientation {

    /// Maps a device orientation to the corresponding capture orientation.
    var asCaptureVideoOrientation: AVCaptureVideoOrientation {
        switch self {
        // Intentionally swapped, not a mistake: when the device rotates
        // left, the camera's picture rotates right, and vice versa.
        case .landscapeLeft: return .landscapeRight
        case .landscapeRight: return .landscapeLeft
        case .portraitUpsideDown: return .portraitUpsideDown
        default: return .portrait
        }
    }
}

Getting an image from AVCaptureMovieFileOutput without switching outputs

I wasn't able to find a way using only AVCaptureMovieFileOutput; however, you can add an additional photo output and trigger photos without having to switch between the outputs.

I'm short on time at the moment but this should get you going till I can edit with more info.

(See EDIT with full implementation below, and limited force unwrapping)

First off, set up an additional var for a photo output in your view controller:

// declare an additional camera output var
var cameraOutput = AVCapturePhotoOutput()

// do this in your 'setupSession' func where you setup your movie output
cameraOutput.isHighResolutionCaptureEnabled = true
captureSession.addOutput(cameraOutput)

Declare a function to capture your photo using the cameraOutput:

func capturePhoto() {
    // create settings for your photo capture
    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [
        kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
        kCVPixelBufferWidthKey as String: UIScreen.main.bounds.size.width,
        kCVPixelBufferHeightKey as String: UIScreen.main.bounds.size.height
    ] as [String: Any]
    settings.previewPhotoFormat = previewFormat
    cameraOutput.capturePhoto(with: settings, delegate: self)
}

and conform to the AVCapturePhotoCaptureDelegate.

I created a separate class called VideoFeed to manage the video capture session, so this sample is an extension of that class. I'll update with more info on this later.

The loadImage(data: Data) function calls a delegate with the image. You can ignore that call if you put this directly in your view controller, and save or do whatever you like with the generated photo:

extension VideoFeed: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        guard error == nil else {
            print("Photo Error: \(String(describing: error))")
            return
        }

        guard let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let outputData = AVCapturePhotoOutput
                .jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) else {
            print("Oops, unable to create jpeg image")
            return
        }

        print("captured photo...")
        loadImage(data: outputData)
    }

    func loadImage(data: Data) {
        let dataProvider = CGDataProvider(data: data as CFData)
        let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)
        // do whatever you like with the generated image here...
        delegate?.processVideoSnapshot(image)
    }
}

EDIT:

Here's the complete implementation I used in my test project.

First I moved all the AVFoundation-specific code into its own VideoFeed class and created some callbacks to the view controller.

This separates concerns and limits the view controller's responsibilities to:

  • Adding the preview layer to the view
  • Triggering and handling the captured image/screenshot
  • Starting/stopping video file recording.

Here's the ViewController implementation:

ViewController.swift

import UIKit
import AVFoundation

class ViewController: UIViewController, VideoFeedDelegate {

    @IBOutlet var cameraView: UIView!

    var videoFeed: VideoFeed?

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // end session
        videoFeed?.stopSession()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // request camera access
        AVCaptureDevice.requestAccess(for: AVMediaType.video) { [weak self] granted in
            guard granted else {
                // TODO: show UI stating camera cannot be used, update in settings app...
                print("Camera access denied")
                return
            }
            DispatchQueue.main.async {
                if self?.videoFeed == nil {
                    // video access was enabled so setup video feed
                    self?.videoFeed = VideoFeed(delegate: self)
                } else {
                    // video feed already available, restart session...
                    self?.videoFeed?.startSession()
                }
            }
        }
    }

    // MARK: VideoFeedDelegate

    func videoFeedSetup(with layer: AVCaptureVideoPreviewLayer) {

        // set the layer size
        layer.frame = cameraView.layer.bounds

        // add to view
        cameraView.layer.addSublayer(layer)
    }

    func processVideoSnapshot(_ image: UIImage?) {

        // validate
        guard let image = image else {
            return
        }

        // SAVE IMAGE HERE IF DESIRED

        // for now just showing in a lightbox/detail view controller
        let storyboard = UIStoryboard(name: "Main", bundle: Bundle(for: AppDelegate.self))
        let vc = storyboard.instantiateViewController(withIdentifier: "LightboxViewController") as! LightboxViewController
        vc.previewImage = image
        navigationController?.pushViewController(vc, animated: true)
    }

    @IBAction func captureButtonTapped(_ sender: Any) {

        // trigger photo capture from video feed...
        // this will trigger a callback to the function above with the captured image
        videoFeed?.capturePhoto()
    }
}

Here's the full implementation of the VideoFeed class.

Using this approach allows you to reuse the video functionality in other projects more easily without having it tightly coupled to the view controller.

VideoFeed.swift

import UIKit
import AVFoundation

/// Defines callbacks associated with the VideoFeed class. Notifies delegate of significant events.
protocol VideoFeedDelegate: class {

    /// Callback triggered when the preview layer for this class has been created and configured. Conforming objects should set and maintain a strong reference to this layer otherwise it will be set to nil when the calling function finishes execution.
    ///
    /// - Parameter layer: The video preview layer associated with the active captureSession in the VideoFeed class.
    func videoFeedSetup(with layer: AVCaptureVideoPreviewLayer)

    /// Callback triggered when a snapshot of the video feed has been generated.
    ///
    /// - Parameter image: The snapshot image, or nil if image creation failed.
    func processVideoSnapshot(_ image: UIImage?)
}

class VideoFeed: NSObject {

    // MARK: Variables

    /// The capture session to be used in this class.
    var captureSession = AVCaptureSession()

    /// The preview layer associated with this session. This class has a
    /// weak reference to this layer, the delegate (usually a ViewController
    /// instance) should add this layer as a sublayer to its preview UIView.
    /// The delegate will have the strong reference to this preview layer.
    weak var previewLayer: AVCaptureVideoPreviewLayer?

    /// The output that handles saving the video stream to a file.
    var fileOutput: AVCaptureMovieFileOutput?

    /// A reference to the active video input
    var activeInput: AVCaptureDeviceInput?

    /// Output for capturing frame grabs of video feed
    var cameraOutput = AVCapturePhotoOutput()

    /// Delegate to receive callbacks about significant events triggered by this class.
    weak var delegate: VideoFeedDelegate?

    /// The capture connection associated with the fileOutput.
    /// Set when fileOutput is created.
    var connection: AVCaptureConnection?

    // MARK: Public accessors

    /// Public initializer. Accepts a delegate to receive callbacks with the preview layer and any snapshot images.
    ///
    /// - Parameter delegate: A reference to an object conforming to VideoFeedDelegate
    ///   to receive callbacks for significant events in this class.
    init(delegate: VideoFeedDelegate?) {
        self.delegate = delegate
        super.init()
        setupSession()
    }

    /// Public accessor to begin a capture session.
    public func startSession() {
        guard captureSession.isRunning == false else {
            return
        }

        captureSession.startRunning()
    }

    /// Public accessor to end the current capture session.
    public func stopSession() {

        // validate
        guard captureSession.isRunning else {
            return
        }

        // end file recording if the session ends and we're currently recording a video to file
        if let isRecording = fileOutput?.isRecording, isRecording {
            stopRecording()
        }

        captureSession.stopRunning()
    }

    /// Public accessor to begin file recording.
    public func startRecording() {

        guard fileOutput?.isRecording == false else {
            stopRecording()
            return
        }

        configureVideoOrientation()
        disableSmoothAutoFocus()

        guard let url = tempURL() else {
            print("Unable to start file recording, temp url generation failed.")
            return
        }

        fileOutput?.startRecording(to: url, recordingDelegate: self)
    }

    /// Public accessor to end file recording.
    public func stopRecording() {
        guard fileOutput?.isRecording == true else {
            return
        }

        fileOutput?.stopRecording()
    }

    /// Public accessor to trigger snapshot capture of video stream.
    public func capturePhoto() {

        // create settings object
        let settings = AVCapturePhotoSettings()

        // verify that we have a pixel format type available
        guard let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first else {
            print("Unable to configure photo capture settings, 'availablePreviewPhotoPixelFormatTypes' has no available options.")
            return
        }

        let screensize = UIScreen.main.bounds.size

        // setup format configuration dictionary
        let previewFormat: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: screensize.width,
            kCVPixelBufferHeightKey as String: screensize.height
        ]
        settings.previewPhotoFormat = previewFormat

        // trigger photo capture
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // MARK: Setup functions

    /// Handles configuration and setup of the session, inputs, video preview layer and outputs.
    /// If all are setup and configured it starts the session.
    internal func setupSession() {

        captureSession.sessionPreset = AVCaptureSession.Preset.high
        guard setupInputs() else {
            return
        }

        setupOutputs()
        setupVideoLayer()
        startSession()
    }

    /// Sets up capture inputs for this session.
    ///
    /// - Returns: Returns true if inputs are successfully setup, else false.
    internal func setupInputs() -> Bool {

        // only need access to this functionality within this function, so declare as sub-function
        func addInput(input: AVCaptureInput) {
            guard captureSession.canAddInput(input) else {
                return
            }

            captureSession.addInput(input)
        }

        do {
            if let camera = AVCaptureDevice.default(for: AVMediaType.video) {
                let input = try AVCaptureDeviceInput(device: camera)
                addInput(input: input)
                activeInput = input
            }

            // Setup Microphone
            if let microphone = AVCaptureDevice.default(for: AVMediaType.audio) {
                let micInput = try AVCaptureDeviceInput(device: microphone)
                addInput(input: micInput)
            }

            return true
        } catch {
            print("Error setting device video input: \(error)")
            return false
        }
    }

    internal func setupOutputs() {

        // only need access to this functionality within this function, so declare as sub-function
        func addOutput(output: AVCaptureOutput) {
            if captureSession.canAddOutput(output) {
                captureSession.addOutput(output)
            }
        }

        // file output: keep a reference on self, otherwise startRecording()
        // will see a nil fileOutput and silently never record
        let movieOutput = AVCaptureMovieFileOutput()
        addOutput(output: movieOutput)
        fileOutput = movieOutput

        if let connection = movieOutput.connection(with: .video), connection.isVideoStabilizationSupported {
            connection.preferredVideoStabilizationMode = .off
            self.connection = connection
        }

        cameraOutput.isHighResolutionCaptureEnabled = true
        addOutput(output: cameraOutput)
    }

    internal func setupVideoLayer() {
        let layer = AVCaptureVideoPreviewLayer(session: captureSession)
        layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        delegate?.videoFeedSetup(with: layer)
        previewLayer = layer
    }

    // MARK: Helper functions

    /// Creates a url in the temporary directory for file recording.
    ///
    /// - Returns: A file url if successful, else nil.
    internal func tempURL() -> URL? {
        let directory = NSTemporaryDirectory() as NSString

        if directory != "" {
            let path = directory.appendingPathComponent(NSUUID().uuidString + ".mp4")
            return URL(fileURLWithPath: path)
        }

        return nil
    }

    /// Disables smooth autofocus functionality on the active device,
    /// if the active device is set and 'isSmoothAutoFocusSupported'
    /// is supported for the currently set active device.
    internal func disableSmoothAutoFocus() {

        guard let device = activeInput?.device, device.isSmoothAutoFocusSupported else {
            return
        }

        do {
            try device.lockForConfiguration()
            device.isSmoothAutoFocusEnabled = false
            device.unlockForConfiguration()
        } catch {
            print("Error disabling smooth autofocus: \(error)")
        }
    }

    /// Sets the current AVCaptureVideoOrientation on the currently active connection if it's supported.
    internal func configureVideoOrientation() {

        guard let connection = connection, connection.isVideoOrientationSupported,
            let currentOrientation = AVCaptureVideoOrientation(rawValue: UIApplication.shared.statusBarOrientation.rawValue) else {
            return
        }

        connection.videoOrientation = currentOrientation
    }
}

// MARK: AVCapturePhotoCaptureDelegate
extension VideoFeed: AVCapturePhotoCaptureDelegate {

    // iOS 11+ processing
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard error == nil, let outputData = photo.fileDataRepresentation() else {
            print("Photo Error: \(String(describing: error))")
            return
        }

        print("captured photo...")
        loadImage(data: outputData)
    }

    // iOS < 11 processing
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if #available(iOS 11.0, *) {
            // nothing to do here as iOS 11 uses the callback above
        } else {
            guard error == nil else {
                print("Photo Error: \(String(describing: error))")
                return
            }

            guard let sampleBuffer = photoSampleBuffer,
                let previewBuffer = previewPhotoSampleBuffer,
                let outputData = AVCapturePhotoOutput
                    .jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) else {
                print("Image creation from sample buffer/preview buffer failed.")
                return
            }

            print("captured photo...")
            loadImage(data: outputData)
        }
    }

    /// Creates a UIImage from Data object received from AVCapturePhotoOutput
    /// delegate callback and sends to the VideoFeedDelegate for handling.
    ///
    /// - Parameter data: Image data.
    internal func loadImage(data: Data) {
        guard let dataProvider = CGDataProvider(data: data as CFData), let cgImageRef: CGImage = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .defaultIntent) else {
            return
        }
        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)
        delegate?.processVideoSnapshot(image)
    }
}

extension VideoFeed: AVCaptureFileOutputRecordingDelegate {

    func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
        print("Video recording started: \(fileURL.absoluteString)")
    }

    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {

        guard error == nil else {
            print("Error recording movie: \(String(describing: error))")
            return
        }

        UISaveVideoAtPathToSavedPhotosAlbum(outputFileURL.path, nil, nil, nil)
    }
}

For anyone else making use of this, don't forget to add the usage-description permissions to your Info.plist for access to the camera, photo library, and microphone:

<key>NSCameraUsageDescription</key>
<string>Let us use your camera</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>save to images</string>
<key>NSMicrophoneUsageDescription</key>
<string>for sound in video</string>

iOS AVFoundation Video Capture Orientation Options

To answer your question: yes, the image sensor is just oriented that way. The video camera is an approximately 1-megapixel "1080p" camera with a fixed orientation. The 5MP (or 8MP for the 4S, etc.) still camera also has a fixed orientation. The lenses themselves don't rotate, nor do any of the other camera bits, and hence the feed itself has a fixed orientation.

"But wait!", you say, "pictures I take with the camera app (or API) get rotated correctly. Why is that?" That's because iOS looks at the orientation of the phone when a picture is taken and stores that information with the picture (as an Exif attachment). Video isn't flagged that way, since each frame would have to be individually flagged, and then there's the issue of what to do when the user rotates the phone during video.

So, no, you can't ask a video stream or a still image what orientation the phone was in when the video was captured. You can, however, directly ask the phone what orientation it is in now:

UIDeviceOrientation currentOrientation = [UIDevice currentDevice].orientation;

If you do that at the start of video capture (or when you grab a still image from a video feed) you can then use that information to do your own rotation of playback.
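A minimal Swift sketch of that idea (the names here are illustrative, not from the original answer):

// Remember the device orientation at the moment capture starts,
// then use it later to rotate playback or frame grabs yourself.
var orientationAtCaptureStart: UIDeviceOrientation = .unknown

func startCapture() {
    orientationAtCaptureStart = UIDevice.current.orientation
    captureSession.startRunning() // assumes an already-configured session
}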


