AVCaptureStillImageOutput vs AVCapturePhotoOutput in Swift 3

AVCaptureStillImageOutput being deprecated means you can keep using it in iOS 10, but:

  • Apple makes no promises as to how long past iOS 10 it'll stay available.
  • As new hardware and software features get added in iOS 10 and beyond, you won't get access to all of them. For example, you can set up AVCaptureStillImageOutput for wide color, but it's a lot easier to do wide color with AVCapturePhotoOutput. And for RAW capture or Live Photos, AVCapturePhotoOutput is the only game in town.

If you're happy proceeding despite the deprecation, your issue isn't that outputSettings is removed — it's still there.

Something to be aware of for beta 6 and beyond (though it turns out not to be an issue here): APIs that use NSDictionary without explicit key and value types come into Swift 3 as [AnyHashable: Any], and the Foundation or CoreFoundation types you might use in a dictionary are no longer implicitly bridged to Swift types. (Some of the other questions about beta 6 dictionary conversions might point you in the right direction there.)
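A quick sketch of that behavior (assuming the iOS 10 SDK, where outputSettings imports as [AnyHashable: Any]!): writing a Swift dictionary literal into it still compiles, but reading a value back out needs an explicit downcast.

let output = AVCaptureStillImageOutput()
output.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
let codec = output.outputSettings[AVVideoCodecKey] as? String // explicit downcast needed to read the value back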

However, I'm not getting any compilation errors for setting outputSettings, whether I use your full code or reduce it to the essential parts for that line:

import AVFoundation

var stillImageOutput: AVCaptureStillImageOutput?
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

...the only warnings I see are about the deprecation.

Modify AVCaptureStillImageOutput to AVCapturePhotoOutput

Apple's documentation explains very clearly how to use AVCapturePhotoOutput.

These are the steps to capture a photo.

  • Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
  • Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
  • Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
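Step 1, in code, looks roughly like this (a minimal sketch; it assumes a configured AVCaptureSession named session, and cameraOutput is the property the later code refers to as self.cameraOutput):

cameraOutput = AVCapturePhotoOutput()
cameraOutput.isHighResolutionCaptureEnabled = true // opt in to the features you plan to use
if session.canAddOutput(cameraOutput) {
    session.addOutput(cameraOutput)
}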

Steps 2 and 3 are shown below; put this code in your clickCapture method, and don't forget to conform to and implement the delegate in your class.

let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 160,
                     kCVPixelBufferHeightKey as String: 160]
settings.previewPhotoFormat = previewFormat
self.cameraOutput.capturePhoto(with: settings, delegate: self)

If you would like to know about the different ways of capturing a photo with AVFoundation, check out my previous SO answer:

How to use AVCapturePhotoOutput

Updated to Swift 4.

It's really easy to use AVCapturePhotoOutput.

You need an AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you set the previewPhotoFormat on the AVCapturePhotoSettings.

import AVFoundation
import UIKit

class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                            kCVPixelBufferWidthKey as String: 160,
                                            kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
           let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print("image: \(String(describing: UIImage(data: dataImage)?.size))") // your image
        }
    }
}

For more information visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You have to add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture: something like session.addOutput(output), and then output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
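For example, a minimal wiring sketch in Swift 4 (error handling omitted; it assumes the app already has camera permission and uses the CameraCaptureOutput class above):

let session = AVCaptureSession()
let capturer = CameraCaptureOutput()

if let device = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input), session.canAddOutput(capturer.cameraOutput) {
    session.addInput(input)
    session.addOutput(capturer.cameraOutput) // attach the output BEFORE capturing
    session.startRunning()
    capturer.capturePhoto()
}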

Unable to use AVCapturePhotoOutput to capture photo (Swift + Xcode)

You are almost there.

For Output as AVCapturePhotoOutput

Check out AVCapturePhotoOutput documentation for more help.

These are the steps to capture a photo.

  1. Create an AVCapturePhotoOutput object. Use its properties to
    determine supported capture settings and to enable certain features
    (for example, whether to capture Live Photos).
  2. Create and configure an AVCapturePhotoSettings object to choose
    features and settings for a specific capture (for example, whether
    to enable image stabilization or flash).
  3. Capture an image by passing your photo settings object to the
    capturePhoto(with:delegate:) method along with a delegate object
    implementing the AVCapturePhotoCaptureDelegate protocol. The photo
    capture output then calls your delegate to notify you of significant
    events during the capture process.

Put the code below in your clickCapture method, and don't forget to conform to and implement the delegate in your class.

let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 160,
                     kCVPixelBufferHeightKey as String: 160]
settings.previewPhotoFormat = previewFormat
self.cameraOutput.capturePhoto(with: settings, delegate: self)
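When the capture completes, the delegate callback hands you the sample buffers. A minimal Swift 3 sketch (ViewController here is a stand-in for whatever class you passed as the delegate):

extension ViewController: AVCapturePhotoCaptureDelegate {
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        guard let sampleBuffer = photoSampleBuffer,
              let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer) else { return }
        if let image = UIImage(data: data) {
            print("captured image size: \(image.size)") // your photo as a UIImage
        }
    }
}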

For Output as AVCaptureStillImageOutput

If you intend to snap a photo from a video connection, you can follow the steps below.

Step 1: Get the connection

if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
    // ...
    // Code for photo capture goes here...
}

Step 2: Capture the photo

  • Call the captureStillImageAsynchronously(from:completionHandler:) method on
    the stillImageOutput.
  • The sampleBuffer represents the data that is captured.

stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
    // ...
    // Process the image data (sampleBuffer) here to get an image file we can put in our captureImageView
})

Step 3: Process the Image Data

  • We will need to take a few steps to process the image data found in sampleBuffer in order to end up with a UIImage that we can insert into our captureImageView and easily use elsewhere in our app.

if sampleBuffer != nil {
    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer!)
    let dataProvider = CGDataProvider(data: imageData! as CFData)
    let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
    let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: .right)
    // ...
    // Add the image to captureImageView here...
}

Step 4: Save the image

Based on your needs, either save the image to the photo gallery or show it in an image view.
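For instance, a minimal sketch of this step (assuming the image from step 3, and the NSPhotoLibraryUsageDescription key in your Info.plist for the gallery option):

// Save to the photo gallery...
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)

// ...or show it in an image view:
captureImageView.image = image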


For more details, check out the Create custom camera view guide under Snap a Photo.

Value of type 'AVCapturePhotoOutput' has no member 'outputSettings'

The problem is outputSettings is a property on AVCaptureStillImageOutput, not AVCapturePhotoOutput.

AVCaptureStillImageOutput is deprecated in iOS 10, so for iOS 10+, use AVCapturePhotoOutput instead. To choose a codec with the new API, pass a format dictionary to an AVCapturePhotoSettings object:

// photoOutput must already be attached to a running AVCaptureSession
let photoOutput = AVCapturePhotoOutput()
// the new-API equivalent of outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
photoOutput.capturePhoto(with: settings, delegate: self)

Apple's AVCapturePhotoOutput Documentation: https://developer.apple.com/documentation/avfoundation/avcapturephotooutput

Taking photo with custom camera Swift 3

Thanks to Sharpkits I found my solution (This code works for me):

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let error = error {
        print(error.localizedDescription)
    }

    if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
       let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

        let dataProvider = CGDataProvider(data: imageData as CFData)
        let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric)
        let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: .right)

        let croppedImage = self.cropToSquare(image: image)
        let newImage = self.scaleImageWith(croppedImage, and: CGSize(width: 600, height: 600))

        print(UIScreen.main.bounds.width)

        self.tempImageView.image = newImage
        self.tempImageView.isHidden = false
    }
}
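The cropToSquare(image:) and scaleImageWith(_:and:) helpers aren't shown in that answer. A hypothetical cropToSquare might look like this (a sketch only, not the original author's code):

// Hypothetical helper, not from the original answer: crops to a centered square.
func cropToSquare(image: UIImage) -> UIImage {
    let side = min(image.size.width, image.size.height)
    let cropRect = CGRect(x: (image.size.width - side) / 2.0 * image.scale,
                          y: (image.size.height - side) / 2.0 * image.scale,
                          width: side * image.scale,
                          height: side * image.scale)
    guard let cgImage = image.cgImage?.cropping(to: cropRect) else { return image }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}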

