How to Use AVCapturePhotoOutput to Capture a Photo in Swift + Xcode

Unable to use AVCapturePhotoOutput to capture a photo (Swift + Xcode)

You are almost there.

For Output as AVCapturePhotoOutput

Check out AVCapturePhotoOutput documentation for more help.

These are the steps to capture a photo.

  1. Create an AVCapturePhotoOutput object. Use its properties to
    determine supported capture settings and to enable certain features
    (for example, whether to capture Live Photos).
  2. Create and configure an AVCapturePhotoSettings object to choose
    features and settings for a specific capture (for example, whether
    to enable image stabilization or flash).
  3. Capture an image by passing your photo settings object to the
    capturePhoto(with:delegate:) method along with a delegate object
    implementing the AVCapturePhotoCaptureDelegate protocol. The photo
    capture output then calls your delegate to notify you of significant
    events during the capture process.
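The three steps above can be sketched in one place. This is a minimal sketch, not a drop-in implementation: the session setup and the `PhotoTaker` name are illustrative, and it assumes inputs have already been added to the session elsewhere.

```swift
import AVFoundation

final class PhotoTaker: NSObject, AVCapturePhotoCaptureDelegate {
    // Step 1: create the output once and attach it to your session.
    let photoOutput = AVCapturePhotoOutput()

    func attach(to session: AVCaptureSession) {
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
    }

    // Step 2 and 3: configure per-shot settings, then kick off the capture.
    func takePhoto() {
        let settings = AVCapturePhotoSettings()
        settings.flashMode = .auto
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // The delegate is called back when the photo is ready (iOS 11+ API).
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        // Use `data` (JPEG/HEIC bytes) however you need.
        print("captured \(data.count) bytes")
    }
}
```

Note that a fresh `AVCapturePhotoSettings` object is created per shot; settings objects must not be reused across captures.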

Add the code below to your clickCapture method, and don't forget to conform to and implement the AVCapturePhotoCaptureDelegate protocol in your class.

let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 160,
                     kCVPixelBufferHeightKey as String: 160]
settings.previewPhotoFormat = previewFormat
self.cameraOutput.capturePhoto(with: settings, delegate: self)

For Output as AVCaptureStillImageOutput

If you intend to snap a photo from a video connection, you can follow the steps below.

Step 1: Get the connection

if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
    // ...
    // Code for photo capture goes here...
}

Step 2: Capture the photo

  • Call the captureStillImageAsynchronouslyFromConnection function on
    the stillImageOutput.
  • The sampleBuffer represents the data that is captured.

stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
    // ...
    // Process the image data (sampleBuffer) here to get an image file we can put in our captureImageView
})

Step 3: Process the Image Data

  • We need to take a few steps to process the image data found in sampleBuffer in order to end up with a UIImage that we can insert into our captureImageView and easily use elsewhere in our app.

if sampleBuffer != nil {
    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
    let dataProvider = CGDataProviderCreateWithCFData(imageData)
    let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
    let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
    // ...
    // Add the image to captureImageView here...
}

Step 4: Save the image

Based on your need, either save the image to the photo gallery or show it in an image view.
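Either route is short; here is a hedged sketch of both (the `imageView` parameter is illustrative, and saving to the gallery requires the NSPhotoLibraryAddUsageDescription key in your Info.plist).

```swift
import UIKit

// `imageView` is assumed to be an outlet on your view controller.
func handle(_ image: UIImage, in imageView: UIImageView) {
    // Option 1: show the captured photo in the UI.
    imageView.image = image

    // Option 2: save it to the photo library
    // (requires NSPhotoLibraryAddUsageDescription in Info.plist).
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```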


For more details, check out the Create custom camera view guide under Snap a Photo.

How to use AVCapturePhotoOutput

Updated to Swift 4
Hi, it's really easy to use AVCapturePhotoOutput.

You need the AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you set the previewPhotoFormat on the AVCapturePhotoSettings.

class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print("image: \(UIImage(data: dataImage)?.size)") // Your Image
        }
    }
}

For more information visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You have to add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like session.addOutput(output), and then output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
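One caveat: the CMSampleBuffer-based delegate method used above is deprecated as of iOS 11. On iOS 11 and later the same capture can be handled with the AVCapturePhoto variant; a sketch, assuming the CameraCaptureOutput class from the snippet above:

```swift
import AVFoundation
import UIKit

extension CameraCaptureOutput {
    // iOS 11+ replacement for the deprecated CMSampleBuffer-based callback.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print(error.localizedDescription)
            return
        }
        // fileDataRepresentation() gives the encoded JPEG/HEIC bytes directly.
        if let data = photo.fileDataRepresentation(), let image = UIImage(data: data) {
            print("image: \(image.size)") // Your image
        }
    }
}
```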

SWIFT 3: Capture photo with AVCapturePhotoOutput (Need another set of eyes to look over code, why isn't this working?)

Alright, figured it out. El Tomato was on the right track with the problem child, but it wasn't the right prescription. My createCamera() function was set to private which of course makes the contents not visible outside its body. So while I was calling the correct AVCapturePhotoOutput(), the buffer feed didn't exist for the capturePhoto() call to execute...throwing the error described.

So this means the line:

cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)

was correct; the setup leading up to its execution was not. To confirm proper execution, I...

  • changed my private let photoOutput = AVCapturePhotoOutput() constant
  • to private let cameraPhotoOutput = AVCapturePhotoOutput()
  • and called that constant directly in private func createCamera()

which immediately executed an image capture flawlessly.

I also tried replacing cameraPhotoOutput (an AVCapturePhotoOutput()) with cameraOutput (an AVCapturePhotoOutput!), which simply reproduced the error.
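The takeaway is that the output you call capturePhoto(with:delegate:) on must be the very instance added to the session. A minimal sketch of that working shape (class and method names here are illustrative, not the poster's exact code):

```swift
import AVFoundation
import UIKit

final class CameraViewController: UIViewController {
    private let captureSession = AVCaptureSession()
    // One stored instance: added to the session AND used for capture.
    private let cameraPhotoOutput = AVCapturePhotoOutput()

    private func createCamera() {
        if captureSession.canAddOutput(cameraPhotoOutput) {
            captureSession.addOutput(cameraPhotoOutput) // same instance...
        }
    }

    func snap(delegate: AVCapturePhotoCaptureDelegate) {
        let photoSettings = AVCapturePhotoSettings()
        // ...used here; a fresh AVCapturePhotoOutput() at this point would
        // not be attached to the session and the capture would fail.
        cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: delegate)
    }
}
```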

If you are interested: the cgImage creation process stayed the same in the func capture(_ : capture... function. Within its bounds, I also determined the camera device's position, changed the image's orientation if front camera, and dispatched on the main queue the photo over to a var photoContent: UIImage? variable on the ReviewViewController.

Hope my mental error helps someone else :-)

SwiftUI AVCapturePhotoOutput Does Not Work

The main problem is that you create a PhotoDelegate but do not store it. In iOS, the delegate object is usually stored as a weak reference to prevent a circular reference / retain cycle.

You can fix this by simply creating another property in your view, but instead I suggest you create a model class. If you're doing something unrelated to the view itself, that's a sign you're better off moving it somewhere else, like an ObservableObject. You can also make the model your delegate, so you don't have to create a separate object or reach for a singleton (which would be another sign that something is wrong).

class CaptureModel: NSObject, ObservableObject {
    let captureSession = AVCaptureSession()
    var backCamera: AVCaptureDevice?
    var frontCamera: AVCaptureDevice?
    var photoOutput: AVCapturePhotoOutput?
    var currentCamera: AVCaptureDevice?
    @Published
    var capturedImage: UIImage?

    override init() {
        super.init()
        setupCaptureSession()
        setupDevices()
        setupInputOutput()
    }

    func setupCaptureSession() {
        captureSession.sessionPreset = AVCaptureSession.Preset.photo
    }//setupCaptureSession

    func setupDevices() {
        let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [AVCaptureDevice.DeviceType.builtInWideAngleCamera], mediaType: .video, position: .unspecified)

        let devices = deviceDiscoverySession.devices
        for device in devices {
            if device.position == AVCaptureDevice.Position.back {
                backCamera = device
            } else if device.position == AVCaptureDevice.Position.front {
                frontCamera = device
            }//if else
        }//for in

        currentCamera = frontCamera
    }//setupDevices

    func setupInputOutput() {
        do {
            // You only get here if there is a camera (force unwrap is OK).
            let captureDeviceInput = try AVCaptureDeviceInput(device: currentCamera!)
            captureSession.addInput(captureDeviceInput)
            photoOutput = AVCapturePhotoOutput()
            photoOutput?.setPreparedPhotoSettingsArray([AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])], completionHandler: { (success, error) in
            })
            captureSession.addOutput(photoOutput!)
            captureSession.commitConfiguration()
        } catch {
            print("Error creating AVCaptureDeviceInput:", error)
        }
    }//setupInputOutput

    func startRunningCaptureSession() {
        let settings = AVCapturePhotoSettings()

        captureSession.startRunning()
        photoOutput?.capturePhoto(with: settings, delegate: self)
    }//startRunningCaptureSession

    func stopRunningCaptureSession() {
        captureSession.stopRunning()
    }//stopRunningCaptureSession
}

extension CaptureModel: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else {
            return
        }
        capturedImage = image
    }
}

struct ContentView: View {
    @StateObject
    var model = CaptureModel()

    var body: some View {
        VStack {
            Text("Take a Photo Automatically")
                .padding()

            ZStack {
                RoundedRectangle(cornerRadius: 0)
                    .stroke(Color.blue, lineWidth: 4)
                    .frame(width: 320, height: 240, alignment: .center)

                model.capturedImage.map { capturedImage in
                    Image(uiImage: capturedImage)
                }
            }

            Spacer()
        }
        .onAppear {
            if UIImagePickerController.isSourceTypeAvailable(.camera) {
                model.startRunningCaptureSession()
            } else {
                print("No Camera is Available")
            }
        }
        .onDisappear {
            model.stopRunningCaptureSession()
        }
    }
}//struct

Pass image from photoOutput(didFinishProcessingPhoto) in PhotoCaptureDelegate.swift to my ViewController

You initialize the PhotoCaptureProcessor with several callback blocks that get called at different stages of the capture process. In the completionHandler you should be able to access the captured photo from the photoCaptureProcessor. Something like this:

let photoCaptureProcessor = PhotoCaptureProcessor(with: photoSettings, willCapturePhotoAnimation: {
    // ...
}, livePhotoCaptureHandler: { capturing in
    // ...
}, completionHandler: { photoCaptureProcessor in
    let capturedImage = UIImage(data: photoCaptureProcessor.photoData)
    // Be sure to perform UI changes on the main thread.
    DispatchQueue.main.async {
        // Assuming the image view is named like that...
        self.imageView.image = capturedImage
    }

    // ...
}, photoProcessingHandler: { animate in
    // ...
})

