AVCaptureVideoPreviewLayer Is Not Visible on the Screenshot

I was in the same position, and researched two separate solutions to this problem.

  1. Set up the ViewController as an AVCaptureVideoDataOutputSampleBufferDelegate and sample the video output to take the screenshot.

  2. Set up the ViewController as an AVCapturePhotoCaptureDelegate and capture the photo.

The mechanism for setting up the former is described, for example, in this question: How to take UIImage of AVCaptureVideoPreviewLayer instead of AVCapturePhotoOutput capture (an answer to it is also reproduced below).

I implemented both to check if there was any difference in the quality of the image (there wasn't).

If all you need is the camera snapshot, then that's it. But it sounds like you need to draw an additional animation on top. For this, I created a container UIView of the same size as the snapshot, added a UIImageView with the snapshot to it, and then drew the animation on top. After that you can use UIGraphicsGetImageFromCurrentImageContext on the container; a sketch of this compositing step follows.
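Here is a minimal sketch of that compositing step, assuming you already have the snapshot as a UIImage and the overlay as a CALayer (both parameter names are illustrative, not from the original answer):

    func compositeSnapshot(_ snapshot: UIImage, overlay: CALayer) -> UIImage? {
        // Container matching the snapshot's size, with the snapshot at the bottom
        let container = UIView(frame: CGRect(origin: .zero, size: snapshot.size))
        let imageView = UIImageView(image: snapshot)
        imageView.frame = container.bounds
        container.addSubview(imageView)

        // Draw the overlay (the animation frame) on top; position it as needed
        overlay.frame = container.bounds
        container.layer.addSublayer(overlay)

        // Render the whole container into an image
        UIGraphicsBeginImageContextWithOptions(container.bounds.size, false, snapshot.scale)
        defer { UIGraphicsEndImageContext() }
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        container.layer.render(in: context)
        return UIGraphicsGetImageFromCurrentImageContext()
    }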

As for which of solutions (1) and (2) to use, if you don't need to support different camera orientations in the app, it probably doesn't matter. However, if you need to switch between front and back camera and support different camera orientations, then you need to know the snapshot orientation to apply the animation in the right place, and getting that right turned out to be a total bear with method (1).

The solution I used:

  1. Make the UIViewController conform to AVCapturePhotoCaptureDelegate.

  2. Add the photo output to the AVCaptureSession:

    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    // ...

    // When configuring the session
    if self.session.canAddOutput(self.photoOutput) {
        self.session.addOutput(self.photoOutput)
        self.photoOutput.isHighResolutionCaptureEnabled = true
    }

  3. Capture the snapshot:
    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [
        kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
        kCVPixelBufferWidthKey as String: 160,
        kCVPixelBufferHeightKey as String: 160
    ]
    settings.previewPhotoFormat = previewFormat
    photoOutput.capturePhoto(with: settings, delegate: self)

  4. Rotate or flip the snapshot before doing the rest:
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard error == nil else {
            // Handle the error; a guard body must exit scope, so return here
            return
        }

        if let dataImage = photo.fileDataRepresentation() {
            print(UIImage(data: dataImage)?.size as Any)

            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            // https://developer.apple.com/documentation/uikit/uiimageorientation?language=objc
            // cameraPosition is a property you track yourself when configuring the session's input
            let orientation = UIApplication.shared.statusBarOrientation
            var imageOrientation = UIImage.Orientation.right
            switch orientation {
            case .portrait:
                imageOrientation = self.cameraPosition == .back ? .right : .leftMirrored
            case .landscapeRight:
                imageOrientation = self.cameraPosition == .back ? .up : .downMirrored
            case .portraitUpsideDown:
                imageOrientation = self.cameraPosition == .back ? .left : .rightMirrored
            case .landscapeLeft:
                imageOrientation = self.cameraPosition == .back ? .down : .upMirrored
            case .unknown:
                imageOrientation = self.cameraPosition == .back ? .right : .leftMirrored
            @unknown default:
                imageOrientation = self.cameraPosition == .back ? .right : .leftMirrored
            }
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: imageOrientation)

            // Do whatever you need to do with the image

        } else {
            // Handle the missing image data
        }
    }

If you need to know the size of the image in order to position the animations, you can use the AVCaptureVideoDataOutputSampleBufferDelegate strategy to detect the size of the buffer once, as sketched below.
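A minimal sketch of that one-off size probe, assuming you have also added an AVCaptureVideoDataOutput whose sample buffer delegate is this view controller (the didLogBufferSize flag is a hypothetical one-shot guard):

    var didLogBufferSize = false  // hypothetical one-shot flag

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard !didLogBufferSize,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        didLogBufferSize = true
        // The pixel buffer dimensions tell you the snapshot size to lay out animations against
        print("Buffer size: \(CVPixelBufferGetWidth(pixelBuffer)) x \(CVPixelBufferGetHeight(pixelBuffer))")
    }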

AVCaptureVideoPreviewLayer: taking a snapshot

I was facing the same woes, from a slightly different angle.

Here are some possible solutions, though none of them are great IMO:

  • You can add both an AVCaptureStillImageOutput and an AVCaptureVideoDataOutput to an AVCaptureSession. When you set the sessionPreset to AVCaptureSessionPresetHigh you'll start getting frames through the API, and when you switch to AVCaptureSessionPresetPhoto you can take real images. So right before taking the picture, you can switch to video, grab a frame, and then return to camera (see the sketch after this list). The major caveat is that it takes a "long" time (a couple of seconds) for the camera to switch between the video camera and the picture camera.

  • Another option would be to use only the camera output (AVCaptureStillImageOutput), and use UIGetScreenImage to get a screen capture of the phone. You could then crop out the controls and leave only the image. This gets complicated if you're showing UI controls over the image. Also, according to this post, Apple started rejecting apps that use this function (it was always iffy).

  • Aside from these, I also tried playing with AVCaptureVideoPreviewLayer. There's this post on saving a UIView or CALayer to a UIImage, but all of it produces clear or white images. I tried accessing the layer, the view's layer, the superlayer, the presentationLayer, the modelLayer, but to no avail. I guess the data in AVCaptureVideoPreviewLayer is very internal, and not really part of the regular layer infrastructure.
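For the first option, the preset switch itself is only a few lines; here is a minimal sketch, assuming `session` is your running AVCaptureSession (note that AVCaptureStillImageOutput has since been deprecated in favor of AVCapturePhotoOutput):

    // Switch the running session from video frames to full-quality photo capture
    session.beginConfiguration()
    session.sessionPreset = .photo   // use .high to switch back to video frames
    session.commitConfiguration()
    // As noted above, the switch takes a noticeable moment to settle before you can capture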

Hope this helps,
Oded.

iOS Objective-C screenshot sublayer not visible

You should use AVCaptureStillImageOutput to get an image from the camera connection.

Here is how you could do it:

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    // Note: outputSettings accepts either a codec key or a pixel format key, not both;
    // jpegStillImageNSDataRepresentation: below requires the JPEG codec
    stillImageOutput.outputSettings = @{ AVVideoCodecKey: AVVideoCodecJPEG };
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:imageData];
    }];

Can't get screenshot of only the UIView which shows camera (AVCapturePhotoOutput) in Swift

Image from capture session

    let stillImageOutput = AVCaptureStillImageOutput()

    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if captureSession.canAddOutput(stillImageOutput) {
        captureSession.addOutput(stillImageOutput)
    }

    func captureImage() {
        let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo)
        stillImageOutput.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (imageDataBuffer, error) in
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataBuffer!)
            self.imgView.image = UIImage(data: imageData!)
        })
    }

Screenshot of UIView

    // Note: this captures regular UIKit content only; as described above,
    // an AVCaptureVideoPreviewLayer comes out blank or white with this approach
    UIGraphicsBeginImageContextWithOptions(cameraView.bounds.size, false, 0)
    cameraView.layer.render(in: UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

AVCapturePhotoCaptureDelegate

Set delegate to self
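For context, the capture call that triggers the callback below might look like this; `photoOutput` is assumed to be an AVCapturePhotoOutput already attached to your session:

    // Request a JPEG capture; the delegate callback below receives the result
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
    photoOutput.capturePhoto(with: settings, delegate: self)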

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let sampleBuffer = photoSampleBuffer,
           let previewBuffer = previewPhotoSampleBuffer,
           let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any)
        }
    }

How to take UIImage of AVCaptureVideoPreviewLayer instead of AVCapturePhotoOutput capture

Basically, instead of using AVCaptureVideoPreviewLayer for grabbing frames, you should use AVCaptureVideoDataOutputSampleBufferDelegate.
Here is an example:

import Foundation
import UIKit
import AVFoundation

protocol CaptureManagerDelegate: AnyObject {
    func processCapturedImage(image: UIImage)
}

class CaptureManager: NSObject {
    internal static let shared = CaptureManager()
    weak var delegate: CaptureManagerDelegate?
    var session: AVCaptureSession?

    override init() {
        super.init()
        session = AVCaptureSession()

        // Set up input (force-unwraps keep the example short; handle failures in real code)
        let device = AVCaptureDevice.default(for: .video)!
        let input = try! AVCaptureDeviceInput(device: device)
        session?.addInput(input)

        // Set up output
        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue.main)
        session?.addOutput(output)
    }

    func startSession() {
        session?.startRunning()
    }

    func stopSession() {
        session?.stopRunning()
    }

    func getImageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        // Make sure the buffer is unlocked on every exit path, including early returns
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
        let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
        guard let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue) else {
            return nil
        }
        guard let cgImage = context.makeImage() else {
            return nil
        }
        return UIImage(cgImage: cgImage, scale: 1, orientation: .right)
    }
}

extension CaptureManager: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let outputImage = getImageFromSampleBuffer(sampleBuffer: sampleBuffer) else {
            return
        }
        delegate?.processCapturedImage(image: outputImage)
    }
}

Update: To process the images, implement the processCapturedImage method of the CaptureManagerDelegate protocol in whichever class you like, for example:

import UIKit

class ViewController: UIViewController {
    @IBOutlet weak var imageView: UIImageView!
    override func viewDidLoad() {
        super.viewDidLoad()
        // Set the delegate before starting the session so no frames are missed
        CaptureManager.shared.delegate = self
        CaptureManager.shared.startSession()
    }
}

extension ViewController: CaptureManagerDelegate {
    func processCapturedImage(image: UIImage) {
        self.imageView.image = image
    }
}

Getting only white screenshot

I used AVCaptureStillImageOutput and it worked.

    - (void)captureNow {
        AVCaptureConnection *videoConnection = nil;
        // Find the video connection on the still image output
        for (AVCaptureConnection *connection in _stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) {
                break;
            }
        }

        [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            img = [[UIImage alloc] initWithData:imageData];
            // I used the image in another view controller
            [MainViewController setResultTexts:str img:img];
            [MainViewController set_from_view:0 scanner:1];
            [self dismissViewControllerAnimated:YES completion:nil];
        }];
    }

