didOutputSampleBuffer delegate not called
I found the cause of my error: the delegate has to be created in the same view controller that owns the capture session. Here is the modified code:
import UIKit
import AVFoundation
import Accelerate

var customPreviewLayer: AVCaptureVideoPreviewLayer?

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var captureSession: AVCaptureSession?
    var dataOutput: AVCaptureVideoDataOutput?
    //var customPreviewLayer: AVCaptureVideoPreviewLayer?

    @IBOutlet weak var camView: UIView!

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated) // was super.viewDidAppear(animated) -- mismatched super call
        //setupCameraSession()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        setupCameraSession()
        self.captureSession?.startRunning()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func setupCameraSession() {
        // Session
        self.captureSession = AVCaptureSession()
        self.captureSession!.sessionPreset = AVCaptureSessionPreset1920x1080

        // Capture device
        let inputDevice: AVCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

        // Device input
        var deviceInput: AVCaptureDeviceInput?
        do {
            deviceInput = try AVCaptureDeviceInput(device: inputDevice)
        } catch let error as NSError {
            // Handle errors
            print(error)
            return
        }
        if self.captureSession!.canAddInput(deviceInput!) {
            self.captureSession!.addInput(deviceInput!)
        }

        // Preview
        customPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        customPreviewLayer!.frame = camView.bounds
        customPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
        customPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
        self.camView.layer.addSublayer(customPreviewLayer!)
        print("Cam layer added")

        // Data output
        self.dataOutput = AVCaptureVideoDataOutput()
        self.dataOutput!.videoSettings = [
            String(kCVPixelBufferPixelFormatTypeKey): Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
        ]
        self.dataOutput!.alwaysDiscardsLateVideoFrames = true
        if self.captureSession!.canAddOutput(dataOutput) {
            self.captureSession!.addOutput(dataOutput)
        }
        self.captureSession!.commitConfiguration()

        // The delegate must be this view controller, set on a serial queue
        let queue: dispatch_queue_t = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL)
        self.dataOutput!.setSampleBufferDelegate(self, queue: queue)
    }

    func captureOutput(captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection) {
        print("buffered")
        let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)!
        CVPixelBufferLockBaseAddress(imageBuffer, 0)
        let width: size_t = CVPixelBufferGetWidthOfPlane(imageBuffer, 0)
        let height: size_t = CVPixelBufferGetHeightOfPlane(imageBuffer, 0)
        let bytesPerRow: size_t = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0)
        let lumaBuffer = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
        let grayColorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceGray()!
        // The luma plane is single-channel, so a gray bitmap needs CGImageAlphaInfo.None;
        // PremultipliedLast was the "problematic" part of the original code
        let context: CGContextRef = CGBitmapContextCreate(lumaBuffer, width, height, 8, bytesPerRow, grayColorSpace, CGImageAlphaInfo.None.rawValue)!
        let dstImageFilter: CGImageRef = CGBitmapContextCreateImage(context)!
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0) // balance the lock taken above
        dispatch_sync(dispatch_get_main_queue(), { () -> Void in
            customPreviewLayer!.contents = dstImageFilter as AnyObject
        })
    }
}
AVCaptureVideoDataOutput captureOutput not being called
You need to implement the didOutputSampleBuffer delegate callback to actually receive the captured frames:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    print("captured \(sampleBuffer)")
}

P.S. I'm not sure about macOS, but viewWillAppear may not be a good place to do initialisation, because on iOS at least it can be called multiple times.
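If you do want to drive the session from viewWillAppear, a one-time guard is a simple defence against the repeated calls. A minimal sketch (the sessionConfigured flag and setupCameraSession method are hypothetical names, Swift 2 era APIs):

```swift
import UIKit
import AVFoundation

class CameraViewController: UIViewController {
    var captureSession: AVCaptureSession?
    // Hypothetical flag: viewWillAppear can fire many times, so configure only once
    private var sessionConfigured = false

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        if !sessionConfigured {
            setupCameraSession() // one-time configuration of inputs/outputs
            sessionConfigured = true
        }
        captureSession?.startRunning()
    }

    override func viewDidDisappear(animated: Bool) {
        super.viewDidDisappear(animated)
        captureSession?.stopRunning() // pair start/stop with appearance
    }

    func setupCameraSession() { /* configure session, inputs and outputs here */ }
}
```

Pairing startRunning/stopRunning with appearance, while guarding the configuration itself, keeps the setup idempotent however many times the view appears.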
AVCaptureOutput didOutputSampleBuffer stops getting called
Your problem is actually referenced in the docs, specifically:
If your application is causing samples to be dropped by retaining the
provided CMSampleBufferRef objects for too long, but it needs access
to the sample data for a long period of time, consider copying the
data into a new buffer and then releasing the sample buffer (if it was
previously retained) so that the memory it references can be reused.
Essentially, you need to keep the callback operation as simple as possible; should you need to perform further processing on the frame passed to you in the callback, copy it to a new buffer and perform the processing in the background. Also, keep in mind that Core Foundation objects must be explicitly retained and released.
A further consideration is memory pressure: frames contain a lot of data, and retaining too many of them will cause your app to crash.
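A sketch of that advice, using Swift 2 era APIs (the processFrameData method is a hypothetical name): copy the pixel data out of the sample buffer, then hand only the copy to a background queue, so the CMSampleBufferRef itself is never retained past the callback:

```swift
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    CVPixelBufferLockBaseAddress(imageBuffer, 0)
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let length = CVPixelBufferGetDataSize(imageBuffer)
    // Copy the pixel data into our own buffer so the sample buffer can be recycled
    let frameCopy = NSData(bytes: baseAddress, length: length)
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0)

    // Heavy processing happens on a background queue against the copy,
    // never against the original CMSampleBufferRef
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        self.processFrameData(frameCopy) // hypothetical processing method
    }
}
```

The callback returns immediately, so AVFoundation can reuse the buffer and won't start dropping frames.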
captureOutput not being called
You made a mistake in the declaration of the required sample buffer delegate method captureOutput(_:didOutputSampleBuffer:from:).
Please check it and make sure it is:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)
PS: Pay attention to how the parameters of that method are declared. They all have '!', which means they are implicitly unwrapped optionals.
AVCaptureDeviceOutput not calling delegate method captureOutput
Your session is a local variable, so its scope is limited to viewDidLoad. Since this is a new project, I assume it's safe to say that you're using ARC. In that case the object won't leak and therefore continue to live as it would have done in the linked question; rather, the compiler will ensure the object is deallocated before viewDidLoad exits.
Hence your session isn't running because it no longer exists.
(Aside: the self.theImage.image = ... is unsafe, since it performs a UIKit action off the main queue; you probably want to dispatch_async that over to dispatch_get_main_queue().)
So, sample corrections:
@implementation YourViewController
{
    AVCaptureSession *session;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialize AV session
    session = [AVCaptureSession new];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        /* ... etc ... */
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"delegate method called");
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_sync(dispatch_get_main_queue(),
    ^{
        self.theImage.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    });
}
Most people advocate using an underscore at the beginning of instance variable names nowadays, but I omitted it for simplicity. You can use Xcode's built-in refactor tool to fix that up after you've verified that the diagnosis is correct.
I moved the CGImageRelease inside the block sent to the main queue to ensure its lifetime extends beyond its capture into a UIImage. I'm not immediately able to find any documentation to confirm that Core Foundation objects have their lifetimes automatically extended when captured in a block.
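The same lifetime fix translates directly to Swift: the session must be a stored property, not a local. A minimal sketch (Swift 2 era APIs, class name hypothetical):

```swift
import UIKit
import AVFoundation

class CameraViewController: UIViewController {
    // Stored property: ARC keeps the session alive as long as the view controller lives
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // A local `let session = AVCaptureSession()` here would be deallocated
        // as viewDidLoad returns, and the delegate would never fire.
        session.sessionPreset = AVCaptureSessionPreset640x480
        session.startRunning()
    }
}
```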
A function is not getting called inside a delegate method - IOS/Swift
I was able to find a way to get the work done, though I couldn't find what was wrong with my previous code. This is how it looks:
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let seconds: Int64 = 1 * Int64(NSEC_PER_SEC)
    let time = dispatch_time(DISPATCH_TIME_NOW, seconds)
    dispatch_after(time, dispatch_get_main_queue(), {
        // piece of code I want to run
    })
}