iOS Tap to Focus

device.focusPointOfInterest = focusPoint
device.focusMode = AVCaptureFocusMode.AutoFocus
device.exposurePointOfInterest = focusPoint
device.exposureMode = AVCaptureExposureMode.ContinuousAutoExposure

I don't know why this works, but it did.
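The likely reason the order matters is that setting the focus or exposure mode triggers an adjustment at the device's current point of interest, so the point has to be set before the mode. A fuller sketch in modern Swift syntax (an illustration, not the original answer's code; `device` is assumed to be your AVCaptureDevice, and `focusPoint` is assumed to already be normalized to the [0,1] range):

```swift
import AVFoundation

func focus(_ device: AVCaptureDevice, at focusPoint: CGPoint) {
    do {
        // The configuration lock is required before touching these properties.
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            // Point first, then mode: changing the mode starts a focus
            // operation at whatever point is currently set.
            device.focusPointOfInterest = focusPoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = focusPoint
            device.exposureMode = .continuousAutoExposure
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```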

iOS AVFoundation tap to focus

You have to map the touch point into the [0,1] range, using something like the following code:

CGRect screenRect = [[UIScreen mainScreen] bounds];
double screenWidth = screenRect.size.width;
double screenHeight = screenRect.size.height;
double focus_x = thisFocusPoint.center.x / screenWidth;
double focus_y = thisFocusPoint.center.y / screenHeight;

NSError *error = nil;
if ([[self captureManager].videoDevice lockForConfiguration:&error]) {
    [[self captureManager].videoDevice setFocusPointOfInterest:CGPointMake(focus_x, focus_y)];
    [[self captureManager].videoDevice unlockForConfiguration];
}

The documentation on this can be found in Apple's AV Foundation Programming Guide, in the Media Capture section, under Focus Modes:

If it’s supported, you set the focal point using focusPointOfInterest. You pass a CGPoint where {0,0} represents the top left of the picture area, and {1,1} represents the bottom right in landscape mode with the home button on the right—this applies even if the device is in portrait mode.
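Rather than dividing by the screen size by hand (which ignores rotation and the preview layer's videoGravity), AVCaptureVideoPreviewLayer can perform this conversion for you. A minimal sketch in modern Swift, assuming `previewLayer` is the AVCaptureVideoPreviewLayer showing the camera feed:

```swift
import AVFoundation
import UIKit

// Convert a tap in the preview layer's coordinate space into the
// normalized {0,0}-{1,1} space expected by focusPointOfInterest.
// The layer accounts for orientation and videoGravity automatically.
func normalizedPoint(for tap: UITapGestureRecognizer,
                     in previewLayer: AVCaptureVideoPreviewLayer) -> CGPoint {
    let layerPoint = tap.location(in: tap.view)
    return previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
}
```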

Set Camera Focus On Tap Point With Swift

Turns out it's very simple:

_currentDevice.lockForConfiguration(nil)
// NB: focusPointOfInterest expects normalized [0,1] coordinates, so convert
// the raw tap location first (previewLayer is your AVCaptureVideoPreviewLayer):
_currentDevice.focusPointOfInterest = previewLayer.captureDevicePointOfInterestForPoint(tap.locationInView(self))
_currentDevice.unlockForConfiguration()

Tap to focus, then return to auto focus when the content in the camera view changes, like the stock Camera app or UIImagePickerController on iOS?

This sounded very complex to me at first, but it turns out to be super simple: Apple already did 99% of the work for us. All you need to do is turn on subjectAreaChangeMonitoringEnabled and observe AVCaptureDeviceSubjectAreaDidChangeNotification (this is a regular notification, not key-value observing). From the iOS 6.1 docs:

The value of this property indicates whether the receiver should
monitor the video subject area for changes, such as lighting changes,
substantial movement, and so on. If subject area change monitoring is
enabled, the capture device object sends an
AVCaptureDeviceSubjectAreaDidChangeNotification whenever it detects a
change to the subject area, at which time an interested client may
wish to re-focus, adjust exposure, white balance, etc.

Before changing the value of this property, you must call
lockForConfiguration: to acquire exclusive access to the device’s
configuration properties. If you do not, setting the value of this
property raises an exception. When you are done configuring the
device, call unlockForConfiguration to release the lock and allow
other devices to configure the settings.

You can observe changes to the value of this property using key-value
observing.

(Even better, you don't need to handle many corner cases. What if the device is in the middle of adjustingFocus at a point of interest and the content changes? You don't want the device to fall back to auto focus at the center; you want the focus action to finish first. The subject-area-did-change notification is only posted after focusing completes.)
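In modern Swift the whole setup boils down to a few lines. A sketch under these assumptions: `device` is your AVCaptureDevice, and `continuousFocus(at:)` is a hypothetical helper that sets continuous autofocus at a normalized point:

```swift
import AVFoundation

func enableSubjectAreaMonitoring(on device: AVCaptureDevice) throws {
    // The configuration lock is required, or setting the property
    // raises an exception.
    try device.lockForConfiguration()
    device.isSubjectAreaChangeMonitoringEnabled = true
    device.unlockForConfiguration()

    NotificationCenter.default.addObserver(
        forName: .AVCaptureDeviceSubjectAreaDidChange,
        object: device,
        queue: .main
    ) { _ in
        // The scene changed after a tap-to-focus finished: fall back to
        // continuous autofocus at the center, like the stock Camera app.
        // continuousFocus(at: CGPoint(x: 0.5, y: 0.5))  // hypothetical helper
    }
}
```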

A sample code snippet from my project. (The structure follows Apple's official AVFoundation example, AVCam, so you can drop it in and try it out.)

// CameraCaptureManager.m

@property (nonatomic, strong) AVCaptureDevice *backFacingCamera;

- (id)init {
    self = [super init];
    if (self) {

        // TODO: more of your setup code for the AVFoundation capture session
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == AVCaptureDevicePositionBack) {
                self.backFacingCamera = device;
            }
        }

        NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];

        void (^subjectAreaDidChangeBlock)(NSNotification *) = ^(NSNotification *notification) {

            if (self.videoInput.device.focusMode == AVCaptureFocusModeLocked) {
                // All you need to do is set continuous focus at the center.
                // This is the same behavior as in the stock Camera app.
                [self continuousFocusAtPoint:CGPointMake(.5f, .5f)];
            }
        };

        self.subjectAreaDidChangeObserver = [notificationCenter addObserverForName:AVCaptureDeviceSubjectAreaDidChangeNotification
                                                                            object:nil
                                                                             queue:nil
                                                                        usingBlock:subjectAreaDidChangeBlock];

        [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
        [self addObserver:self forKeyPath:keyPathAdjustingFocus options:NSKeyValueObservingOptionNew context:NULL];
    }

    return self;
}

- (void)dealloc {
    // Remove the observer when done
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self.subjectAreaDidChangeObserver];
}

- (BOOL)setupSession {
    BOOL success = NO;

    if ([self.backFacingCamera lockForConfiguration:nil]) {
        // Turn on subject area change monitoring
        self.backFacingCamera.subjectAreaChangeMonitoringEnabled = YES;
        [self.backFacingCamera unlockForConfiguration];
    }

    // TODO: Set up and add inputs etc...

    return success;
}

Tap To Focus And Exposure Doesn't Work

You made minor mistakes in your code; the lines below should fix it.

cameraDevice.focusPointOfInterest = focusPoint
cameraDevice.focusMode = AVCaptureFocusMode.AutoFocus
cameraDevice.exposurePointOfInterest = focusPoint
cameraDevice.exposureMode = AVCaptureExposureMode.ContinuousAutoExposure

Create animation for tap to focus using SwiftyCams didFocusAtPoint function

The DemoSwiftyCam project on GitHub already has an implementation of this feature:

ViewController.swift

func swiftyCam(_ swiftyCam: SwiftyCamViewController, didFocusAtPoint point: CGPoint) {
    print("Did focus at point: \(point)")
    focusAnimationAt(point)
}

...

extension ViewController {

    // ...

    fileprivate func focusAnimationAt(_ point: CGPoint) {
        let focusView = UIImageView(image: #imageLiteral(resourceName: "focus")) // Image available in DemoSwiftyCam Assets.xcassets
        focusView.center = point
        focusView.alpha = 0.0
        view.addSubview(focusView)

        UIView.animate(withDuration: 0.25, delay: 0.0, options: .curveEaseInOut, animations: {
            focusView.alpha = 1.0
            focusView.transform = CGAffineTransform(scaleX: 1.25, y: 1.25)
        }) { (success) in
            UIView.animate(withDuration: 0.15, delay: 0.5, options: .curveEaseInOut, animations: {
                focusView.alpha = 0.0
                focusView.transform = CGAffineTransform(translationX: 0.6, y: 0.6)
            }) { (success) in
                focusView.removeFromSuperview()
            }
        }
    }
}

AVFoundation tap to focus feedback rectangle

Here's what I did. This is the class that creates the square shown when the user taps on the camera overlay.

CameraFocusSquare.h

#import <UIKit/UIKit.h>
@interface CameraFocusSquare : UIView
@end

CameraFocusSquare.m

#import "CameraFocusSquare.h"
#import <QuartzCore/QuartzCore.h>

const float squareLength = 80.0f;

@implementation CameraFocusSquare

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        [self setBackgroundColor:[UIColor clearColor]];
        [self.layer setBorderWidth:2.0];
        [self.layer setCornerRadius:4.0];
        [self.layer setBorderColor:[UIColor whiteColor].CGColor];

        CABasicAnimation *selectionAnimation = [CABasicAnimation animationWithKeyPath:@"borderColor"];
        selectionAnimation.toValue = (id)[UIColor blueColor].CGColor;
        selectionAnimation.repeatCount = 8;
        [self.layer addAnimation:selectionAnimation forKey:@"selectionAnimation"];
    }
    return self;
}

@end

And in the view where you receive your taps, do the following:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchPoint = [touch locationInView:touch.view];
    [self focus:touchPoint];

    if (camFocus) {
        [camFocus removeFromSuperview];
    }
    if ([[touch view] isKindOfClass:[FBKVideoRecorderView class]]) {
        camFocus = [[CameraFocusSquare alloc] initWithFrame:CGRectMake(touchPoint.x - 40, touchPoint.y - 40, 80, 80)];
        [camFocus setBackgroundColor:[UIColor clearColor]];
        [self addSubview:camFocus];
        [camFocus setNeedsDisplay];

        [UIView beginAnimations:nil context:NULL];
        [UIView setAnimationDuration:1.5];
        [camFocus setAlpha:0.0];
        [UIView commitAnimations];
    }
}

- (void)focus:(CGPoint)aPoint
{
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {
        AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo];
        if ([device isFocusPointOfInterestSupported] &&
            [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            double screenWidth = screenRect.size.width;
            double screenHeight = screenRect.size.height;
            double focus_x = aPoint.x / screenWidth;
            double focus_y = aPoint.y / screenHeight;
            if ([device lockForConfiguration:nil]) {
                [device setFocusPointOfInterest:CGPointMake(focus_x, focus_y)];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
                    [device setExposureMode:AVCaptureExposureModeAutoExpose];
                }
                [device unlockForConfiguration];
            }
        }
    }
}
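For comparison, the same feedback square is only a few lines in Swift. This is a sketch with illustrative names, not the original answer's code:

```swift
import UIKit

// A minimal white focus square that the caller fades out after showing.
final class CameraFocusSquare: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .clear
        layer.borderWidth = 2
        layer.cornerRadius = 4
        layer.borderColor = UIColor.white.cgColor
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

// In the tap handler: show the square centered on the tap, then fade it out.
func showFocusSquare(at point: CGPoint, in view: UIView) {
    let square = CameraFocusSquare(frame: CGRect(x: point.x - 40, y: point.y - 40,
                                                 width: 80, height: 80))
    view.addSubview(square)
    UIView.animate(withDuration: 1.5, animations: { square.alpha = 0 }) { _ in
        square.removeFromSuperview()
    }
}
```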

