iOS: AVPlayer - getting a snapshot of the current frame of a video

AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer rather than a UIImage, but converting one to the other is straightforward (see the sketch after the code below).

This answer is mostly cribbed from here

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addOutput:self.playerOutput];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.frame;
[self.view.layer addSublayer:playerLayer];

[self.player play];
}

- (IBAction)grabFrame {
CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
NSLog(@"The image: %@", buffer);
}

- (void)viewDidLoad {
[super viewDidLoad];

NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];

[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

NSError* error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
if (status == AVKeyValueStatusLoaded)
{
dispatch_async(dispatch_get_main_queue(), ^{
[self setupPlayerWithLoadedAsset:asset];
});
}
else
{
NSLog(@"%@ Failed to load the tracks.", self);
}
}];
}

@end
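
If you need a UIImage rather than the raw CVPixelBuffer, the usual route is through Core Image. A minimal Swift sketch, assuming the 32BGRA buffer produced by copyPixelBufferForItemTime above (the helper name image(from:) is mine):

import UIKit
import CoreImage

// Hypothetical helper: wraps the pixel buffer in a CIImage and renders
// it to a CGImage, which UIImage can take directly.
func image(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(pixelBuffer),
                      height: CVPixelBufferGetHeight(pixelBuffer))
    guard let cgImage = context.createCGImage(ciImage, from: rect) else { return nil }
    return UIImage(cgImage: cgImage)
}

(In real code you would cache the CIContext rather than creating one per frame; it is expensive to construct.)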

Accessing Individual Frames using AVPlayer

You can enumerate the frames of your video serially using AVAssetReader, like this:

import AVFoundation

let asset = AVAsset(url: inputUrl)
let reader = try! AVAssetReader(asset: asset)

let videoTrack = asset.tracks(withMediaType: .video)[0]

// read video frames as BGRA
let trackReaderOutput = AVAssetReaderTrackOutput(
    track: videoTrack,
    outputSettings: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])

reader.add(trackReaderOutput)
reader.startReading()

while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
    print("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
    if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        // process each CVPixelBuffer here
        // see CVPixelBufferGetWidth, CVPixelBufferLockBaseAddress, CVPixelBufferGetBaseAddress, etc.
    }
}

Random access is more complicated. You could use an AVPlayer plus an AVPlayerItemVideoOutput to get frames from any time t with copyPixelBufferForItemTime, as described in the answer above, but the subtlety lies in how you choose that t.
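
For concreteness, here is a seek-then-copy sketch of that random-access path in Swift, assuming a player, playerItem, and playerOutput already wired up as in the answer above:

import AVFoundation

let t = CMTime(seconds: 12.5, preferredTimescale: 600) // hypothetical time of interest

// Zero tolerance so the player lands as close to t as the stream allows
player.seek(to: t, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
    guard finished else { return }
    let itemTime = playerItem.currentTime()
    if playerOutput.hasNewPixelBuffer(forItemTime: itemTime),
       let buffer = playerOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
        // buffer holds the decoded frame at (approximately) time t
    }
}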

If you want to sample the video at uniform intervals, that's easy; but if you want to land on the same frames/presentation timestamps that the serial AVAssetReader code sees, you will probably have to preprocess the file with AVAssetReader to build a frame number -> presentation timestamp map. That pass can be fast if you skip decoding by passing nil output settings to AVAssetReaderTrackOutput.
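
A sketch of that preprocessing pass, again in Swift. With nil output settings the reader vends samples in their stored format, so nothing is decoded; since passthrough samples arrive in decode order, the timestamps are sorted into presentation order at the end (the function name presentationTimes(for:) is mine):

import AVFoundation

func presentationTimes(for asset: AVAsset) throws -> [CMTime] {
    let reader = try AVAssetReader(asset: asset)
    let videoTrack = asset.tracks(withMediaType: .video)[0]
    // nil outputSettings = passthrough: no decode, so this pass is fast
    let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
    reader.add(output)
    reader.startReading()

    var times: [CMTime] = []
    while let sample = output.copyNextSampleBuffer() {
        times.append(CMSampleBufferGetPresentationTimeStamp(sample))
    }
    return times.sorted() // decode order -> presentation order
}

The index into the returned array is your frame number, and the element at that index is the t to hand to copyPixelBufferForItemTime.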


