FairPlay Streaming: Calling copyPixelBufferForItemTime on AVPlayerItemVideoOutput returns NULL

After researching this issue further, I have come to the conclusion that Apple engineered AVPlayer in such a way that, once you use FairPlay-protected HLS, the only exit point for the decrypted data, copyPixelBufferForItemTime (see A below), always returns nil.

  • I have tested the exact same stream without encryption and copyPixelBufferForItemTime returns a reference to the pixel buffer as expected.
  • I have tested the exact same stream with encryption on an AVPlayerLayer and it displays the video as expected.

It appears that once you use FairPlay, the only way to display your protected video content is through an AVPlayerLayer. As of today, there appears to be no way to retrieve FairPlay-protected HLS media through Apple's APIs in order to display it on an OpenGL texture in 3D space, for example.

A: copyPixelBufferForItemTime is the only exit point, because calling renderInContext on an AVPlayerLayer doesn't work.
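
For what it's worth, here is a minimal sketch of how that conclusion plays out in code (the helper name and fallback logic are mine, assuming a view controller context; this is not anything from Apple's API):

// Hypothetical helper: probe whether the video output vends pixel buffers.
// For FairPlay-protected HLS items, copyPixelBufferForItemTime returns NULL,
// so the only remaining option is to let an AVPlayerLayer render the video.
- (void)probeVideoOutput:(AVPlayerItemVideoOutput *)output
                 forItem:(AVPlayerItem *)item
                  player:(AVPlayer *)player {
    CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:item.currentTime
                                               itemTimeForDisplay:NULL];
    if (buffer != NULL) {
        // Unprotected stream: the pixels are accessible (e.g. to upload to an OpenGL texture).
        CVBufferRelease(buffer);
    } else {
        // Protected (FairPlay) stream: fall back to on-screen rendering via AVPlayerLayer.
        AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
        layer.frame = self.view.bounds;
        [self.view.layer addSublayer:layer];
    }
}

Note that the output can legitimately return NULL for a short while after playback starts, even for unprotected streams, so in real code you would only treat a persistent NULL as meaning the content is protected.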

FairPlay stops working on iOS 12.4 and 13

I've finally found the root cause of my problem. My old code had the following line:

resourceLoader.preloadsEligibleContentKeys = YES;

and it worked fine before.

But starting with iOS 12.4, it breaks playback of encrypted streams in my tests. Without setting this flag, everything works fine: playing online and offline content, and downloading. It's odd, but this flag is effectively out of use on newer iOS versions.
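
For context, here is a minimal sketch of how the flag can be limited to older systems if you still need the preloading behaviour there (the iOS 12.4 cutoff follows from the behaviour described above, and streamURL / keyRequestQueue are placeholder names):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:streamURL options:nil];   // streamURL: placeholder
[asset.resourceLoader setDelegate:self queue:keyRequestQueue];            // keyRequestQueue: placeholder

if (@available(iOS 12.4, *)) {
    // Do not set preloadsEligibleContentKeys: in my tests it breaks
    // playback of encrypted streams on iOS 12.4 and 13.
} else {
    asset.resourceLoader.preloadsEligibleContentKeys = YES;
}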

iOS: AVPlayer - getting a snapshot of the current frame of a video

AVPlayerItemVideoOutput works fine for me with an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that conversion (see the sketch after the code below).

This answer mostly cribbed from here

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addOutput:self.playerOutput];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.frame;
[self.view.layer addSublayer:playerLayer];

[self.player play];
}

- (IBAction)grabFrame {
CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
NSLog(@"The image: %@", buffer);
}

- (void)viewDidLoad {
[super viewDidLoad];

NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];

[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

NSError* error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
if (status == AVKeyValueStatusLoaded)
{
dispatch_async(dispatch_get_main_queue(), ^{
[self setupPlayerWithLoadedAsset:asset];
});
}
else
{
NSLog(@"%@ Failed to load the tracks.", self);
}
}];
}

@end
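
As noted above, this produces a CVPixelBuffer rather than a UIImage. A minimal sketch of one common conversion using Core Image (not part of the original answer; the method name is mine):

#import <CoreImage/CoreImage.h>

// Convert a CVPixelBuffer (as returned by copyPixelBufferForItemTime) into a UIImage.
- (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)buffer {
    if (buffer == NULL) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *context = [CIContext contextWithOptions:nil];   // consider caching this in real code
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(buffer),
                             CVPixelBufferGetHeight(buffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}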

Creating FairPlay Streaming (FPS) streams without using mediafilesegmenter

I found a tool that does this fantastically: Bento4

https://www.bento4.com/

The docs are top-notch and the output is perfect.
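
For reference, the relevant Bento4 tool is mp42hls. Below is a rough sketch of a FairPlay-style (SAMPLE-AES) invocation; the flag names are taken from my reading of the Bento4 docs and may differ between versions, and the key and key URI are placeholders, so check mp42hls --help before relying on this:

mp42hls --encryption-mode SAMPLE-AES \
        --encryption-key 000102030405060708090a0b0c0d0e0f101112131415161718191a1b1c1d1e1f \
        --encryption-iv-mode fps \
        --encryption-key-format com.apple.streamingkeydelivery \
        --encryption-key-uri "skd://placeholder-key-id" \
        input.mp4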


