Stream AWS S3 HLS Videos in iOS Browsers

Streaming S3 private videos

Found a way to do it: you generate a pre-signed URL to access the file.

// AWS iOS SDK v1: build a pre-signed URL request for the private object
S3GetPreSignedURLRequest *request = [[S3GetPreSignedURLRequest alloc] init];
request.key = fileName;          // object key within the bucket
request.bucket = self.bucket;    // bucket name
[request setExpires:[NSDate dateWithTimeIntervalSinceNow:3600]]; // URL valid for 1 hour

This only works with version 1 of the AWS iOS SDK, not version 2, which replaced this API with AWSS3PreSignedURLBuilder.
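For reference, version 2 of the SDK exposes the same capability through AWSS3PreSignedURLBuilder. A minimal Swift sketch, assuming the SDK's service configuration is already registered and using placeholder bucket/key names:

import AWSS3

// Assumes AWSServiceManager has already been configured with credentials.
let request = AWSS3GetPreSignedURLRequest()
request.bucket = "my-bucket"                        // placeholder bucket name
request.key = "videos/video1.mp4"                   // placeholder object key
request.httpMethod = .GET
request.expires = Date(timeIntervalSinceNow: 3600)  // valid for 1 hour

AWSS3PreSignedURLBuilder.default().getPreSignedURL(request).continueWith { task in
    if let error = task.error {
        print("Failed to generate pre-signed URL: \(error)")
    } else if let url = task.result {
        print("Pre-signed URL: \(url)")  // hand this URL to your player
    }
    return nil
}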

How can I stream MP4 videos from S3 without AVPlayer downloading the files before playing them?

TL;DR

AVPlayer does not support 'streaming' (HTTP range requests) as you would define it, so either use an alternative video player that does, or use a real media streaming protocol like HLS, which is supported by AVPlayer & would start the video before downloading it all.

CloudFront is great for delivery in general but is not strictly needed - you may have seen it mentioned because of CloudFront RTMP distributions, but those have since been discontinued.



Detailed Answer

S3 supports a concept called byte-range fetches using HTTP range requests - you can verify this by making a HEAD request to your video file & checking that the Accept-Ranges header exists with a value of bytes (rather than none).
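As a quick sanity check, here's a sketch of that HEAD request using URLSession (the object URL is a placeholder):

import Foundation

// Send a HEAD request and inspect the Accept-Ranges header.
var headRequest = URLRequest(url: URL(string: "https://my-bucket.s3.eu-west-1.amazonaws.com/videos/video1.mp4")!)
headRequest.httpMethod = "HEAD"

URLSession.shared.dataTask(with: headRequest) { _, response, _ in
    if let http = response as? HTTPURLResponse {
        // S3 returns "bytes" here, meaning byte-range fetches are supported.
        print("Accept-Ranges: \(http.value(forHTTPHeaderField: "Accept-Ranges") ?? "none")")
    }
}.resume()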

Load your MP4 file in the browser & notice that it can start as soon as you click play. You're also able to move to the end of the video file and yet, you haven't really downloaded the entire video file. HTTP range requests are what allow this mechanism to work. Small chunks of the video can be downloaded as & when the user gets to that part of the video. This saves the file server & the user bandwidth while providing a much better user experience than the client downloading the entire file.

The server would need to support byte-range fetches in the first instance before the client can then decide to make range requests (or not to). The key is that, once the server supports it, it is up to the HTTP client to decide whether it wants to fetch the data in chunks or all in one go.

This isn't really 'streaming' as you know it & are referring to in your question but it is more 'downloading the video from the server in chunks and playing it back' using HTTP 206 Partial Content responses.
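You can reproduce one of those chunked fetches by hand - a sketch that requests only the first 1 MB of the file (placeholder URL again):

import Foundation

// Ask for bytes 0..1048575 only; a server that supports byte-range
// fetches replies with 206 Partial Content rather than 200.
var rangeRequest = URLRequest(url: URL(string: "https://my-bucket.s3.eu-west-1.amazonaws.com/videos/video1.mp4")!)
rangeRequest.setValue("bytes=0-1048575", forHTTPHeaderField: "Range")

URLSession.shared.dataTask(with: rangeRequest) { data, response, _ in
    if let http = response as? HTTPURLResponse {
        print("Status: \(http.statusCode)")           // expect 206, not 200
        print("Bytes received: \(data?.count ?? 0)")  // ~1 MB, not the whole file
    }
}.resume()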

You can see this in the Network tab of your browser as a series of multiple 206 responses when seeking in the video. The entire video is not downloaded, but it is streamed from whichever position you skip to.

[Image: multiple HTTP 206 Partial Content responses in Google Chrome's Network tab when loading a video in the browser and moving to random points in it, highlighting HTTP range requests]



The problem with AVPlayer

Unfortunately, AVPlayer does not support 'streaming' using HTTP range requests & HTTP 206 Partial Content responses. I've verified this manually by creating a demo iOS app in Xcode.

This has nothing to do with S3 - if you stored these files on any other cloud provider or file server, you'd see that the file is still fully loaded before playing.



The possible solutions

Now that the problem is clear, there are two solutions.

Using an alternative video player

The easiest solution is to use an alternative video player which does support byte-range fetches. I'm not an expert in iOS development so I sadly can't recommend one myself, but I'm sure there'll be a popular library that the industry prefers over the built-in AVPlayer. This would give you your (extremely common) definition of 'streaming'.
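Purely as an illustration (an assumption on my part, not an endorsement), MobileVLCKit is one widely used third-party player that handles range requests, and a sketch with it would look something like this:

import MobileVLCKit
import UIKit

// Sketch only: any player that supports byte-range fetches would do.
final class VLCPlayerViewController: UIViewController {
    private let player = VLCMediaPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Placeholder URL - use your (pre-signed) S3 object URL here.
        let url = URL(string: "https://my-bucket.s3.eu-west-1.amazonaws.com/videos/video1.mp4")!
        player.media = VLCMedia(url: url)
        player.drawable = view  // render the video into this view controller's view
        player.play()
    }
}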

Using a video streaming protocol

However, if you must use AVPlayer, the solution is to implement true media streaming with a video streaming protocol - true streaming also lets you leverage features like adaptive bitrate switching, alternate audio tracks, licensing (DRM) etc.

There are quite a few of these protocols available like DASH (Dynamic Adaptive Streaming over HTTP), SRT (Secure Reliable Transport) & last but not least, HLS (HTTP Live Streaming).

Today, the most widely used streaming protocol on the internet is HLS, created by Apple themselves (hey, maybe the reason to not support range requests is to force you to use the protocol). Apple's own documentation is really wonderful for delving deeper if you are interested.

Without getting too much into protocol detail, HLS allows playback to start more quickly in general, makes fast-forwarding much quicker & delivers video as it is being watched, for a true streaming experience.

To go ahead with HLS:

  1. Use AWS Elemental MediaConvert to convert your MP4 file to HLS format - the resulting output will be 1 (or more) .M3U8 manifest files in addition to .ts media segment file(s)

  2. Upload the resulting output to S3

  3. Point AVPlayer to the .M3U8 file

import AVFoundation

// AVURLAsset takes a URL value, not a String
let url = URL(string: "https://ermiya.s3.eu-west-1.amazonaws.com/videos/video1playlist.m3u8")!
let asset = AVURLAsset(url: url)
let item = AVPlayerItem(asset: asset)
...

  4. Enjoy near-instant loading of the video
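Tying the steps together, a minimal playback sketch (using the same manifest URL as step 3; UI and error handling omitted):

import AVFoundation
import UIKit

final class HLSPlayerViewController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Point AVPlayer at the .m3u8 manifest uploaded to S3.
        let url = URL(string: "https://ermiya.s3.eu-west-1.amazonaws.com/videos/video1playlist.m3u8")!
        let player = AVPlayer(playerItem: AVPlayerItem(asset: AVURLAsset(url: url)))
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        view.layer.addSublayer(layer)
        player.play()
        self.player = player  // keep a strong reference so playback continues
    }
}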


CloudFront

In regards to Amazon CloudFront, it isn't required per se & S3 is sufficient in this case, but a quick Google search will surface loads of benefits that it provides, especially caching, which can help you save on S3 costs later on.



Conclusion

I would go with converting to HLS if you can, as it will yield more possibilities down the line & is a better true streaming experience in general, but given AVPlayer's restrictions on iOS, using an alternative video player will work just as well.

Whether to use CloudFront or not will depend on your user base, your usage of S3 and other factors.

As you're creating an MVP, I would recommend just doing a batch conversion of your MP4 files to HLS format & not using CloudFront which would add additional complexity to your cloud configuration.

HLS video not playing in Safari due to CloudFront

Turns out it was a standard camera-encoding color space issue: Apple does not support one of the color primaries (BT.2020) used by our media file.

[Image: file metadata showing BT.2020 color primaries - NOT SUPPORTED]

[Image: file metadata showing BT.709 color primaries - SUPPORTED]

So, to work around this issue, we downgraded the HDR video to SDR, which converted the BT.2020 color space to BT.709 - it then worked in both iOS and Safari.

How does Facebook stream MPEG-DASH videos in iOS browsers?

Safari on iOS does not support Media Source Extensions (MSE). Anyone doing adaptive streaming on that platform is using the native HLS implementation in Safari:


<video src="http://example.com/manifest.m3u8"></video>

Both DASH and HLS are just text-based manifests pointing to video files, so you wouldn't convert a DASH file to an HLS file; you would generate them independently. Now that iPhones support fragmented MP4s, you can use the same video files for both DASH and HLS, whereas before you needed different files for each.
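To make the 'text-based manifest' point concrete, a minimal HLS media playlist referencing fragmented MP4 segments might look like this (file names are placeholders):

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.0,
segment1.m4s
#EXTINF:6.0,
segment2.m4s
#EXT-X-ENDLIST

A DASH manifest (.mpd) for the same content would be a separate XML file pointing at the same init.mp4 and .m4s segments.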


