iOS - How to Play a Video with Transparency

Play video with transparent background in iOS

Here is a link with a real solution to this problem: playing-movies-with-an-alpha-channel-on-the-ipad. The blog post discusses video on the iPad, but the same library with alpha channel support also works on the iPhone. It is also possible to encode H.264 with alpha channel support using the method described in the blog post's comments.

How do you play a video with alpha channel using AVFoundation?

I've come up with two ways of making this possible. Both use surface shader modifiers. Detailed information on shader modifiers can be found in the Apple Developer documentation.

Here's an example project I've created.


1. Masking

  1. You would need to create a second video that acts as a transparency mask. In that video, black = fully opaque and white = fully transparent (or any other convention you prefer; you would just need to adjust the surface shader accordingly).

  2. Create an SKScene with this video, just as in the code you provided, and assign it to material.transparent.contents (the same material whose diffuse contents hold the color video):

    let spriteKitOpaqueScene = SKScene(...)
    let spriteKitMaskScene = SKScene(...)
    ... // creating SKVideoNodes and AVPlayers for each video etc

    let material = SCNMaterial()
    material.diffuse.contents = spriteKitOpaqueScene
    material.transparent.contents = spriteKitMaskScene

    let background = SCNPlane(...)
    background.materials = [material]
  3. Add a surface shader modifier to the material. It is going to "convert" the black color from the mask video into alpha (actually the red component, since we only need one color channel):

    let surfaceShader = "_surface.transparent.a = 1 - _surface.transparent.r;"
    material.shaderModifiers = [ .surface: surfaceShader ]

That's it! Now the white color on the masking video is going to be transparent on the plane.
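Put together, the masking setup might look like the sketch below. This is a hedged sketch, not the author's example project: the scene size, the names colorVideoURL and maskVideoURL, and the y-flip (a common gotcha when using an SKScene as material contents, which you may or may not need) are all assumptions.

```swift
import SceneKit
import SpriteKit
import AVFoundation

// Builds an SKScene that renders a video through an SKVideoNode.
func makeVideoScene(url: URL, size: CGSize) -> (scene: SKScene, player: AVPlayer) {
    let player = AVPlayer(url: url)
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = size
    videoNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
    videoNode.yScale = -1 // SKScene contents often render flipped on an SCNMaterial; remove if not needed
    let scene = SKScene(size: size)
    scene.addChild(videoNode)
    return (scene, player)
}

let size = CGSize(width: 1280, height: 720) // assumed video resolution
let (opaqueScene, colorPlayer) = makeVideoScene(url: colorVideoURL, size: size) // colorVideoURL: your color video (assumed name)
let (maskScene, maskPlayer) = makeVideoScene(url: maskVideoURL, size: size)     // maskVideoURL: your mask video (assumed name)

let material = SCNMaterial()
material.diffuse.contents = opaqueScene
material.transparent.contents = maskScene
material.shaderModifiers = [.surface: "_surface.transparent.a = 1.0 - _surface.transparent.r;"]

let background = SCNPlane(width: 1.6, height: 0.9)
background.materials = [material]

colorPlayer.play()
maskPlayer.play()
```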

However, you will have to take extra care synchronizing these two videos, since the AVPlayers will probably drift out of sync. Sadly, I didn't have time to address that in my example project yet (I will get back to it when I have time). Look into this question for a possible solution.
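One way to reduce drift (a sketch, not the solution from the linked question) is to schedule both players to start at the same host clock time with setRate(_:time:atHostTime:), after disabling automatic waiting:

```swift
import AVFoundation
import CoreMedia

func startInSync(_ players: [AVPlayer]) {
    // Automatic stall-avoidance would let each player start on its own schedule.
    players.forEach { $0.automaticallyWaitsToMinimizeStalling = false }

    // Schedule every player to begin playback at the same host clock time,
    // slightly in the future so they all have time to prepare.
    let startTime = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                              CMTime(seconds: 0.5, preferredTimescale: 600))
    players.forEach { $0.setRate(1.0, time: .invalid, atHostTime: startTime) }
}

startInSync([colorPlayer, maskPlayer]) // the two AVPlayers for the color and mask videos (assumed names)
```

Note that setRate(_:time:atHostTime:) only schedules the start reliably once both player items report .readyToPlay, and long clips can still drift over time, so periodic correction may be needed.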

Pros:

  • No artifacts (if synchronized)
  • Precise

Cons:

  • Requires two videos instead of one
  • Requires synchronization of the AVPlayers

2. Chroma keying

  1. You would need a video whose background is a single vibrant color representing the parts that should be transparent. Green and magenta are the usual choices.

  2. Create an SKScene for this video like you normally would and assign it to material.diffuse.contents.

  3. Add a chroma-key surface shader modifier, which will cut out the color of your choice and make those areas transparent. I borrowed this shader from GPUImage and don't really know how it works, but it seems to be explained in this answer.

    let surfaceShader =
    """
    uniform vec3 c_colorToReplace = vec3(0, 1, 0);
    uniform float c_thresholdSensitivity = 0.05;
    uniform float c_smoothing = 0.0;

    #pragma transparent
    #pragma body

    vec3 textureColor = _surface.diffuse.rgb;

    float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
    float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
    float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

    float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
    float Cr = 0.7132 * (textureColor.r - Y);
    float Cb = 0.5647 * (textureColor.b - Y);

    float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

    float a = blendValue;
    _surface.transparent.a = a;
    """

    material.shaderModifiers = [ .surface: surfaceShader ]

    To set the uniforms, use the setValue(_:forKey:) method on the material:

    let vector = SCNVector3(x: 0, y: 1, z: 0) // represents float RGB components
    material.setValue(vector, forKey: "c_colorToReplace")
    material.setValue(0.3 as Float, forKey: "c_smoothing")
    material.setValue(0.1 as Float, forKey: "c_thresholdSensitivity")

    The as Float part is important: otherwise Swift will treat the value as a Double and the shader will not be able to use it.

    To get precise masking out of this, though, you will have to tinker with the c_smoothing and c_thresholdSensitivity uniforms. In my example project I ended up with a little green rim around the shape, but maybe I just didn't use the right values.
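Pulling the chroma-key pieces together, a minimal sketch (the URL name, scene size, and uniform values are assumptions; surfaceShader is the string from step 3):

```swift
import SceneKit
import SpriteKit
import AVFoundation

let player = AVPlayer(url: greenScreenURL) // greenScreenURL: your green-screen clip (assumed name)
let videoScene = SKScene(size: CGSize(width: 1280, height: 720))
let videoNode = SKVideoNode(avPlayer: player)
videoNode.size = videoScene.size
videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
videoScene.addChild(videoNode)

let material = SCNMaterial()
material.diffuse.contents = videoScene
material.shaderModifiers = [.surface: surfaceShader] // the chroma-key shader from step 3
material.setValue(SCNVector3(x: 0, y: 1, z: 0), forKey: "c_colorToReplace") // key out pure green
material.setValue(0.1 as Float, forKey: "c_thresholdSensitivity")
material.setValue(0.3 as Float, forKey: "c_smoothing")

let plane = SCNPlane(width: 1.6, height: 0.9)
plane.materials = [material]
player.play()
```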

Pros:

  • only one video required
  • simple setup

Cons:

  • possible artifacts (green rim around the border)

Cannot play transparent video in iOS AVPlayer, what can I do?

Your video needs to be encoded with an alpha channel. ProRes 4444 supports that, as does the more recent HEVC-with-alpha format.
Also remember that the transparent area will take on the color of the window or view behind it. If you put your clip into an AVPlayerViewController, it will have a black background.
Essentially, if you don't encode the alpha channel into the video, as in your example, you would need a CIImage filter to do live "green screening", which requires far more processing power.
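If you control the encoding pipeline, HEVC with alpha can be produced on-device from an alpha-capable source such as ProRes 4444. A hedged sketch using the iOS 13+ export preset (the input and output URL names are assumptions):

```swift
import AVFoundation

let asset = AVAsset(url: proResURL) // proResURL: a source that already contains alpha (assumed name)
guard let export = AVAssetExportSession(
    asset: asset,
    presetName: AVAssetExportPresetHEVCHighestQualityWithAlpha // available on iOS 13+
) else { fatalError("HEVC-with-alpha preset unavailable on this device/OS") }

export.outputFileType = .mov
export.outputURL = outputURL // destination .mov (assumed name)
export.exportAsynchronously {
    if export.status == .completed {
        // The resulting .mov should play with transparency in an AVPlayerLayer
        // whose backgroundColor is clear (iOS 13+).
        print("Exported HEVC with alpha")
    }
}
```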

Using AVPlayer to view transparent video

The iOS SDK does not properly support alpha channel video playback. That applies to AVFoundation as well as the MediaPlayer framework. Video material that contains an alpha channel will not work as you expect it to when using Apple's API.

And as you actually show in your code, AVPlayer does not use a UIView as its surface for playing videos but a subclass of CALayer, AVPlayerLayer.

You will need to rethink your application design or choose a different playback SDK.


