ChromaKey Video in ARKit

Figured this out. I was setting my color to key out incorrectly (and even in the wrong place, facepalm), and there seems to be a bug that prevents the video from playing unless you delay it a bit. That bug was supposedly fixed, but that does not seem to be the case.

Here is my corrected and cleaned-up code, if anyone is interested (edited to include the tip from @mnuages):

// Get the video URL and create an AVPlayer
let filePath = Bundle.main.path(forResource: "VIDEO_FILE_NAME", ofType: "VIDEO_FILE_EXTENSION")
let videoURL = URL(fileURLWithPath: filePath!)
let player = AVPlayer(url: videoURL)

// Create a SceneKit node to hold the video
let videoNode = SCNNode()

// Set the node's geometry to a plane sized to the detected image, and rotate it to lie flat on the image
videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                              height: imageAnchor.referenceImage.physicalSize.height)
videoNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)

// Alpha transparency: use the chroma key material, with the AVPlayer as its diffuse contents
let chromaKeyMaterial = ChromaKeyMaterial()
chromaKeyMaterial.diffuse.contents = player
chromaKeyMaterial.isDoubleSided = true
videoNode.geometry!.materials = [chromaKeyMaterial]

// The video does not start without delaying the player;
// playing it immediately just logs: [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef)
DispatchQueue.main.asyncAfter(deadline: .now() + 0.001) {
    player.seek(to: CMTimeMakeWithSeconds(1, preferredTimescale: 1000))
    player.play()
}

// Loop the video
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
    player.seek(to: .zero)
    player.play()
}

// Add the videoNode to the ARAnchor's node
node.addChildNode(videoNode)

// Add the ARAnchor node to the root node of the scene
self.imageDetectView.scene.rootNode.addChildNode(node)
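
For context, a natural home for this snippet is the ARSCNViewDelegate callback that fires when the reference image is detected. A minimal sketch of that wrapper (when used this way, SceneKit adds the anchor's node to the scene for you, so the final addChildNode call above is unnecessary):

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    // ... build videoNode as above, using imageAnchor for the plane size ...
    node.addChildNode(videoNode)
}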

And here is the chroma key material:

import SceneKit
import UIKit

public class ChromaKeyMaterial: SCNMaterial {

    public var backgroundColor: UIColor {
        didSet { didSetBackgroundColor() }
    }

    public var thresholdSensitivity: Float {
        didSet { didSetThresholdSensitivity() }
    }

    public var smoothing: Float {
        didSet { didSetSmoothing() }
    }

    public init(backgroundColor: UIColor = .green, thresholdSensitivity: Float = 0.50, smoothing: Float = 0.001) {
        self.backgroundColor = backgroundColor
        self.thresholdSensitivity = thresholdSensitivity
        self.smoothing = smoothing

        super.init()

        didSetBackgroundColor()
        didSetThresholdSensitivity()
        didSetSmoothing()

        // The chroma key shader is based on GPUImage:
        // https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m
        let surfaceShader =
        """
        uniform vec3 c_colorToReplace;
        uniform float c_thresholdSensitivity;
        uniform float c_smoothing;

        #pragma transparent
        #pragma body

        vec3 textureColor = _surface.diffuse.rgb;

        // Convert the key color and the texture color to YCrCb
        float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
        float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
        float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

        float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
        float Cr = 0.7132 * (textureColor.r - Y);
        float Cb = 0.5647 * (textureColor.b - Y);

        // Alpha ramps from 0 to 1 as the chroma distance to the key color
        // moves from the threshold to threshold + smoothing
        float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

        _surface.transparent.a = blendValue;
        """

        shaderModifiers = [
            .surface: surfaceShader,
        ]
    }

    required public init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // Pass the key color to the shader as an RGB vector
    private func didSetBackgroundColor() {
        var red: CGFloat = 0, green: CGFloat = 0, blue: CGFloat = 0, alpha: CGFloat = 0
        backgroundColor.getRed(&red, green: &green, blue: &blue, alpha: &alpha)
        setValue(SCNVector3(Float(red), Float(green), Float(blue)), forKey: "c_colorToReplace")
    }

    private func didSetSmoothing() {
        setValue(smoothing, forKey: "c_smoothing")
    }

    private func didSetThresholdSensitivity() {
        setValue(thresholdSensitivity, forKey: "c_thresholdSensitivity")
    }
}
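
Since the material reads its backgroundColor property, other key colors work out of the box. A quick usage sketch, reusing the player and videoNode from the first snippet and keying out magenta instead of green:

// Key out magenta with a softer edge falloff
let magentaKeyMaterial = ChromaKeyMaterial(backgroundColor: .magenta,
                                           thresholdSensitivity: 0.15,
                                           smoothing: 0.05)
magentaKeyMaterial.diffuse.contents = player
magentaKeyMaterial.isDoubleSided = true
videoNode.geometry?.materials = [magentaKeyMaterial]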

Color keying video with GPUImage on a SCNPlane in ARKit

Try clearing the background and setting the scale mode:

backgroundColor = .clear 
scaleMode = .aspectFit
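
For context, those two properties presumably belong to the SKScene hosting the SKVideoNode; a minimal sketch (the file name and sizes are placeholders):

let videoNode = SKVideoNode(fileNamed: "video.mp4") // placeholder file name
let videoScene = SKScene(size: CGSize(width: 1280, height: 720))
videoScene.backgroundColor = .clear // otherwise the scene draws an opaque black backdrop
videoScene.scaleMode = .aspectFit   // fit the video without stretching
videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
videoScene.addChild(videoNode)

// Use the scene as the material contents of the SceneKit plane showing the video
let plane = SCNPlane(width: 0.2, height: 0.1) // sized to taste
plane.firstMaterial?.diffuse.contents = videoScene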

ARKit / SpriteKit - set pixelBufferAttributes to SKVideoNode or make transparent pixels in video (chroma-key effect) another way

The solution is quite simple!
All that needs to be done is to add the video as a child of an SKEffectNode and apply the filter to the SKEffectNode instead of the video itself (the AVVideoComposition is not necessary).
Here is the code I used:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    // Create and configure a node for the anchor added to the view's session.
    let bialikVideoNode = videoNodeWith(resourceName: "Tsina_05", ofType: "mp4")
    bialikVideoNode.size = CGSize(width: kDizengofVideoWidth, height: kDizengofVideoHeight)
    bialikVideoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    // Make the video background transparent using an SKEffectNode,
    // since the chroma key filter doesn't work when applied to the video node directly
    let effectNode = SKEffectNode()
    effectNode.addChild(bialikVideoNode)
    effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 120)

    return effectNode
}
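
The colorCubeFilterForChromaKey(hueAngle:) helper isn't shown above; here is a hedged sketch of how such a filter is commonly built with Core Image's CIColorCube, mapping hues near the key hue to transparent (the ±30° hue range and 64-point cube size are assumptions):

import CoreImage
import UIKit

// Builds a CIColorCube filter that makes hues within ±30° of the key hue transparent.
func colorCubeFilterForChromaKey(hueAngle: Float) -> CIFilter {
    let hueRange: Float = 60 // total degrees of hue to key out (assumed)
    let minHueAngle = (hueAngle - hueRange / 2) / 360
    let maxHueAngle = (hueAngle + hueRange / 2) / 360

    let size = 64 // cube dimension (assumed)
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var offset = 0
    for z in 0 ..< size {
        let blue = Float(z) / Float(size - 1)
        for y in 0 ..< size {
            let green = Float(y) / Float(size - 1)
            for x in 0 ..< size {
                let red = Float(x) / Float(size - 1)
                let hue = hueOf(red: red, green: green, blue: blue)
                let alpha: Float = (hue > minHueAngle && hue < maxHueAngle) ? 0 : 1
                // CIColorCube expects premultiplied RGBA
                cubeData[offset]     = red * alpha
                cubeData[offset + 1] = green * alpha
                cubeData[offset + 2] = blue * alpha
                cubeData[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let data = Data(bytes: cubeData, count: cubeData.count * MemoryLayout<Float>.size)
    return CIFilter(name: "CIColorCube", parameters: [
        "inputCubeDimension": size,
        "inputCubeData": data
    ])!
}

// Converts RGB to a normalized hue in [0, 1] using UIColor.
func hueOf(red: Float, green: Float, blue: Float) -> Float {
    var hue: CGFloat = 0
    _ = UIColor(red: CGFloat(red), green: CGFloat(green), blue: CGFloat(blue), alpha: 1)
        .getHue(&hue, saturation: nil, brightness: nil, alpha: nil)
    return Float(hue)
}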

The result came out exactly as needed.

ARKit add 2D Video flipped by X

I have fixed the issue with a temporary solution, but I am still looking for a better one.

The issue is with the wall: the wall is flipped to the other side of the camera when drawn from right to left. I figured this out by setting isDoubleSided = false and applying an image containing text as the diffuse contents, where I could see that the image itself was flipped.

I tried many things, but this is what worked for me:

  1. Normalize both vectors
  2. Take the cross product of the two SCNVector3s
  3. If the cross product's y component is greater than 0, swap the from and to values

Code

let normalizedTo = to.normalized()
let normalizedFrom = from.normalized()
// Cross product of the two normalized vectors; its y component tells us the winding
let crossProduct = normalizedTo.cross(normalizedFrom)

var from = from
var to = to
if crossProduct.y > 0 {
    // Swap the endpoints so the wall always faces the same way
    swap(&from, &to)
}

// Inside an extension of SCNVector3
extension SCNVector3 {

    func length() -> Float {
        return sqrtf(x * x + y * y + z * z)
    }

    func normalized() -> SCNVector3 {
        let len = length()
        guard len != 0 else { return self }
        return SCNVector3(x / len, y / len, z / len)
    }

    func cross(_ vec: SCNVector3) -> SCNVector3 {
        return SCNVector3(y * vec.z - z * vec.y,
                          z * vec.x - x * vec.z,
                          x * vec.y - y * vec.x)
    }
}

Hope it is helpful. If anyone knows a better solution, please post it.

How do you play a video with alpha channel using AVFoundation?

I've come up with two ways of making this possible. Both utilize surface shader modifiers. Detailed information on shader modifiers can be found in the Apple Developer documentation.

Here's an example project I've created.


1. Masking

  1. You would need to create another video that represents a transparency mask. In that video, black = fully opaque, white = fully transparent (or any other convention you like; you would just need to tweak the surface shader accordingly).

  2. Create an SKScene with this video just like you do in the code you provided, and put it into material.transparent.contents (the same material that you put the diffuse video contents into):

    let spriteKitOpaqueScene = SKScene(...)
    let spriteKitMaskScene = SKScene(...)
    ... // creating SKVideoNodes and AVPlayers for each video etc

    let material = SCNMaterial()
    material.diffuse.contents = spriteKitOpaqueScene
    material.transparent.contents = spriteKitMaskScene

    let background = SCNPlane(...)
    background.materials = [material]
  3. Add a surface shader modifier to the material. It is going to "convert" the black color from the mask video (well, actually the red component, since we only need one color channel) into alpha.

    let surfaceShader = "_surface.transparent.a = 1 - _surface.transparent.r;"
    material.shaderModifiers = [ .surface: surfaceShader ]

That's it! Now the white color on the masking video is going to be transparent on the plane.

However, you would have to take extra care to synchronize these two videos, since the AVPlayers will probably get out of sync. Sadly, I didn't have time to address that in my example project yet; I will get back to it when I have time. Look into this question for a possible solution.
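
One hedged way to do that synchronization (the player names here are illustrative, not from the project) is to start both players against the same host clock:

    import AVFoundation
    import CoreMedia

    // Start both players at the same host time so their timelines stay aligned.
    // Both player items should already be ready to play when this is called.
    func startInSync(_ opaquePlayer: AVPlayer, _ maskPlayer: AVPlayer) {
        opaquePlayer.automaticallyWaitsToMinimizeStalling = false
        maskPlayer.automaticallyWaitsToMinimizeStalling = false
        let hostTime = CMClockGetTime(CMClockGetHostTimeClock())
        opaquePlayer.setRate(1.0, time: .invalid, atHostTime: hostTime)
        maskPlayer.setRate(1.0, time: .invalid, atHostTime: hostTime)
    }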

Pros:

  • No artifacts (if synchronized)
  • Precise

Cons:

  • Requires two videos instead of one
  • Requires synchronization of the AVPlayers

2. Chroma keying

  1. You would need a video that has a vibrant background color to represent the parts that should be transparent. Usually green or magenta is used.

  2. Create an SKScene for this video like you normally would and put it into material.diffuse.contents.

  3. Add a chroma key surface shader modifier which will cut out the color of your choice and make those areas transparent. I've borrowed this shader from GPUImage; I don't really know exactly how it works internally, but it seems to be explained in this answer.

     let surfaceShader =
    """
    uniform vec3 c_colorToReplace = vec3(0, 1, 0);
    uniform float c_thresholdSensitivity = 0.05;
    uniform float c_smoothing = 0.0;

    #pragma transparent
    #pragma body

    vec3 textureColor = _surface.diffuse.rgb;

    float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
    float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
    float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

    float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
    float Cr = 0.7132 * (textureColor.r - Y);
    float Cb = 0.5647 * (textureColor.b - Y);

    float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

    float a = blendValue;
    _surface.transparent.a = a;
    """

    material.shaderModifiers = [ .surface: surfaceShader ]

    To set the uniforms, use the setValue(_:forKey:) method on the material.

    let vector = SCNVector3(x: 0, y: 1, z: 0) // float RGB components of the key color
    material.setValue(vector, forKey: "c_colorToReplace")
    material.setValue(0.3 as Float, forKey: "c_smoothing")
    material.setValue(0.1 as Float, forKey: "c_thresholdSensitivity")

    The as Float part is important; otherwise Swift will infer Double, and the shader will not be able to read the value.

    But to get precise masking out of this, you would have to really tinker with the c_smoothing and c_thresholdSensitivity uniforms. In my example project I ended up with a little green rim around the shape, but maybe I just didn't use the right values.

Pros:

  • only one video required
  • simple setup

Cons:

  • possible artifacts (green rim around the border)

How to apply chroma key filter with any color to live camera feed ios?

You've previously asked about my GPUImage framework, so I assume that you're familiar with it. Within that framework are two filters, a GPUImageChromaKeyFilter and a GPUImageChromaKeyBlendFilter. Both will key off of whatever color you specify via the -setColorToReplaceRed:green:blue: method, with a threshold set using the thresholdSensitivity property.

The former filter merely turns areas matching the color within the threshold to an alpha of 0, while the latter actually blends against another image or video source, based on which areas of the input image or video match. The FilterShowcase example application shows how to do this for green, but you can set the keying color to anything you want.
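
As a rough sketch of wiring that up in Swift (assuming the Objective-C GPUImage framework is bridged in; the exact bridged signatures may differ slightly from what's shown):

// Live camera -> chroma key filter -> on-screen view
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
                                 cameraPosition: .back)
camera?.outputImageOrientation = .portrait

let chromaKeyFilter = GPUImageChromaKeyFilter()
chromaKeyFilter.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0) // key out pure green
chromaKeyFilter.thresholdSensitivity = 0.4

let filteredView = GPUImageView(frame: UIScreen.main.bounds)
camera?.addTarget(chromaKeyFilter)
chromaKeyFilter.addTarget(filteredView)
camera?.startCapture()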


