Core Image filter CISourceOverCompositing not appearing as expected with alpha overlay


After a lot of back and forth trying different things (thanks @andy and @Juraj Antas for pushing me in the right direction), I finally have the answer.

Drawing into a Core Graphics context produces the correct appearance, but it requires more resources to draw images that way. The problem seemed to be with CISourceOverCompositing, but it actually lies in the fact that, by default, Core Image filters work in linear sRGB space, whereas Core Graphics works in perceptual sRGB space, which explains the different results: sRGB is better at preserving dark blacks, while linear sRGB is better at preserving bright whites. So the original code is fine; you just need to output the image in a different way to get a different appearance.

You could create a Core Graphics image from the Core Image filter using a Core Image context that performs no color management. This essentially causes it to interpret the color values "incorrectly" as device RGB (since that's the default for no color management), which can cause red from the standard color range to appear as even more red from the wide color range for example. But this addresses the original concern with alpha compositing.

let ciContext = CIContext(options: [.workingColorSpace: NSNull()])
let outputCGImage = ciContext.createCGImage(outputCIImage, from: outputCIImage.extent)

It is probably more desirable to keep color management enabled and specify the working color space to be sRGB. This too resolves the issue and results in "correct" color interpretation. Note if the image being composited were Display P3, you'd need to specify displayP3 as the working color space to preserve the wide colors.

let workingColorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
let ciContext = CIContext(options: [.workingColorSpace: workingColorSpace])
let outputCGImage = ciContext.createCGImage(outputCIImage, from: outputCIImage.extent)

Trouble generating a mask image; CISourceOverCompositing not working as expected

Unfortunately, the source code found in the Anonymous Faces Filter Recipe did not compile or work right out of the box. The issue with the CISourceOverCompositing filter lies with the inputColor1 parameter and its alpha channel. Replace the inputColor1 color with the following and you should be good to go:

@"inputColor1", [CIColor colorWithRed:0.0 green:0.0 blue:0.0 alpha:0.0]
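For context, the recipe builds its face mask from a CIRadialGradient, and inputColor1 is the gradient's outer color; if that color is opaque, the composite covers the whole frame. A hedged Swift sketch of the corrected gradient follows (the radius values are placeholders, not taken from the recipe):

```swift
import CoreImage

// Sketch of the corrected mask gradient, assuming the CIRadialGradient-based
// mask from the recipe. The outer color (inputColor1) must be fully
// transparent so CISourceOverCompositing leaves the unmasked area untouched.
let gradient = CIFilter(name: "CIRadialGradient", parameters: [
    "inputRadius0": 50, // placeholder radius
    "inputRadius1": 60, // placeholder radius
    "inputColor0": CIColor(red: 0, green: 1, blue: 0, alpha: 1),
    "inputColor1": CIColor(red: 0, green: 0, blue: 0, alpha: 0) // transparent outer color
])
```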

Alpha blending for frame averaging in Core Image

You can accomplish the same result more simply with the CIMix filter, replacing the combination of CIColorMatrix and CISourceOverCompositing, like this:

    func makeCompositeImage(stackImage: CIImage, newImage: CIImage?, count: Double) -> CIImage {
        let opacity = 1.0 / count
        return newImage?
            .applyingFilter("CIMix", parameters: [
                kCIInputBackgroundImageKey: stackImage,
                kCIInputAmountKey: opacity
            ]) ?? stackImage
    }
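For example, a running average over a captured frame sequence might look like this (a sketch, assuming `frames` is a non-empty array of CIImage):

```swift
var stack = frames[0]
for (index, frame) in frames.dropFirst().enumerated() {
    // The second frame gets count 2 (opacity 1/2), the third count 3, and so on,
    // so each frame contributes equally to the accumulated average.
    stack = makeCompositeImage(stackImage: stack, newImage: frame,
                               count: Double(index + 2))
}
```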

Please check out this app I just published: https://apps.apple.com/us/app/filter-magic/id1594986951. It lets you play with every single filter out there.

How can I combine two CIImages with alpha?

You should use CISourceOverCompositing instead of CISourceInCompositing.

Definition of CISourceInCompositing:

Uses the background image to define what to leave in the input image, effectively cropping the input image.

Definition of CISourceOverCompositing:

Places the input image over the input background image.

See information and sample outputs for other CoreImage composite operations here: https://developer.apple.com/library/archive/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CISourceOverCompositing
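Putting it together, a minimal sketch assuming `foreground` (an image with an alpha channel) and `background` are existing CIImages:

```swift
let composited = foreground.applyingFilter("CISourceOverCompositing",
    parameters: [kCIInputBackgroundImageKey: background])

// Equivalently, CIImage provides a convenience method for this operation:
// let composited = foreground.composited(over: background)
```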


