iOS 10 Breaks Custom CIFilter

It looks like some part (the player layer?) of the iOS 10 (device) pipeline has switched to YUV.

Setting your AVPlayerLayer's pixelBufferAttributes to BGRA fixes the lack of alpha and silences the logged error:

layer.pixelBufferAttributes = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]

Using custom CIFilter on CALayer shows no change to CALayer

As @DonMag points out, it should have worked with the changes he described. However, unfortunately, we heard back from Apple today:

At this time, there is a bug preventing custom CIFilters on a CALayer from working. There is no workaround for this bug at this time.

Once we have filed the bug, I will add the link here for those interested. But at this time you cannot add a custom CIFilter to a CALayer on macOS 11.

Let’s hope they fix it, for all of you reading this in search of a solution.


EDIT:

So, bad news... as of macOS 12.2.1 the issue persists, and nothing has happened with our ticket. It doesn't seem like Apple wants to fix this. For those of you out there looking: this still does NOT work on a CALayer, even with all the options enabled as described in the other answers. A built-in CIFilter works as expected.

Note that using the same custom CIFilter on a CALayer for an export using AVVideoCompositionCoreAnimationTool does work!
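As a rough illustration of the export path that does work, here is a minimal sketch of wiring a filtered CALayer into a video composition. The names (asset, overlayLayer) are placeholders, and attaching the custom CIFilter to the overlay layer is assumed to happen elsewhere:

```swift
import AVFoundation
import QuartzCore

// Sketch: composing a CALayer (which may carry a custom CIFilter)
// into an export via AVVideoCompositionCoreAnimationTool.
func makeComposition(for asset: AVAsset, overlayLayer: CALayer) -> AVMutableVideoComposition {
    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)

    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: videoComposition.renderSize)
    videoLayer.frame = parentLayer.frame
    overlayLayer.frame = parentLayer.frame

    parentLayer.addSublayer(videoLayer)   // the video frames are rendered here
    parentLayer.addSublayer(overlayLayer) // the layer carrying the custom CIFilter

    // The animation tool renders the layer tree (including layer filters)
    // into each exported frame.
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer,
        in: parentLayer
    )
    return videoComposition
}
```

The resulting composition would then be assigned to an AVAssetExportSession's videoComposition before exporting.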

CIFilter applied to an image not working - swift

It's in the comments, but here's the full (and better formatted) answer on how to set up a call to CIHueAdjust, using Swift 4:

let filter = CIFilter(name: "CIHueAdjust")
let context = CIContext()

@IBOutlet weak var img: UIImageView!

override func viewDidLoad() {
    super.viewDidLoad()

    let ciImage = CIImage(image: img.image!)

    // Note: you may use kCIInputImageKey for inputImage
    filter?.setValue(ciImage, forKey: "inputImage")
    filter?.setValue(Float(1), forKey: "inputAngle")
    let result = filter?.outputImage

    let image = UIImage(cgImage: context.createCGImage(result!, from: result!.extent)!)
    img.image = image
}

How to create simple custom filter for iOS using Core Image Framework?

OUTDATED

You can't create your own custom kernels/filters in iOS yet. See http://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/CoreImaging/ci_intro/ci_intro.html, specifically:

Although this document is included in the reference library, it has not been updated in detail for iOS 5.0. A forthcoming revision will detail the differences in Core Image on iOS. In particular, the key difference is that Core Image on iOS does not include the ability to create custom image filters.

(Emphasis mine)
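For readers landing here today: this limitation was lifted in iOS 8, which added CIKernel and CIColorKernel for writing custom filters in the Core Image Kernel Language (and, since iOS 11, in Metal, as shown in the last answer on this page). A minimal sketch of a custom color kernel, assuming nothing beyond the standard Core Image API:

```swift
import CoreImage

// Sketch of a custom filter that inverts RGB, written with CIColorKernel
// (available since iOS 8, i.e. not possible at the time of the answer above).
class InvertFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    static let kernel = CIColorKernel(source:
        "kernel vec4 invert(__sample s) { return vec4(1.0 - s.rgb, s.a); }"
    )!

    override var outputImage: CIImage? {
        guard let input = inputImage else { return nil }
        return InvertFilter.kernel.apply(extent: input.extent, arguments: [input])
    }
}
```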

CIFilter GaussianBlur seems to be broken on iOS9.x (used with SKEffectNode)

I fiddled a bit more with this and found a solution...

First of all, it seems that using odd numbers for the blur radius causes the entire node to be rendered with an offset (for reasons unclear), so using 10, for example, fixed the offset issue.

Secondly, it seems that the blur is cropped, because the rendered sprite defines the node's extent and a blur effect needs extra space around it. So I use a transparent sprite to provide that extra space, and the following code snippet now works:

let glowEffectNode = SKEffectNode()
glowEffectNode.shouldRasterize = true

let glowBackgroundSize = CGSize(width: barSize.width + 60, height: barSize.height + 60)
let glowSize = CGSize(width: barSize.width + 10, height: barSize.height + 10)
let glowEffectSprite = SKSpriteNode(color: barColorData.topColor, size: glowSize)
glowEffectNode.addChild(SKSpriteNode(color: SKColor.clear, size: glowBackgroundSize))
glowEffectNode.addChild(glowEffectSprite)

let glowFilter = CIFilter(name: "CIGaussianBlur")
glowFilter!.setDefaults()
glowFilter!.setValue(10, forKey: "inputRadius")

glowEffectNode.filter = glowFilter

I should have mentioned that, for efficiency, I am creating a texture from this node using view.textureFromNode(glowEffectNode). However, I tried using the node itself and the problem was still there, so the above should work regardless.

Using normalized sampler coordinates in CIFilter kernel

You can translate the source coordinates into relative values using the extent of the source like this:

#include <metal_stdlib>
using namespace metal;
#include <CoreImage/CoreImage.h>

extern "C" { namespace coreimage {

    float4 dyeInThree(sampler src, float3 redVector, float3 greenVector, float3 blueVector) {
        float2 pos = src.coord();
        float4 pixelColor = src.sample(pos);

        // transform to relative coordinates
        pos -= src.origin();
        pos /= src.size();

        float3 color = float3(0, 0, 0);
        if (pos.y < 0.33) {
            color = pixelColor.rgb * redVector;
        } else if (pos.y < 0.66) {
            color = pixelColor.rgb * greenVector;
        } else {
            color = pixelColor.rgb * blueVector;
        }

        return float4(color, pixelColor.a);
    }
}}
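The normalization step is just (pos - origin) / size. A quick plain-Swift check of the same arithmetic (the Metal float2 type is replaced with SIMD2<Float> purely for illustration):

```swift
import simd

// Same normalization as in the kernel: map a source-space position
// into [0, 1] relative coordinates using the source extent.
func normalized(_ pos: SIMD2<Float>, origin: SIMD2<Float>, size: SIMD2<Float>) -> SIMD2<Float> {
    (pos - origin) / size
}

// A point one third of the way into an extent with origin (100, 100)
// and size (300, 300):
let p = normalized(SIMD2(200, 200), origin: SIMD2(100, 100), size: SIMD2(300, 300))
// p.y is below 0.33, so this pixel would take the redVector branch in the kernel.
```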

