How to Release a CGDataProvider in Swift on iOS

How do I release a CGDataProvider in Swift on iOS?

You don't have to release it. The documentation is still targeted towards Objective-C. The compiler error message is correct.

From Using Swift with Cocoa and Objective-C:

Core Foundation objects returned from annotated APIs are automatically memory managed in Swift—you do not need to invoke the CFRetain, CFRelease, or CFAutorelease functions yourself.
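For example, a minimal sketch (the data and the provider below are purely illustrative) compiles and runs without any explicit release:

import CoreGraphics
import Foundation

// Sketch: CGDataProvider is an annotated Core Foundation type, so Swift
// manages its lifetime automatically; no CGDataProviderRelease call is needed.
let bytes = Data([0xFF, 0x00, 0x00, 0xFF])          // one RGBA pixel, for illustration
if let provider = CGDataProvider(data: bytes as CFData) {
    // use the provider, e.g. to build a CGImage ...
    _ = provider
}   // the provider is released automatically when it goes out of scope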

CGDataProvider returning null in Swift 4

It's not a problem with Swift 4, but with iOS 11. You may find that your code works on an iOS 10 simulator.

The original code seemingly works on iOS 10, but only by luck.

In this part of the code:

init(width: Int, height: Int) {
    self.width = width
    self.height = height
    let size: Int = width * height * 4
    mutableData = CFDataCreateMutable(kCFAllocatorDefault, size)
    createImageFromData(width, height: height)
}

The property mutableData is initialized with a CFMutableData whose capacity is size but whose length is zero (that is, it contains no data).

On iOS 11, the initializer CGDataProvider.init(data:) returns nil for an empty CFData, since the data backing a data provider should not be empty.

A quick fix would be something like this:

init(width: Int, height: Int) {
    self.width = width
    self.height = height
    let size: Int = width * height * 4
    mutableData = CFDataCreateMutable(kCFAllocatorDefault, size)
    CFDataSetLength(mutableData, size) //<- set the length of the data
    createImageFromData(width, height: height)
}

But I'm not sure other parts of the code would work as expected in iOS 11.
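If you want the failure to surface immediately instead of as a missing image later, a small guard around the failable initializer could look like this (a sketch, assuming it sits inside createImageFromData(_:height:) and uses the mutableData property from above):

// Sketch: CGDataProvider(data:) is failable; on iOS 11 it returns nil when the data is empty.
guard CFDataGetLength(mutableData) > 0,
      let dataProvider = CGDataProvider(data: mutableData) else {
    print("CGDataProvider could not be created: the backing data is empty")
    return
}
// ... use dataProvider to build the CGImage ...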

How do I release a CGImageRef in iOS

Your memory issue results from the copied data, as others have stated. But here's another idea: Use Core Graphics's optimized pixel interpolation to calculate the average.

  1. Create a 1x1 bitmap context.
  2. Set the interpolation quality to medium (see below).
  3. Draw your image scaled down to exactly this one pixel.
  4. Read the RGB value from the context's buffer.
  5. (Release the context, of course.)

This might result in better performance because Core Graphics is highly optimized and might even use the GPU for the downscaling.

Testing showed that medium quality seems to interpolate pixels by taking the average of color values. That's what we want here.

Worth a try, at least.
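Here is a minimal Swift sketch of those steps (the function name and the RGBA layout are my own assumptions, not code from the example project):

import CoreGraphics

// Sketch: average color by drawing the whole image into a 1x1 bitmap context.
func averageColor(of image: CGImage) -> (red: UInt8, green: UInt8, blue: UInt8, alpha: UInt8)? {
    guard let context = CGContext(
        data: nil,                       // let Core Graphics allocate the single-pixel buffer
        width: 1,
        height: 1,
        bitsPerComponent: 8,
        bytesPerRow: 4,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
    ) else { return nil }

    // Medium interpolation appears to average the source pixels when downscaling.
    context.interpolationQuality = .medium

    // Draw the image scaled down to exactly one pixel.
    context.draw(image, in: CGRect(x: 0, y: 0, width: 1, height: 1))

    // Read the RGBA value back out of the context's buffer.
    guard let data = context.data else { return nil }
    let pixel = data.bindMemory(to: UInt8.self, capacity: 4)
    return (pixel[0], pixel[1], pixel[2], pixel[3])
}

In Swift, step 5 (releasing the context) happens automatically when the context goes out of scope.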

Edit: OK, this idea seemed too interesting not to try. So here's an example project showing the difference. The measurements below were taken with the included 512x512 test image, but you can change the image if you want.

It takes about 12.2 ms to calculate the average by iterating over all pixels in the image data. The draw-to-one-pixel approach takes 3 ms, so it's roughly 4 times faster. It seems to produce the same results when using kCGInterpolationQualityMedium.

I assume the huge performance gain results from Quartz noticing that it does not have to decompress the JPEG fully, but can use only the lower-frequency parts of the DCT. That's an interesting optimization strategy when compositing JPEG-compressed pixels at a scale below 0.5. But I'm only guessing here.

Interestingly, when using your method, 70% of the time is spent in CGDataProviderCopyData and only 30% in the pixel data traversal. This hints at a lot of time being spent in JPEG decompression.

(Screenshots: pixel-iterating measurement and draw-to-one-pixel measurement.)

Note: Here's a late follow up on the example image above.

CGDataProvider works the first time, returns an empty image the second time

I have found one big issue with the code I posted and fixed it.
First of all, I was getting crashes even when I didn't load the same image twice but simply loaded more images. Since the issue is memory-related, it failed in all sorts of weird ways.

The issue with the code is that I am calling CGDataProviderRelease(dataProvider).
I am using the data provider of newImageSource, but I didn't create this data provider, which is why I shouldn't release it.
You only need to release objects that you created, retained, or copied.

Apart from that, my app sometimes crashed due to low memory, but after fixing this I was able to use the "economy" approach where I allocate and release as soon as possible.

Currently I can't see anything else wrong with this specific code.

iOS Swift 5, Auto Release CoreText & CoreGraphics memory

Extracting this code from the for loop into another function does the trick: it releases the memory associated with the loaded CGFont and the copied CharacterSet.
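A rough sketch of that pattern (the function name, fontURLs, and the loop body are illustrative assumptions, not the original code; the autoreleasepool is an extra precaution on top of what the answer describes):

import CoreGraphics
import CoreText
import Foundation

// Sketch: the per-iteration work lives in its own function, so the loaded CGFont
// and the copied character set can be released when the function returns.
func inspectFont(at url: URL) {
    guard let dataProvider = CGDataProvider(url: url as CFURL),
          let cgFont = CGFont(dataProvider) else { return }
    let ctFont = CTFontCreateWithGraphicsFont(cgFont, 12, nil, nil)
    let characterSet = CTFontCopyCharacterSet(ctFont) as NSCharacterSet
    _ = characterSet // ... use the character set here ...
}

let fontURLs: [URL] = []                 // illustrative placeholder
for url in fontURLs {
    autoreleasepool {                    // extra safety net for autoreleased temporaries
        inspectFont(at: url)
    }
}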


