How to Save CGImage to Data in Swift

How to save CGImage to Data in Swift?

The problem is the way you are creating your mutable data: Data is not convertible to CFMutableData, so the forced cast fails. Instead of force-casting a Data to CFMutableData, create the buffer with CFDataCreateMutable(nil, 0). Another option is NSMutableData, which is toll-free bridged to CFMutableData. Try it like this:

if let cgi = cgi,
    let mutableData = CFDataCreateMutable(nil, 0),
    let destination = CGImageDestinationCreateWithData(mutableData, "public.png" as CFString, 1, nil) {
    CGImageDestinationAddImage(destination, cgi, nil)
    if CGImageDestinationFinalize(destination) {
        let data = mutableData as Data
        if let image = UIImage(data: data) {
            print(image.size)
        }
    } else {
        print("Error writing Image")
    }
}

edit/update: Xcode 11 • Swift 5.1

import Foundation
import ImageIO

extension CGImage {
    var png: Data? {
        guard let mutableData = CFDataCreateMutable(nil, 0),
              let destination = CGImageDestinationCreateWithData(mutableData, "public.png" as CFString, 1, nil) else { return nil }
        CGImageDestinationAddImage(destination, self, nil)
        guard CGImageDestinationFinalize(destination) else { return nil }
        return mutableData as Data
    }
}
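With that extension in place, usage is a one-liner; the uiImage and fileURL names below are placeholders for your own image and destination:

```swift
// Grab the UIImage's backing CGImage and encode it as PNG data.
if let data = uiImage.cgImage?.png {
    try? data.write(to: fileURL)
}
```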

Saving CGImageRef to a PNG file?

Create a CGImageDestination, passing kUTTypePNG as the type of file to create. Add the image, then finalize the destination.
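As a sketch, those three steps look like this in Swift (the function name writePNG is illustrative; MobileCoreServices supplies kUTTypePNG on iOS, CoreServices on macOS):

```swift
import Foundation
import ImageIO
import MobileCoreServices // kUTTypePNG (use CoreServices on macOS)

func writePNG(_ image: CGImage, to url: URL) -> Bool {
    // 1. Create a destination that writes a single PNG image to the file URL.
    guard let destination = CGImageDestinationCreateWithURL(url as CFURL, kUTTypePNG, 1, nil) else {
        return false
    }
    // 2. Add the image (a properties dictionary could be passed instead of nil).
    CGImageDestinationAddImage(destination, image, nil)
    // 3. Finalize the destination, which performs the actual write.
    return CGImageDestinationFinalize(destination)
}
```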

SWIFT 3 - CGImage to PNG

Thanks to DonMag's answer on my other question, SWIFT 3 - CGImage copy always nil, here is the code that solves this:

func saveImageWithAlpha(theImage: UIImage, destFile: URL) -> Void {

    // odd but works... solution to image not saving with proper alpha channel
    UIGraphicsBeginImageContext(theImage.size)
    theImage.draw(at: CGPoint.zero)
    let saveImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    if let img = saveImage, let data = UIImagePNGRepresentation(img) {
        try? data.write(to: destFile)
    }
}

How to use SDWebImage to download and save CGImage

SDWebImage is an asynchronous library. You can’t just return the results. Generally one would use an @escaping closure to supply the results to the caller. E.g.

func downloadImage(completion: @escaping (CGImage?) -> Void) {
    let url = URL(string: "https://my-url-here.com")!

    SDWebImageDownloader.shared.downloadImage(with: url) { image, _, _, _ in
        completion(image?.cgImage)
    }
}

And you’d use it like:

downloadImage { image in
    guard let image = image else { return }

    // use image here
}

// but not here

But let’s step back and look at the whole pattern. You say you want to “save” the result. If you’re talking about saving it to persistent storage, you would not want to use CGImage (or UIImage or whatever) at all. That’s computationally inefficient (converting asset to image and then back to Data so you can save it), space inefficient (you have to load the whole asset into memory at the same time), and likely introduces problems (e.g. if you download a JPG, convert to CGImage and then try to recreate a JPG, the resulting asset will be slightly different, bigger, and/or with new JPG artifacts). If you’re just pre-downloading assets, just use a simple networking library like Alamofire or URLSession.
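For that pre-download case, a plain URLSession download task moves the original bytes to disk without ever decoding them into an image; the function name and parameters below are illustrative:

```swift
import Foundation

// Save a remote asset byte-for-byte, with no image decode/re-encode cycle.
func preDownloadAsset(from url: URL, to destination: URL,
                      completion: @escaping (Error?) -> Void) {
    let task = URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        guard let tempURL = tempURL else {
            completion(error)
            return
        }
        do {
            // The temporary file is removed once this handler returns,
            // so move it to its permanent location now.
            try FileManager.default.moveItem(at: tempURL, to: destination)
            completion(nil)
        } catch {
            completion(error)
        }
    }
    task.resume()
}
```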

Converting NSData to CGImage and then back to NSData makes the file too big

I was doing some image manipulation and came across your question on SO. Seems like no one else came up with an answer, so here's my theory.

While it's theoretically possible to convert a CGImageRef back to NSData in the manner that you described, the data itself is invalid and not a real JPEG or PNG, as you discovered by it not being readable. So I don't think that the NSData.length is correct. You have to actually jump through a number of steps to recreate an NSData representation of a CGImageRef:

// incoming image data
NSData *image;

// create the image ref
CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef) image);
CGImageRef imageRef = CGImageCreateWithJPEGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault);

// image metadata properties (EXIF, GPS, TIFF, etc)
NSDictionary *properties = nil;

// create the new output data
CFMutableDataRef newImageData = CFDataCreateMutable(NULL, 0);
// my code assumes JPEG type since the input is from the iOS device camera
CFStringRef type = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, (__bridge CFStringRef) @"image/jpeg", kUTTypeImage);
// create the destination
CGImageDestinationRef destination = CGImageDestinationCreateWithData(newImageData, type, 1, NULL);
// add the image to the destination
CGImageDestinationAddImage(destination, imageRef, (__bridge CFDictionaryRef) properties);
// finalize the write
CGImageDestinationFinalize(destination);

// memory cleanup
CGDataProviderRelease(imgDataProvider);
CGImageRelease(imageRef);
CFRelease(type);
CFRelease(destination);

NSData *newImage = (__bridge_transfer NSData *)newImageData;

With these steps, the newImage.length should be the same as image.length. I haven't tested, since I actually do cropping between the input and the output, but based on the crop the size is roughly what I expected (the output has roughly half the pixels of the input, and thus the output length is roughly half the input length).


