How to Get a Low Res Image, or Thumbnail, from an ALAssetRepresentation in Swift

Here's an example (there might be some minor compilation issues, depending on which version of Swift you're using):

import ImageIO
import UIKit

let src = CGImageSourceCreateWithURL(url as CFURL, nil)! // url is a file URL pointing at the image on disk
let scale = UIScreen.main.scale
let w = 100 * scale // e.g. a desired display width of 100 points, multiplied by the screen scale
let d = [
    kCGImageSourceShouldAllowFloat: true,
    kCGImageSourceCreateThumbnailWithTransform: true,
    kCGImageSourceCreateThumbnailFromImageAlways: true,
    kCGImageSourceThumbnailMaxPixelSize: w
] as CFDictionary
let imref = CGImageSourceCreateThumbnailAtIndex(src, 0, d)!
let im = UIImage(cgImage: imref, scale: scale, orientation: .up)

However, note that you should not be using ALAssetsLibrary any longer; it was deprecated in iOS 9. Switch to PhotoKit, and welcome to the modern world! Now you can call PHImageManager.default().requestImage(for:targetSize:contentMode:options:resultHandler:), which allows you to supply a targetSize for the desired image.
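
For example, here is a minimal sketch of that call, assuming you already have a PHAsset (asset) and an image view (thumbnailImageView); both names are placeholders:

let scale = UIScreen.main.scale
let side = 100 * scale // e.g. a 100-point square thumbnail, converted to pixels
PHImageManager.default().requestImage(for: asset,
                                      targetSize: CGSize(width: side, height: side),
                                      contentMode: .aspectFill,
                                      options: nil) { image, _ in
    // Delivered asynchronously, possibly more than once (see the last answer below).
    thumbnailImageView.image = image
}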

Get thumbnail images in a list, and the full-size image when an item in the list is tapped

To load thumbnail images:

I got this working after a lot of trial and error; maybe it can help others. Below are the steps.

Step 1: Declare a PHFetchResult property:

import Photos // at the top of your file

var galleryImages: PHFetchResult<PHAsset>!

Step 2: Fetch the assets from the photo library using the code below:

func grabPhotos() {
    galleryImages = PHAsset.fetchAssets(with: .image, options: nil)
}

Step 3: Show the thumbnail images in your UI (collection view or table view) using the code below:

let imageView = cell.viewWithTag(1) as! UIImageView

PHImageManager.default().requestImage(for: galleryImages[indexPath.row],
                                      targetSize: CGSize(width: 200, height: 200),
                                      contentMode: .aspectFill,
                                      options: nil) { image, _ in
    imageView.image = image
}

Step 4: Finally, get the full-size image using the code below:

let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.resizeMode = .exact

PHImageManager.default().requestImage(for: galleryImages[indexPath.row],
                                      targetSize: PHImageManagerMaximumSize,
                                      contentMode: .aspectFill,
                                      options: options) { image, _ in
    if let image = image {
        // Use the original, full-size image here
    }
}

Generating custom thumbnail from ALAssetRepresentation

You can use CGImageSourceCreateThumbnailAtIndex to create a small image from a potentially large image source. You can load your image from disk using the ALAssetRepresentation's getBytes:fromOffset:length:error: method, and use that to create a CGImageSourceRef.

Then you just need to pass the kCGImageSourceThumbnailMaxPixelSize and kCGImageSourceCreateThumbnailFromImageAlways options to CGImageSourceCreateThumbnailAtIndex with the image source you've created, and it will create a smaller version for you without loading the huge version into memory.

I've written a blog post and gist with this technique fleshed out in full.
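
In Swift, a minimal sketch of that idea might look like this. It reads the whole compressed file into memory in one go (a simplification of the streaming approach described in the post), and thumbnail(from:maxPixelSize:) is just an illustrative name:

import AssetsLibrary
import ImageIO
import Foundation

func thumbnail(from rep: ALAssetRepresentation, maxPixelSize: Int) -> CGImage? {
    // Copy the asset's compressed bytes from disk; the full-resolution bitmap is never decoded.
    let length = Int(rep.size())
    let buffer = UnsafeMutablePointer<UInt8>.allocate(capacity: length)
    defer { buffer.deallocate() }
    var error: NSError?
    guard rep.getBytes(buffer, fromOffset: 0, length: length, error: &error) == length,
          error == nil else { return nil }
    let data = Data(bytes: buffer, count: length)

    // Wrap the bytes in an image source and ask ImageIO for a downsampled, correctly rotated thumbnail.
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }
    let options = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options)
}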

ALAsset defaultRepresentation fullResolutionImage

So far I have figured out only one way to get what I want. All assets store their modification info (filters, crops, and so on) in the metadata dictionary under the key @"AdjustmentXMP". We can interpret this data and apply all the filters to the fullResolutionImage, as in this SO answer. Here is my complete solution:

...
ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
CGImageRef fullResImage = [assetRepresentation fullResolutionImage];
NSString *adjustment = [[assetRepresentation metadata] objectForKey:@"AdjustmentXMP"];
if (adjustment) {
    // The asset has been edited; re-apply the adjustments described by the XMP data.
    NSData *xmpData = [adjustment dataUsingEncoding:NSUTF8StringEncoding];
    CIImage *image = [CIImage imageWithCGImage:fullResImage];

    NSError *error = nil;
    NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                                 inputImageExtent:image.extent
                                                            error:&error];
    CIContext *context = [CIContext contextWithOptions:nil];
    if (filterArray && !error) {
        for (CIFilter *filter in filterArray) {
            [filter setValue:image forKey:kCIInputImageKey];
            image = [filter outputImage];
        }
        fullResImage = [context createCGImage:image fromRect:[image extent]];
    }
}
UIImage *result = [UIImage imageWithCGImage:fullResImage
                                      scale:[assetRepresentation scale]
                                orientation:(UIImageOrientation)[assetRepresentation orientation]];

Fetching thumbnails from PHAsset

It does the job twice for some reason

Because that's what it's supposed to do. Fetching images from the photo library takes time, as you've discovered. Therefore, the default behavior is to supply a low-resolution image as quickly as possible, just so you have something to display; the result handler is then called again, possibly several times, with better-quality versions of the image. Moreover, the fetch is performed asynchronously. Thus, it is perfectly possible to end up with multiple fetch requests happening at once, which can cause your code to start stumbling over its own feet.
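
The original answer doesn't include code for this, but here is a sketch of what those repeated callbacks look like, using the PHImageResultIsDegradedKey entry of the info dictionary to tell the quick preview apart from the final image (asset and imageView are placeholder names):

PHImageManager.default().requestImage(for: asset,
                                      targetSize: CGSize(width: 300, height: 300),
                                      contentMode: .aspectFill,
                                      options: nil) { image, info in
    // This handler can run more than once: first with a fast, degraded preview,
    // then again with better-quality versions.
    let isPreview = (info?[PHImageResultIsDegradedKey] as? Bool) ?? false
    imageView.image = image
    print(isPreview ? "degraded preview" : "final image")
}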

If you don't like that, set isSynchronous to true on your PHImageRequestOptions, but then you must make this whole call on a background queue! By doing this on a serial queue you can make sure the requests are performed separately, in turn, and each image will be delivered just once. Moreover, then (and only then) will a deliveryMode of .fastFormat be obeyed. Don't forget to step back out to the main thread before doing anything with the image you receive!
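
A minimal sketch of that synchronous, serial-queue approach; the queue label, asset, and imageView are placeholder names:

let requestQueue = DispatchQueue(label: "thumbnail-requests") // a serial queue

requestQueue.async {
    let options = PHImageRequestOptions()
    options.isSynchronous = true        // the result handler is called exactly once, on this queue
    options.deliveryMode = .fastFormat  // honored only for synchronous requests
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 200, height: 200),
                                          contentMode: .aspectFill,
                                          options: options) { image, _ in
        DispatchQueue.main.async {      // back to the main thread for UI work
            imageView.image = image
        }
    }
}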


