Uploading Image with iOS App to Server File Size Is Too Large

Uploading image with iOS app to server file size is too large

The problem is UIImageJPEGRepresentation. It does not retrieve the original JPEG, but rather creates a new JPEG. And when you use a compressionQuality of 1 (presumably to avoid further image quality loss), it creates this new representation with no compression (generally resulting in a file larger than the original).

I would advise using getBytes to retrieve the original asset, rather than round-tripping it through a UIImage and getting the data via UIImageJPEGRepresentation:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:

    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream

    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    while (bytesRemaining > 0) {
        NSUInteger bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here

} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];

--

If you're using the Photos framework introduced in iOS 8, you can use PHImageManager to get the image data:

PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[assetsLibraryURL] options:nil];
PHAsset *asset = [result firstObject];
if (asset) {
    PHImageManager *manager = [PHImageManager defaultManager];
    [manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // use `imageData` here
    }];
}

Uploading UIImage data to the server: getting memory peaks?

A couple of thoughts:

  1. If uploading multiple images, you might want to constrain how many you perform concurrently.

    For example, this was my memory usage when I issued 20 uploads of roughly 5 MB each; memory spiked to 113 MB:

    (screenshot: Instruments memory graph peaking at 113 MB)

    If, however, I constrained this to no more than four at a time (using operation queues), memory usage improved, maxing out at 55 MB (i.e. roughly 38 MB over baseline):

    (screenshot: Instruments memory graph peaking at 55 MB)

    Others have suggested that you might consider doing no more than one upload at a time, which reduces peak memory usage even further (in my example, to 27 MB, only 10 MB over baseline), but recognize that you pay a serious performance penalty for that.

  2. If using NSURLSession, and you use an upload task with the fromFile option rather than loading the asset into an NSData, it used dramatically less memory even when doing four concurrently, staying less than 1 MB over baseline (see the sketch after this answer):

    (screenshot: Instruments memory graph staying under 1 MB over baseline)

  3. I notice that you're using the Xcode gauges to analyze memory usage. I'd suggest using Instruments' Allocations tool (as shown above), not only because I believe it to be more accurate, but more importantly because you seem to have a pattern of memory not falling completely back to baseline after performing some actions. This suggests that you might have some leaks, and only Instruments will help you identify them.

    I'd suggest watching WWDC 2013 video Fixing Memory Issues or WWDC 2012 video iOS App Performance: Memory which illustrate (amongst other things) how to use Instruments to identify memory problems.

  4. I notice that you're extracting the NSData from the UIImage objects. I don't know how you're getting the images, but if they're from your ALAssetsLibrary (or if you downloaded them from another resource), you might want to grab the original asset rather than loading it into a UIImage and then creating an NSData from that. If you use UIImageJPEGRepresentation, you invariably either (a) make the NSData larger than the original asset (making the memory issue worse); or (b) reduce the quality unnecessarily in an effort to make the NSData smaller. Plus you strip out much of the metadata.

    If you have to use UIImageJPEGRepresentation (e.g. it's a programmatically created UIImage), then so be it, do what you have to do. But if you have access to the original digital asset (see https://stackoverflow.com/a/27709329/1271826 for an ALAssetsLibrary example), then do that. Quality is maximized without making the asset any larger than it needs to be.

Bottom line, first make sure that you don't have any memory leaks, and then minimize how much you hold in memory at any given time (avoiding NSData entirely if you can).
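Here's a minimal sketch of points 1 and 2 combined in Swift: an operation queue capped at four concurrent uploads, each using URLSession's file-based upload task so the image bytes never sit in memory. The endpoint URL and the fileURLs array are hypothetical placeholders.

import Foundation

// Hypothetical inputs: local file URLs of the images, and an upload endpoint.
let fileURLs: [URL] = []
let endpoint = URL(string: "https://example.com/upload")!

// Cap concurrency so no more than four uploads run at once.
let uploadQueue = OperationQueue()
uploadQueue.maxConcurrentOperationCount = 4

for fileURL in fileURLs {
    uploadQueue.addOperation {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")

        // Block this background operation until its upload finishes, so the
        // queue's concurrency cap actually limits in-flight uploads.
        let semaphore = DispatchSemaphore(value: 0)

        // Uploading fromFile streams the file from disk rather than holding
        // the whole image in memory.
        URLSession.shared.uploadTask(with: request, fromFile: fileURL) { _, _, error in
            if let error = error {
                print("upload failed: \(error)")
            }
            semaphore.signal()
        }.resume()

        semaphore.wait()
    }
}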

iOS to PHP server file upload - what file size image should I allow?

  1. That's up to you. First decide the resolution of the image (e.g. 300 x 300 px? Smaller? Bigger?) and then adjust the file size limit accordingly. For example, you can be quite sure that a 300 x 300 px JPEG image will be smaller than 200 KB, but if you accept 3,000 x 3,000 px images you need to allow a few MB.

  2. base64 is a form of encoding that represents binary data using only 64 ASCII characters: every 3 bytes of input become 4 characters of output, so the encoded form is about 1/3 bigger than the "normal" (non-encoded) binary data (see the example after this list).

  3. UIImagePickerController returns a UIImage object to the delegate. You can get the dimensions of the image from that object's size property (a CGSize). Note that size is measured in points; multiply by the image's scale property to get pixels.
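To make the base64 overhead in point 2 concrete, here's a quick sketch you can run in a playground (the 3 MB input size is just illustrative):

import Foundation

// 3 MB of raw bytes (illustrative).
let raw = Data(count: 3_000_000)

// Base64 maps every 3 input bytes to 4 output characters.
let encoded = raw.base64EncodedString()

print(raw.count)      // 3000000
print(encoded.count)  // 4000000 — about 1/3 larger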

Way too many images, app size too big

You probably don't need a database or some robust online system for organizing and retrieving your photos, though it's obvious that storing all these files on the device is not going to work.

  1. Purchase web hosting. For $5 a month you can get unlimited online storage and bandwidth: GoDaddy, Bluehost, HostGator, DreamHost, etc.

  2. Use one of their online file managers, or download a free FTP (File Transfer Protocol) client like FileZilla.

  3. Using their online file manager, or connecting with FileZilla, just transfer your images onto the webserver.

  4. Then, with a list of URLs in hand linking to your car images, use the code below to load an image from a URL:

    NSURL *url = [NSURL URLWithString:path];

    // Note: dataWithContentsOfURL: is a synchronous network call;
    // run it on a background queue, not the main thread.
    NSData *data = [NSData dataWithContentsOfURL:url];

    UIImage *img = [[UIImage alloc] initWithData:data];

  5. When you're displaying an interface that relies on downloaded information or photos, you should show some placeholder ("Loading...", an empty box, or a loading spinner) where the content will go, and keep it there until the image has downloaded.
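Here's a minimal sketch of that download-with-placeholder pattern in Swift, assuming you already have a UIImageView in your view hierarchy:

import UIKit

func loadImage(from url: URL, into imageView: UIImageView) {
    // Show a spinner as the placeholder while the download is in flight.
    let spinner = UIActivityIndicatorView(style: .medium)
    spinner.center = CGPoint(x: imageView.bounds.midX, y: imageView.bounds.midY)
    spinner.startAnimating()
    imageView.addSubview(spinner)

    // Download off the main thread; hop back to the main thread for UI updates.
    URLSession.shared.dataTask(with: url) { data, _, _ in
        DispatchQueue.main.async {
            spinner.removeFromSuperview()
            if let data = data, let image = UIImage(data: data) {
                imageView.image = image
            }
        }
    }.resume()
}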

Why does image size increase after UIImagePNGRepresentation?

Let's take an example from WWDC 2018 Session 416, iOS Memory Deep Dive. A file that is 590 KB on disk may hold an image whose dimensions are 2048 x 1536 pixels. The file is compressed, but once the image is decoded, it needs width x height x 4 bytes per pixel of memory: 2048 x 1536 x 4 is roughly 12 MB (the session quotes about 10 MB). For more details, watch the video: https://developer.apple.com/videos/play/wwdc2018/416/
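As a quick sketch of that arithmetic in Swift (the asset name is a hypothetical placeholder):

import UIKit

// Any UIImage works here; "flowers" is a placeholder asset name.
let image = UIImage(named: "flowers")!

// size is in points; multiply by scale to get pixel dimensions.
let pixelWidth  = image.size.width  * image.scale
let pixelHeight = image.size.height * image.scale

// Decoded bitmaps typically use 4 bytes per pixel (RGBA).
let decodedBytes = Int(pixelWidth * pixelHeight) * 4
print("decoded bitmap: \(decodedBytes) bytes")  // 2048 x 1536 gives 12,582,912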

Reduce video size in Swift/iOS to upload to server

Try this answer for compressing video. According to jojaba's answer:

If you want to compress the video for remote sharing while keeping the original quality for local storage on the iPhone, you should look into AVAssetExportSession or AVAssetWriter.

Compress Video Without Low Quality

This approach is in Objective-C, though.

You should also consider reading up on how iOS manages assets.
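For a Swift sketch of the AVAssetExportSession approach (the preset choice and URL handling here are assumptions, not a definitive implementation):

import AVFoundation

func compressVideo(at inputURL: URL, to outputURL: URL,
                   completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: inputURL)

    // A lower-quality preset shrinks the file; pick one that fits your upload budget.
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetMediumQuality) else {
        completion(false)
        return
    }

    export.outputURL = outputURL
    export.outputFileType = .mp4

    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}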

How to compress or reduce the size of an image before uploading to Parse as PFFile? (Swift)

Yes, you can use UIImageJPEGRepresentation instead of UIImagePNGRepresentation to reduce your image file size. You can create a UIImage extension as follows:

Xcode 8.2 • Swift 3.0.2

extension UIImage {
    enum JPEGQuality: CGFloat {
        case lowest = 0
        case low = 0.25
        case medium = 0.5
        case high = 0.75
        case highest = 1
    }

    /// Returns the data for the specified image in JPEG format.
    /// If the image object’s underlying image data has been purged, calling this function forces that data to be reloaded into memory.
    /// - returns: A data object containing the JPEG data, or nil if there was a problem generating the data. This function may return nil if the image has no data or if the underlying CGImageRef contains data in an unsupported bitmap format.
    func jpeg(_ quality: JPEGQuality) -> Data? {
        return UIImageJPEGRepresentation(self, quality.rawValue)
    }
}

edit/update:

Xcode 10 • Swift 4.2

extension UIImage {
    enum JPEGQuality: CGFloat {
        case lowest = 0
        case low = 0.25
        case medium = 0.5
        case high = 0.75
        case highest = 1
    }

    /// Returns the data for the specified image in JPEG format.
    /// If the image object’s underlying image data has been purged, calling this function forces that data to be reloaded into memory.
    /// - returns: A data object containing the JPEG data, or nil if there was a problem generating the data. This function may return nil if the image has no data or if the underlying CGImageRef contains data in an unsupported bitmap format.
    func jpeg(_ jpegQuality: JPEGQuality) -> Data? {
        return jpegData(compressionQuality: jpegQuality.rawValue)
    }
}

Usage:

if let imageData = image.jpeg(.lowest) {
    print(imageData.count)
}

