Performance issues when cropping UIImage (CoreGraphics, iOS)
originalImageView is a UIImageView connected via an IBOutlet; its image is the one that will be cropped.
#import <QuartzCore/QuartzCore.h>
QuartzCore is needed only for the white border drawn around each slice, which makes the result easier to see.
-(UIImage *)getCropImage:(CGRect)cropRect
{
    CGImageRef imageRef = CGImageCreateWithImageInRect([originalImageView.image CGImage], cropRect);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return croppedImage;
}
-(void)prepareSlices:(uint)row:(uint)col
{
    float flagX = originalImageView.image.size.width / originalImageView.frame.size.width;
    float flagY = originalImageView.image.size.height / originalImageView.frame.size.height;
    float _width = originalImageView.frame.size.width / col;
    float _height = originalImageView.frame.size.height / row;
    float _posX = 0.0;
    float _posY = 0.0;
    for (int i = 1; i <= row * col; i++) {
        UIImageView *croppedImageView = [[UIImageView alloc] initWithFrame:CGRectMake(_posX, _posY, _width, _height)];
        UIImage *img = [self getCropImage:CGRectMake(_posX * flagX, _posY * flagY, _width * flagX, _height * flagY)];
        croppedImageView.image = img;
        croppedImageView.layer.borderColor = [[UIColor whiteColor] CGColor];
        croppedImageView.layer.borderWidth = 1.0f;
        [self.view addSubview:croppedImageView];
        [croppedImageView release];
        _posX += _width;
        if (i % col == 0) {
            _posX = 0;
            _posY += _height;
        }
    }
    originalImageView.alpha = 0.0;
}
Because of originalImageView.alpha = 0.0; you won't see the originalImageView any more.
Call it like this:
[self prepareSlices:4 :4];
It should add 16 slices as subviews of self.view. We have a puzzle app; this is working code from there.
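For anyone working in Swift, the same slicing logic can be sketched roughly like this (a hypothetical translation, assuming an `originalImageView` outlet inside a view controller and a `.scaleToFill` content mode so the frame-to-image ratios are uniform; `CGImage.cropping(to:)` stands in for `CGImageCreateWithImageInRect`):

```swift
import UIKit

// Sketch of the slicing routine above in Swift. Assumes originalImageView
// uses .scaleToFill, so a single X ratio and Y ratio map view coordinates
// to image coordinates.
func prepareSlices(rows: Int, cols: Int) {
    guard let image = originalImageView.image else { return }
    let flagX = image.size.width / originalImageView.frame.width
    let flagY = image.size.height / originalImageView.frame.height
    let width = originalImageView.frame.width / CGFloat(cols)
    let height = originalImageView.frame.height / CGFloat(rows)
    for row in 0..<rows {
        for col in 0..<cols {
            let viewRect = CGRect(x: CGFloat(col) * width, y: CGFloat(row) * height,
                                  width: width, height: height)
            let imageRect = CGRect(x: viewRect.origin.x * flagX, y: viewRect.origin.y * flagY,
                                   width: viewRect.width * flagX, height: viewRect.height * flagY)
            guard let cgSlice = image.cgImage?.cropping(to: imageRect) else { continue }
            let sliceView = UIImageView(frame: viewRect)
            sliceView.image = UIImage(cgImage: cgSlice)
            sliceView.layer.borderColor = UIColor.white.cgColor
            sliceView.layer.borderWidth = 1
            view.addSubview(sliceView)
        }
    }
    originalImageView.alpha = 0
}
```

Note that, like the Objective-C original, this works in pixel coordinates of the underlying CGImage, so with a @2x or @3x image you would also multiply the crop rect by `image.scale`.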
cropping(to: CGRect) doesn't crop correctly
Are you using one of the scaling content modes in your image view? If so, the dimensions of the image differ from the dimensions of the image view, and you have two options:
You could just resize your image to match the dimensions of your image view before attempting the crop. Then a standard cropping routine would work. But that can result in either a loss of resolution or the introduction of pixelation.
The better solution is to transform the cropping rectangle to the coordinates of the image dimensions before cropping.
For example:
extension UIImageView {
    func image(at rect: CGRect) -> UIImage? {
        guard
            let image = image,
            let rect = convertToImageCoordinates(rect)
        else {
            return nil
        }

        return image.cropped(to: rect)
    }

    func convertToImageCoordinates(_ rect: CGRect) -> CGRect? {
        guard let image = image else { return nil }

        let imageSize = CGSize(width: image.size.width, height: image.size.height)
        let imageCenter = CGPoint(x: imageSize.width / 2, y: imageSize.height / 2)

        let imageViewRatio = bounds.width / bounds.height
        let imageRatio = imageSize.width / imageSize.height

        let scale: CGPoint

        switch contentMode {
        case .scaleToFill:
            scale = CGPoint(x: imageSize.width / bounds.width, y: imageSize.height / bounds.height)

        case .scaleAspectFit:
            let value: CGFloat
            if imageRatio < imageViewRatio {
                value = imageSize.height / bounds.height
            } else {
                value = imageSize.width / bounds.width
            }
            scale = CGPoint(x: value, y: value)

        case .scaleAspectFill:
            let value: CGFloat
            if imageRatio > imageViewRatio {
                value = imageSize.height / bounds.height
            } else {
                value = imageSize.width / bounds.width
            }
            scale = CGPoint(x: value, y: value)

        case .center:
            scale = CGPoint(x: 1, y: 1)

        // unhandled cases include:
        // .redraw, .top, .bottom, .left, .right,
        // .topLeft, .topRight, .bottomLeft, .bottomRight

        default:
            fatalError("Unexpected contentMode")
        }

        var rect = rect
        if rect.width < 0 {
            rect.origin.x += rect.width
            rect.size.width = -rect.width
        }

        if rect.height < 0 {
            rect.origin.y += rect.height
            rect.size.height = -rect.height
        }

        return CGRect(x: (rect.minX - bounds.midX) * scale.x + imageCenter.x,
                      y: (rect.minY - bounds.midY) * scale.y + imageCenter.y,
                      width: rect.width * scale.x,
                      height: rect.height * scale.y)
    }
}
Now, I'm only handling four of the possible content modes; if you want to handle more, you'd have to implement those yourself. But hopefully this illustrates the pattern, namely converting the selection CGRect into coordinates within the image before attempting the crop.
FWIW, the cropping method I use, cropped(to:), is from https://stackoverflow.com/a/28513086/1271826; it uses the more contemporary UIGraphicsImageRenderer and falls back to CoreGraphics cropping where it can. But use whatever cropping routine you want; just make sure to transform the coordinates into something suitable for the image.
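The linked cropped(to:) extension isn't reproduced here, but a minimal sketch of such a method (my own simplification, assuming the rect is already in the image's own point coordinates and ignoring the orientation handling the linked answer covers) might look like:

```swift
import UIKit

extension UIImage {
    // Minimal sketch: crop to `rect`, given in the image's own point
    // coordinates, using UIGraphicsImageRenderer.
    func cropped(to rect: CGRect) -> UIImage? {
        guard rect.intersects(CGRect(origin: .zero, size: size)) else { return nil }
        let renderer = UIGraphicsImageRenderer(size: rect.size)
        return renderer.image { _ in
            // Draw the image offset so that rect.origin lands at (0, 0).
            draw(at: CGPoint(x: -rect.origin.x, y: -rect.origin.y))
        }
    }
}
```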
How to crop a UIImageView to a new UIImage in 'aspect fill' mode?
Let's divide the problem into two parts:
1. Given the size of a UIImageView and the size of its UIImage, if the UIImageView's content mode is Aspect Fill, what is the part of the UIImage that fits into the UIImageView? We need, in effect, to crop the original image to match what the UIImageView is actually displaying.
2. Given an arbitrary rect within the UIImageView, what part of the cropped image (derived in part 1) does it correspond to?
The first part is the interesting part, so let's try it. (The second part will then turn out to be trivial.)
Here's the original image I'll use:
https://static1.squarespace.com/static/54e8ba93e4b07c3f655b452e/t/56c2a04520c64707756f4267/1455596221531/
That image is 1000x611. Here's what it looks like scaled down (but keep in mind that I'm going to be using the original image throughout):
My image view, however, will be 139x182, and is set to Aspect Fill. When it displays the image, it looks like this:
The problem we want to solve is: what part of the original image is being displayed in my image view, if my image view is set to Aspect Fill?
Here we go. Assume that iv is the image view:
let imsize = iv.image!.size
let ivsize = iv.bounds.size

var scale: CGFloat = ivsize.width / imsize.width
if imsize.height * scale < ivsize.height {
    scale = ivsize.height / imsize.height
}

let croppedImsize = CGSize(width: ivsize.width / scale, height: ivsize.height / scale)
let croppedImrect = CGRect(origin: CGPoint(x: (imsize.width - croppedImsize.width) / 2.0,
                                           y: (imsize.height - croppedImsize.height) / 2.0),
                           size: croppedImsize)
So now we have solved the problem: croppedImrect is the region of the original image that is showing in the image view. Let's proceed to use our knowledge, by actually cropping the image to a new image matching what is shown in the image view:
let r = UIGraphicsImageRenderer(size: croppedImsize)
let croppedIm = r.image { _ in
    iv.image!.draw(at: CGPoint(x: -croppedImrect.origin.x, y: -croppedImrect.origin.y))
}
The result is this image (ignore the gray border):
But lo and behold, that is the correct answer! I have extracted from the original image exactly the region portrayed in the interior of the image view.
So now you have all the information you need. croppedIm is the UIImage actually displayed in the clipped area of the image view. scale is the scale between the image view and that image. Therefore, you can easily solve the problem you originally proposed! Given any rectangle imposed upon the image view, in the image view's bounds coordinates, you simply apply the scale (i.e. divide all four of its attributes by scale), and now you have the same rectangle as a portion of croppedIm.
(Observe that we didn't really need to crop the original image to get croppedIm; it was sufficient, in reality, to know how to perform that crop. The important information is the scale along with the origin of croppedImrect; given that information, you can take the rectangle imposed upon the image view, scale it, and offset it to get the desired rectangle of the original image.)
EDIT I added a little screencast just to show that my approach works as a proof of concept:
EDIT Also created a downloadable example project here:
https://github.com/mattneub/Programming-iOS-Book-Examples/blob/39cc800d18aa484d17c26ffcbab8bbe51c614573/bk2ch02p058cropImageView/Cropper/ViewController.swift
But note that I can't guarantee that URL will last forever, so please read the discussion above to understand the approach used.
Cropping UIImage not yielding expected crop - swift?
As requested, here's an example of using CIPerspectiveCorrection to crop your image. I downloaded it and used Sketch to get the approximate values of the four CGPoints in your example image.
// ciOriginal is the CIImage of the source,
// e.g. let ciOriginal = CIImage(image: yourUIImage)!
let inputBottomLeft = CIVector(x: 38, y: 122)
let inputTopLeft = CIVector(x: 68, y: 236)
let inputTopRight = CIVector(x: 146, y: 231)
let inputBottomRight = CIVector(x: 151, y: 96)

let filter = CIFilter(name: "CIPerspectiveCorrection")
filter?.setValue(inputTopLeft, forKey: "inputTopLeft")
filter?.setValue(inputTopRight, forKey: "inputTopRight")
filter?.setValue(inputBottomLeft, forKey: "inputBottomLeft")
filter?.setValue(inputBottomRight, forKey: "inputBottomRight")
filter?.setValue(ciOriginal, forKey: "inputImage")
let ciOutput = filter?.outputImage
Please note a few things:

- The most important thing to never forget about CoreImage is that the origin of a CIImage is bottom left, not top left. You need to "flip" the Y axis of each point.
- A UIImage has a size, while a CIImage has an extent. These are the same. (The only time they aren't is when you use a CIFilter to "create" something - a color, a tiled image - and then the extent is infinite.)
- A CIVector can have quite a few properties. In this case I'm using the X/Y signature, and it's a straight copy of a CGPoint except the Y axis is flipped.
- I have a sample project here that uses a UIImageView. Be aware that performance on the simulator is nowhere near what performance is on a real device - I recommend using a device any time CoreImage is involved. Also, I did a straight translation from the output CIImage to a UIImage. It's usually better to use a CIContext and CoreGraphics if you are looking for performance.
Given your input, here's the output:
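As the last note above says, going through a CIContext is usually faster than relying on UIImage(ciImage:); a minimal sketch (the helper name is mine, and the key point is creating the context once and reusing it):

```swift
import UIKit
import CoreImage

// Render a filter's output CIImage into a UIImage via CIContext.
// Reusing a single CIContext across renders is the main performance win.
let ciContext = CIContext()

func makeUIImage(from ciImage: CIImage) -> UIImage? {
    // createCGImage(_:from:) renders the CIImage into a concrete bitmap.
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```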
How to crop UIImage freely in swift?
Assuming you know how to display the UIImage, how to draw the crop rectangle on top of the image, and how to allow the user to modify that crop rectangle, the actual cropping can be done via CGImage's cropping(to:) function, documented here:
https://developer.apple.com/documentation/coregraphics/cgimage/1454683-cropping
There's a nice bit of sample code in the documentation for cropping(to:) at that link that implements a cropImage function in Swift using cropping(to:).
Get the CGImage from the UIImage via the cgImage property on UIImage.
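Putting those pieces together, a minimal wrapper might look like this (my own sketch; note that cropping(to:) works in pixels of the underlying CGImage, so point-based rects must be scaled up first):

```swift
import UIKit

// Crop a UIImage via CGImage.cropping(to:). The rect is in pixels of the
// underlying CGImage; multiply point-based rects by image.scale first.
func crop(_ image: UIImage, toPixelRect rect: CGRect) -> UIImage? {
    guard let cgImage = image.cgImage?.cropping(to: rect) else { return nil }
    // Preserve the original scale and orientation in the result.
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}
```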