Get Image from CALayer or NSView (Swift 3)
Here are some NSView options:

extension NSView {

    /// Get `NSImage` representation of the view.
    ///
    /// - Returns: `NSImage` of the view.
    func image() -> NSImage {
        let imageRepresentation = bitmapImageRepForCachingDisplay(in: bounds)!
        cacheDisplay(in: bounds, to: imageRepresentation)
        return NSImage(cgImage: imageRepresentation.cgImage!, size: bounds.size)
    }
}
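As a quick usage sketch (the view here is a placeholder, not from the original answer; in practice you would call this on a view from your hierarchy):

```swift
import Cocoa

// Hypothetical view; any laid-out NSView works the same way.
let myView = NSTextField(labelWithString: "Hello")
let snapshot = myView.image()   // NSImage sized to the view's bounds
```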

Or

extension NSView {

    /// Get `Data` representation of the view.
    ///
    /// - Parameters:
    ///   - fileType: The format of the file. Defaults to PNG.
    ///   - properties: A dictionary that contains key-value pairs specifying image properties.
    /// - Returns: `Data` for the image.
    func data(using fileType: NSBitmapImageRep.FileType = .png, properties: [NSBitmapImageRep.PropertyKey: Any] = [:]) -> Data {
        let imageRepresentation = bitmapImageRepForCachingDisplay(in: bounds)!
        cacheDisplay(in: bounds, to: imageRepresentation)
        return imageRepresentation.representation(using: fileType, properties: properties)!
    }
}
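A usage sketch for the `Data` variant (the view and output path are placeholders I chose for illustration):

```swift
import Cocoa

// Hypothetical view and output path; adjust to your app.
let myView = NSTextField(labelWithString: "Hello")
let pngData = myView.data()                       // PNG by default
try? pngData.write(to: URL(fileURLWithPath: "/tmp/view.png"))

// Other formats pass through to NSBitmapImageRep:
let jpegData = myView.data(using: .jpeg,
                           properties: [.compressionFactor: 0.7])
```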

Some CALayer options:

extension CALayer {

    /// Get `NSImage` representation of the layer.
    ///
    /// - Returns: `NSImage` of the layer.
    func image() -> NSImage {
        let width = Int(bounds.width * contentsScale)
        let height = Int(bounds.height * contentsScale)
        let imageRepresentation = NSBitmapImageRep(bitmapDataPlanes: nil, pixelsWide: width, pixelsHigh: height, bitsPerSample: 8, samplesPerPixel: 4, hasAlpha: true, isPlanar: false, colorSpaceName: .deviceRGB, bytesPerRow: 0, bitsPerPixel: 0)!
        imageRepresentation.size = bounds.size

        let context = NSGraphicsContext(bitmapImageRep: imageRepresentation)!

        render(in: context.cgContext)

        return NSImage(cgImage: imageRepresentation.cgImage!, size: bounds.size)
    }
}

Or

extension CALayer {

    /// Get `Data` representation of the layer.
    ///
    /// - Parameters:
    ///   - fileType: The format of the file. Defaults to PNG.
    ///   - properties: A dictionary that contains key-value pairs specifying image properties.
    /// - Returns: `Data` for the image.
    func data(using fileType: NSBitmapImageRep.FileType = .png, properties: [NSBitmapImageRep.PropertyKey: Any] = [:]) -> Data {
        let width = Int(bounds.width * contentsScale)
        let height = Int(bounds.height * contentsScale)
        let imageRepresentation = NSBitmapImageRep(bitmapDataPlanes: nil, pixelsWide: width, pixelsHigh: height, bitsPerSample: 8, samplesPerPixel: 4, hasAlpha: true, isPlanar: false, colorSpaceName: .deviceRGB, bytesPerRow: 0, bitsPerPixel: 0)!
        imageRepresentation.size = bounds.size

        let context = NSGraphicsContext(bitmapImageRep: imageRepresentation)!

        render(in: context.cgContext)

        return imageRepresentation.representation(using: fileType, properties: properties)!
    }
}
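A usage sketch for the two CALayer extensions (the layer here is a stand-in I set up for illustration; any layer with content behaves the same):

```swift
import Cocoa

// Hypothetical layer; in practice this would come from a view, e.g. view.layer.
let layer = CALayer()
layer.frame = CGRect(x: 0, y: 0, width: 120, height: 80)
layer.backgroundColor = NSColor.blue.cgColor

let layerImage = layer.image()   // NSImage of the layer's contents
let layerPNG = layer.data()      // PNG-encoded Data
```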

WKWebView CALayer to image exports blank image

Latest Update:

You can now take a snapshot of a WKWebView, just as you can with WebView. Apple has added a new method for both iOS and macOS:

func takeSnapshot(with snapshotConfiguration: WKSnapshotConfiguration?,
                  completionHandler: @escaping (NSImage?, Error?) -> Void)

It is still in beta, though.
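A usage sketch for the new API on macOS (the `takeSnapshot(with:completionHandler:)` call is Apple's API; the web view itself is a placeholder I set up for illustration):

```swift
import WebKit

// Assumes an existing, loaded WKWebView; created here only so the
// sketch is self-contained.
let webView = WKWebView(frame: NSRect(x: 0, y: 0, width: 320, height: 240))

let config = WKSnapshotConfiguration()
config.rect = webView.bounds

webView.takeSnapshot(with: config) { image, error in
    if let image = image {
        // Use the NSImage: display it, or encode it for saving.
    } else if let error = error {
        print("Snapshot failed: \(error)")
    }
}
```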


You can't take a screenshot of a WKWebView; it always returns a blank image. Even if you embed the WKWebView inside another NSView and take a screenshot of that, you will get a blank area where the WKWebView is.

You should use WebView instead of WKWebView for this purpose. Check this question.

If you are okay with using private frameworks (Apple won't allow your app in the App Store), check this GitHub project. It's written in Objective-C, which I don't know, so I can't explain what's happening in that code, but it claims to do the job.

Your best approach is to use WebView and call the data() extension you mentioned on it.

Just a question: why don't you use PhantomJS?

P.S. Sorry for the late reply; I didn't see your e-mail.

Rendering NSView containing some CALayers to an NSImage

The only way I found to do this is to use the CGWindow APIs, something like:

CGImageRef cgimg = CGWindowListCreateImage(CGRectZero, kCGWindowListOptionIncludingWindow, (CGWindowID)[theWindow windowNumber], kCGWindowImageDefault);

then clip out the part of that CGImage that corresponds to your view (e.g. with CGImageCreateWithImageInRect), and make an NSImage from the cropped CGImage.
Be aware this won't work well if parts of the window are offscreen.
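The same idea can be sketched in Swift 3. Everything here except `CGWindowListCreateImage` and `CGImage.cropping(to:)` is my own scaffolding, and the coordinate math is illustrative (it ignores Retina scaling, which doubles the pixel dimensions of the captured image):

```swift
import Cocoa

/// Capture a view by imaging its window and cropping. Assumes `view`
/// lives in the visible `window`.
func captureView(_ view: NSView, in window: NSWindow) -> NSImage? {
    let windowID = CGWindowID(window.windowNumber)
    guard let windowImage = CGWindowListCreateImage(.zero,
                                                    .optionIncludingWindow,
                                                    windowID,
                                                    []) else { return nil }

    // The captured image has a top-left origin; window coordinates are
    // bottom-left, so flip the view's rect before cropping.
    let rectInWindow = view.convert(view.bounds, to: nil)
    let flipped = CGRect(x: rectInWindow.minX,
                         y: window.frame.height - rectInWindow.maxY,
                         width: rectInWindow.width,
                         height: rectInWindow.height)

    guard let cropped = windowImage.cropping(to: flipped) else { return nil }
    return NSImage(cgImage: cropped, size: rectInWindow.size)
}
```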

How should I synchronize the rendered content of two NSViews?

You're probably looking for NSView's displayRectIgnoringOpacity:inContext: method. It may not be the most efficient approach, since it draws the view twice, but it seems to work in your case. On the other hand, I'm not sure whether caching the pixel buffer would be faster; it certainly requires more memory.

How do I take a screenshot of an NSView?

[[NSImage alloc] initWithData:[view dataWithPDFInsideRect:[view bounds]]];
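The same one-liner in Swift 3 (the view here is a placeholder so the snippet stands alone):

```swift
import Cocoa

// Placeholder view; use any NSView you want to capture.
let view = NSTextField(labelWithString: "Hello")
let screenshot = NSImage(data: view.dataWithPDFInside(view.bounds))
```

Note this goes through PDF data, so it captures what the view draws, not a pixel-exact copy of the screen.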

