Resulting MTLTexture lighter than CGImage
The problem was solved (thanks to 0xBFE1A8) by adding gamma correction, replacing

outTexture.write(color, pid);

with:

outTexture.write(float4(pow(color.rgb, float3(2, 2, 2)), color.a), pid);
Memory leak when making CGImage from MTLTexture (Swift, macOS)
I think you've misunderstood the issue with CGDataProviderReleaseDataCallback and CGDataProviderRelease() being unavailable.

CGDataProviderRelease() is (in C) used to release the CGDataProvider object itself. But that's not the same thing as the byte buffer that you provided to the CGDataProvider when you created it.

In Swift, the lifetime of the CGDataProvider object is managed for you, but that doesn't help deallocate the byte buffer.

Ideally, CGDataProvider would be able to manage the lifetime of the byte buffer automatically, but it can't: it doesn't know how the buffer was allocated, so it doesn't know how to release it. That's why you have to provide a callback that it can use to release it. You are essentially providing the knowledge of how to release the byte buffer.
Since you're using malloc() to allocate the byte buffer, your callback needs to free() it.
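A minimal sketch of what that release callback can look like (the dimensions and buffer contents here are placeholders, not from the question; the key point is that a malloc()'d pointer must be released with free()):

```swift
import CoreGraphics

// Hypothetical sizes; substitute your texture's actual dimensions.
let width = 256, height = 256
let bytesPerRow = width * 4
let length = bytesPerRow * height

// Buffer allocated with malloc(), e.g. filled via MTLTexture.getBytes(...).
let buffer = malloc(length)!

// The release callback is where the buffer is finally freed. Because the
// buffer came from malloc(), the matching deallocator is free().
let provider = CGDataProvider(
    dataInfo: nil,
    data: buffer,
    size: length,
    releaseData: { _, data, _ in
        free(UnsafeMutableRawPointer(mutating: data))
    }
)
```

The callback fires when the last reference to the provider (and any image backed by it) goes away, so the buffer lives exactly as long as it's needed.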
That said, you'd be much better off using CFMutableData rather than UnsafeMutableRawPointer. Then, create the data provider using CGDataProvider(data:). In this case, all of the memory is managed for you.
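A sketch of the CFMutableData approach, again with hypothetical dimensions:

```swift
import CoreGraphics

let width = 256, height = 256        // placeholder dimensions
let bytesPerRow = width * 4
let length = bytesPerRow * height

// CFMutableData owns its storage, so no release callback is needed.
let data = CFDataCreateMutable(kCFAllocatorDefault, length)!
CFDataSetLength(data, length)
// Fill the bytes, e.g. via texture.getBytes(CFDataGetMutableBytePtr(data), ...)

let provider = CGDataProvider(data: data)!   // memory is managed for you
```

Here the data object is retained by the provider and released automatically, so there is nothing left to leak.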
CGImage to MPSTexture or MPSImage
Why not construct the MTLTexture from the CVPixelBuffer directly? It's much quicker!
Do this once at the beginning of your program:
// Declare this somewhere, so we can re-use it.
var textureCache: CVMetalTextureCache?

// Create the texture cache object.
guard CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache) == kCVReturnSuccess else {
    print("Error: could not create a texture cache")
    return false
}
Do this once you have your CVPixelBuffer:
let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)

var texture: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache!,
                                          pixelBuffer, nil, .bgra8Unorm,
                                          width, height, 0, &texture)

if let texture = texture {
    metalTexture = CVMetalTextureGetTexture(texture)
}
Now metalTexture contains an MTLTexture object with the contents of the CVPixelBuffer.
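One caveat: CVMetalTextureCacheCreateTextureFromImage requires a Metal-compatible pixel buffer. Buffers delivered by AVFoundation capture usually qualify already, but if you create the CVPixelBuffer yourself, a sketch of requesting compatibility (sizes here are placeholders) looks like:

```swift
import CoreVideo

var pixelBuffer: CVPixelBuffer?

// Ask CoreVideo for a buffer that can be wrapped as an MTLTexture.
let attrs: [CFString: Any] = [
    kCVPixelBufferMetalCompatibilityKey: true
]

CVPixelBufferCreate(kCFAllocatorDefault,
                    1920, 1080,                       // placeholder dimensions
                    kCVPixelFormatType_32BGRA,        // matches .bgra8Unorm
                    attrs as CFDictionary,
                    &pixelBuffer)
```

Without that attribute, the texture-cache call can fail even though the pixel buffer itself is perfectly valid.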
Crash when passing around an MTLTexture
I'm not sure you're aware that it's both acceptable and common to create your own texture and render to that. You don't have to render to the texture of a drawable of a view or layer.
Create your own texture(s) to render to and then, for just the present step, render from your texture to a drawable's texture.
Mind you, depending on exactly what you're doing, you may want a pool of three or so textures that you rotate through. The issue you need to be concerned with is whether Metal is still reading from a texture, so you don't write to it before it's done being read.
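A sketch of such a pool, using hypothetical names; a semaphore whose count equals the pool size keeps the CPU from writing into a texture the GPU may still be reading:

```swift
import Metal
import Dispatch

// Minimal rotating texture pool (a sketch, not a production implementation).
final class TexturePool {
    private let textures: [MTLTexture]
    private let semaphore: DispatchSemaphore
    private var index = 0

    init(device: MTLDevice, descriptor: MTLTextureDescriptor, count: Int = 3) {
        textures = (0..<count).map { _ in device.makeTexture(descriptor: descriptor)! }
        semaphore = DispatchSemaphore(value: count)
    }

    // Blocks until a texture is no longer in flight, then hands it out.
    func acquire() -> MTLTexture {
        semaphore.wait()
        defer { index = (index + 1) % textures.count }
        return textures[index]
    }

    // Call when the GPU has finished reading, e.g. from the command
    // buffer's completion handler:
    //   commandBuffer.addCompletedHandler { _ in pool.release() }
    func release() {
        semaphore.signal()
    }
}
```

Three textures is a common choice because it matches the typical maximum number of frames in flight, so the CPU never stalls waiting on the GPU in the steady state.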