Why Do I Get RGB Values of (0,0,0) for an NSImage with a Transparent Background

Why do I get RGB values of (0,0,0) for an NSImage with a transparent background?

You need to look at the alpha value of each pixel. If the alpha value is 0, the RGB values are irrelevant. (A fully transparent pixel's color is washed out to completely clear, and with premultiplied alpha the stored components end up as (0,0,0).)
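
Below is a minimal sketch of that check, assuming the NSImage can be read into an NSBitmapImageRep; the image name and pixel coordinates are placeholders.

// Minimal sketch: inspect a pixel's alpha before trusting its RGB values.
NSImage *image = [NSImage imageNamed:@"example"]; // placeholder image name
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithData:[image TIFFRepresentation]];

NSColor *pixel = [rep colorAtX:10 y:10]; // arbitrary coordinates
if (pixel.alphaComponent == 0.0) {
    // Fully transparent: the RGB components carry no useful information here.
    NSLog(@"Pixel is fully transparent");
} else {
    NSLog(@"R=%f G=%f B=%f A=%f",
          pixel.redComponent, pixel.greenComponent,
          pixel.blueComponent, pixel.alphaComponent);
}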

Resizing (downscaling) an NSImage slightly changes RGB values. How do I preserve the original RGB values?

The issue has been resolved thanks to @KenThomases. When resizing the NSImage, I had set the color space of the NSBitmapImageRep object to NSCalibratedRGBColorSpace, but the original NSImage before downscaling used a different color space name than the downscaled image. A simple change of color space produced the correct results: NSCalibratedRGBColorSpace was changed to NSDeviceRGBColorSpace so both representations match.
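
As a rough sketch of that fix, you can compare the two reps' color space names and, on 10.6+, convert the resized rep to the original's color space; originalRep and resizedRep are placeholder variables for the two NSBitmapImageReps.

// Placeholders: originalRep / resizedRep are the NSBitmapImageReps of the
// original and the downscaled image.
NSLog(@"original: %@, resized: %@", originalRep.colorSpaceName, resizedRep.colorSpaceName);

if (![resizedRep.colorSpaceName isEqualToString:originalRep.colorSpaceName]) {
    // Convert the resized rep so per-pixel RGB values compare against the original.
    resizedRep = [resizedRep bitmapImageRepByConvertingToColorSpace:originalRep.colorSpace
                                                    renderingIntent:NSColorRenderingIntentDefault];
}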

Why do masked image pixel RGB values show the original image's RGB values in iOS image masking?

I found the reason in the link below:

CGImage Masking with UIImage pngData()

It seems we need to draw the masked image into an image context. I tested the given piece of code and it worked fine.
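
A minimal sketch of that approach, assuming maskedRef is a placeholder for the CGImageRef returned by your masking call (e.g. CGImageCreateWithMask): render it through a bitmap image context so the mask is baked into the pixel data you read afterwards.

// maskedRef is a placeholder CGImageRef produced by the masking step.
UIImage *maskedImage = [UIImage imageWithCGImage:maskedRef];

UIGraphicsBeginImageContextWithOptions(maskedImage.size, NO, maskedImage.scale);
[maskedImage drawInRect:CGRectMake(0, 0, maskedImage.size.width, maskedImage.size.height)];
UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Reading pixel RGB values (or PNG data) from "flattened" now reflects the
// mask instead of the original image's values.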

How to make one color transparent on a UIImage?

-(void)changeColor
{
    UIImage *temp23 = [UIImage imageNamed:@"leaf.png"];
    CGImageRef ref1 = [self createMask:temp23];
    // {R-min, R-max, G-min, G-max, B-min, B-max}: pixels whose components all
    // fall inside these ranges are masked out (made transparent).
    const CGFloat colorMasking[6] = {1.0, 2.0, 1.0, 1.0, 1.0, 1.0};
    CGImageRef masked = CGImageCreateWithMaskingColors(ref1, colorMasking);
    UIImage *resultedimage = [UIImage imageWithCGImage:masked];
    // Use resultedimage (e.g. assign it to an image view), then release the CGImages.
    CGImageRelease(ref1);
    CGImageRelease(masked);
}

-(CGImageRef)createMask:(UIImage*)temp
{
    CGImageRef ref = temp.CGImage;
    size_t mWidth = CGImageGetWidth(ref);
    size_t mHeight = CGImageGetHeight(ref);
    size_t count = mWidth * mHeight * 4;
    void *bufferdata = malloc(count);

    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // Redraw the image into our own buffer so we control the pixel layout.
    CGContextRef cgctx = CGBitmapContextCreate(bufferdata, mWidth, mHeight, 8,
                                               mWidth * 4, colorSpaceRef,
                                               kCGImageAlphaPremultipliedFirst);

    CGRect rect = CGRectMake(0, 0, mWidth, mHeight);
    CGContextDrawImage(cgctx, rect, ref);
    bufferdata = CGBitmapContextGetData(cgctx); // same buffer we passed in

    // Note: the NULL release callback means bufferdata is never freed; supply a
    // callback (or free it after the image is released) in production code.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, bufferdata,
                                                              mWidth * mHeight * 4, NULL);
    CGImageRef savedimageref = CGImageCreate(mWidth, mHeight, 8, 32, mWidth * 4,
                                             colorSpaceRef, bitmapInfo, provider,
                                             NULL, NO, renderingIntent);
    CGDataProviderRelease(provider);
    CGContextRelease(cgctx);
    CFRelease(colorSpaceRef);
    return savedimageref; // caller is responsible for CGImageRelease()
}

The above code is tested; using the mask, I changed the green color to red.

How to resize NSImage?

Edit:
Since this answer is still the accepted answer but was written without Retina screens in mind, I will link directly to a better solution further down the thread: Objective-C, Swift 4


Because Paresh's method is correct but uses API that has been deprecated since 10.8, I'll post the working 10.8 code below. All credit to Paresh's answer, though.

- (NSImage *)imageResize:(NSImage*)anImage newSize:(NSSize)newSize {
    NSImage *sourceImage = anImage;

    // Report an error if the source isn't a valid image
    if (![sourceImage isValid]) {
        NSLog(@"Invalid Image");
    } else {
        NSImage *smallImage = [[NSImage alloc] initWithSize:newSize];
        [smallImage lockFocus];
        [sourceImage setSize:newSize];
        [[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
        [sourceImage drawAtPoint:NSZeroPoint
                        fromRect:NSMakeRect(0, 0, newSize.width, newSize.height)
                       operation:NSCompositingOperationCopy
                        fraction:1.0];
        [smallImage unlockFocus];
        return smallImage;
    }
    return nil;
}
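
On Retina screens, the lockFocus approach above produces an image sized in points rather than pixels. A rough sketch of the Retina-aware alternative along the lines of the solutions linked above (the method name here is a placeholder, not from the thread) is to draw into an NSBitmapImageRep with explicit pixel dimensions:

// Sketch: resize to an exact pixel size, independent of the screen's backing scale.
- (NSImage *)resizedImage:(NSImage *)sourceImage toPixelSize:(NSSize)newSize {
    if (![sourceImage isValid]) {
        return nil;
    }

    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:NULL
                      pixelsWide:(NSInteger)newSize.width
                      pixelsHigh:(NSInteger)newSize.height
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                     bytesPerRow:0
                    bitsPerPixel:0];
    rep.size = newSize; // point size; the pixel dimensions above stay fixed

    [NSGraphicsContext saveGraphicsState];
    [NSGraphicsContext setCurrentContext:
        [NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
    [sourceImage drawInRect:NSMakeRect(0, 0, newSize.width, newSize.height)
                   fromRect:NSZeroRect
                  operation:NSCompositingOperationCopy
                   fraction:1.0];
    [NSGraphicsContext restoreGraphicsState];

    NSImage *newImage = [[NSImage alloc] initWithSize:newSize];
    [newImage addRepresentation:rep];
    return newImage;
}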

