BOOL with 64-bit on iOS

@TimBodeit is right, but his answer doesn't explain why ...

BOOL b1 = 8960; // b1 == NO

... evaluates to NO on 32-bit iOS and to YES on 64-bit iOS. Let's start from the beginning.

ObjC BOOL definition

#if (TARGET_OS_IPHONE && __LP64__)  ||  (__ARM_ARCH_7K__ >= 2)
#define OBJC_BOOL_IS_BOOL 1
typedef bool BOOL;
#else
#define OBJC_BOOL_IS_CHAR 1
typedef signed char BOOL;
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C"
// even if -funsigned-char is used.
#endif

For 64-bit iOS and ARMv7k (Apple Watch) it's defined as bool; everywhere else it's signed char.
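If you want to see which definition is in effect, the OBJC_BOOL_IS_BOOL macro from the excerpt above and the type encoding both reveal it. A minimal sketch:

#import <objc/objc.h>
#import <Foundation/Foundation.h>

#if OBJC_BOOL_IS_BOOL
    // 64-bit iOS / ARMv7k: BOOL is bool
#else
    // everywhere else: BOOL is signed char
#endif
// Prints "B" (bool) on 64-bit iOS and "c" (signed char) on 32-bit iOS.
NSLog(@"@encode(BOOL) = %s", @encode(BOOL));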

ObjC BOOL YES and NO

Read Objective-C Literals, where you can find:

Previously, the BOOL type was simply a typedef for signed char, and
YES and NO were macros that expand to (BOOL)1 and (BOOL)0
respectively. To support @YES and @NO expressions, these macros are
now defined using new language keywords in <objc/objc.h>:

#if __has_feature(objc_bool)
#define YES __objc_yes
#define NO __objc_no
#else
#define YES ((BOOL)1)
#define NO ((BOOL)0)
#endif

The compiler implicitly converts __objc_yes and __objc_no to (BOOL)1
and (BOOL)0. The keywords are used to disambiguate BOOL and integer
literals.
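In practice, this means the boxed literals @YES and @NO produce the same objects as the corresponding numberWithBool: calls; a small usage sketch:

NSNumber *yes = @YES;   // same as [NSNumber numberWithBool:YES]
NSNumber *no  = @NO;    // same as [NSNumber numberWithBool:NO]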

bool definition

bool is a macro defined in stdbool.h and it expands to _Bool, which is a boolean type introduced in C99. It can store two values, 0 or 1. Nothing else. To be more precise, stdbool.h defines four macros to use:

/* Don't define bool, true, and false in C++, except as a GNU extension. */
#ifndef __cplusplus
#define bool _Bool
#define true 1
#define false 0
#elif defined(__GNUC__) && !defined(__STRICT_ANSI__)
/* Define _Bool, bool, false, true as a GNU extension. */
#define _Bool bool
#define bool bool
#define false false
#define true true
#endif

#define __bool_true_false_are_defined 1

_Bool

_Bool was introduced in C99 and it can hold the values 0 or 1. What's important is:

When a value is demoted to a _Bool, the result is 0 if the value
equals 0, and 1 otherwise.

Now we know where this mess comes from and we can better understand what's going on.

64-bit iOS || ARMv7k

BOOL -> bool -> _Bool (values 0 or 1)

Demoting 8960 to _Bool gives 1, because the value doesn't equal 0. See (_Bool section).
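A minimal sketch of that demotion rule:

_Bool b = 8960;          // nonzero, so demoted to 1
_Bool z = 0;             // zero stays 0
NSLog(@"%d %d", b, z);   // prints "1 0"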

32-bit iOS

BOOL -> signed char (values -128 to 127).

If you store an int value in the range -128 to 127 in a signed char, the value is unchanged (C99 6.3.1.3). Otherwise the result is implementation-defined; quoting C99:

Otherwise, the new type is signed and the value cannot be represented
in it; either the result is implementation-defined or an
implementation-defined signal is raised.

This means clang gets to decide. In short, with default settings clang wraps the value around (int -> signed char):

  • -129 becomes 127,
  • -130 becomes 126,
  • -131 becomes 125,
  • ...

And in the opposite direction:

  • 128 becomes -128,
  • 129 becomes -127,
  • 130 becomes -126,
  • ...

Because signed char stores values in the range -128 to 127, wrapping can also land on 0: for example 256 (int) becomes 0 (signed char). And when your value 8960 is wrapped around ...

  • 8960 becomes 0,
  • 8961 becomes 1,
  • 8959 becomes -1,
  • ...

... it becomes 0 when stored in signed char (8960 is a multiple of 256, 8960 % 256 == 0), thus it's NO. The same applies to 256, 512, ... multiples of 256.
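A minimal sketch you could run on a 32-bit target to see the wrap-around (the exact results are clang's implementation-defined choice, as described above):

signed char c0 = (signed char)8960;   // 8960 % 256 == 0  ->  0
signed char c1 = (signed char)8961;   //                  ->  1
signed char cm = (signed char)8959;   //                  -> -1
NSLog(@"%d %d %d", c0, c1, cm);       // prints "0 1 -1" on 32-bit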

I strongly recommend using YES and NO with BOOL rather than relying on C conveniences like using an int as an if condition. This is why Swift has Bool, true, and false, and why you can't use Int values where a Bool is expected: precisely to avoid this mess ...
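If you do need to collapse an arbitrary integer into a BOOL, an explicit comparison behaves identically under both definitions; a small sketch:

int flags = 8960;
BOOL safe      = (flags != 0);   // YES on both 32-bit and 64-bit
BOOL truncated = flags;          // NO on 32-bit (8960 % 256 == 0), YES on 64-bit -- avoid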

BOOL value in 64-Bit

My first thought: if odendi is supposed to represent a BOOL, why not use it as one?

if ([odeme.odendi boolValue]) {
    odeme.odendi = @NO;
} else {
    odeme.odendi = @YES;
}

Or even eliminate the if-else structure.

odeme.odendi = [NSNumber numberWithBool:![odeme.odendi boolValue]];

Or, as Martin suggests, alternative syntax for the same statement:

odeme.odendi = @(![odeme.odendi boolValue]);

Don't miss that exclamation point.


Given that this seems to work on 32-bit and not on 64-bit, I suspect this might be the problem:

if (odeme.odendi == [NSNumber numberWithInt:1])

This is why you should put a log in each branch instead of a single one below the whole thing: it would show you that the if itself is evaluating incorrectly and that the problem lies here, not inside either branch.

You could try [NSNumber numberWithInteger:1], since NSInteger differs between 32-bit and 64-bit; I have no idea whether that will actually work. At the end of the day, though, you say odeme.odendi is a BOOL, so it should be used as one. Don't compare it to a number. Don't even compare it to [NSNumber numberWithBool:YES] or @YES. Just grab the BOOL value: it will evaluate to YES or NO and the correct branch will be chosen.
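To make the pointer-versus-value distinction concrete, here is a sketch (odeme.odendi is the NSNumber property from the question):

// == compares object identity; it only works when the runtime happens to
// return the same cached NSNumber instance, which you must not rely on.
if (odeme.odendi == [NSNumber numberWithInt:1]) { /* fragile */ }

// Compare values instead ...
if ([odeme.odendi isEqualToNumber:@1]) { /* reliable */ }

// ... or, best of all, just take the BOOL value.
if ([odeme.odendi boolValue]) { /* what you actually want */ }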

RestKit does not map BOOL in 64 bit simulator

As Mikael says, there is a glitch in RestKit on 64-bit platforms that prevents the default conversion from NSNumber to BOOL (and vice versa).

There is, though, a way to make this work thanks to RestKit's modular value-transformer architecture: you create a dedicated transformer class and register it with RestKit's default transformer.

This is what the transformer could look like:

@interface RKCustomBOOLTransformer : NSObject  <RKValueTransforming>

+ (instancetype)defaultTransformer;

@end

@implementation RKCustomBOOLTransformer

+ (instancetype)defaultTransformer {
    return [RKCustomBOOLTransformer new];
}

- (BOOL)validateTransformationFromClass:(Class)inputValueClass toClass:(Class)outputValueClass {
    return ([inputValueClass isSubclassOfClass:[NSNumber class]] &&
            [outputValueClass isSubclassOfClass:[NSNumber class]]);
}

- (BOOL)transformValue:(id)inputValue toValue:(id *)outputValue ofClass:(Class)outputValueClass error:(NSError **)error {
    RKValueTransformerTestInputValueIsKindOfClass(inputValue, (@[ [NSNumber class] ]), error);
    RKValueTransformerTestOutputValueClassIsSubclassOfClass(outputValueClass, (@[ [NSNumber class] ]), error);

    if (strcmp([inputValue objCType], @encode(BOOL)) == 0) {
        // Test the wrapped value, not the pointer (which is always non-nil here).
        *outputValue = [inputValue boolValue] ? @YES : @NO;
    } else if (strcmp([inputValue objCType], @encode(int)) == 0) {
        *outputValue = ([inputValue intValue] == 0) ? @NO : @YES;
    }
    return YES;
}

@end

You register it like this:

[[RKValueTransformer defaultValueTransformer]
    insertValueTransformer:[RKCustomBOOLTransformer defaultTransformer] atIndex:0];

Make sure the registration happens before you define your mappings.

Assigning NSUInteger to BOOL conceptual understanding

However, I could not understand the iOS 6 behaviour for enabling the button for odd ones only. [...] I was curious to know the root cause.

To explain the observed behavior, two technical details need covering: the type of Objective-C's BOOL and implementation details of UIKit.

Type of BOOL

The differing behavior is actually not related to the iOS version but to the device's architecture. There's a difference in how the iOS SDK defines BOOL for 32-bit versus 64-bit architectures. See this excerpt from objc.h:

#if !defined(OBJC_HIDE_64) && TARGET_OS_IPHONE && __LP64__
typedef bool BOOL;
#else
typedef signed char BOOL;
#endif

So BOOL on 32-bit can hold any value in the range -128 to 127, while on 64-bit the compiler enforces that it only ever holds 0 or 1. You can easily try this by running the following line in the simulator, set to iPhone 4s (32-bit) or iPhone 6 (64-bit).

NSLog(@"%d", (BOOL)2);

This is the explanation for why you see differing behavior on different devices. But where does the even-odd thing come from?

Implementation details of UIBarButtonItem

There's another subtle technical detail involved. You are actually setting the enabled property of a UIBarButtonItem.

Apple likes to use a space-saving scheme to store flags in its UI components. A BOOL (on both 32-bit and 64-bit architectures) would always use at least one byte, while only one bit of information is needed, so they use bit-fields to store the actual value.

Excerpt from iOS SDK 8.4 UIBarButtonItem.h (shortened for clarity):

@interface UIBarButtonItem : UIBarItem {
    struct {
        unsigned int enabled:1;
    } _barButtonItemFlags;
}

The magic is the :1 behind the enabled field in the _barButtonItemFlags struct. This declares a bit-field of width one. To connect the enabled property with this bit-field, they have to implement custom accessors. Here's an example of the setter:

- (void)setEnabled:(BOOL)enabled {
    _barButtonItemFlags.enabled = enabled;
}

So what happens when we're doing this:

someBarButtonItem.enabled = 2;

The compiler understands that it has to call the setEnabled: method with an argument of 2.

On 64-bit architectures the 2 is converted to _Bool, which the standard says must result in 1. On 32-bit systems no such conversion happens, so the original value of 2 is passed to the method unchanged.

Inside setEnabled: the value is assigned to an unsigned bit-field of width one. The C standard says that when a value is converted to an unsigned integer type of smaller width, it is reduced modulo 2^N; in other words, the extra high-order bits are simply dropped. The result is that setEnabled: stores only the lowest bit of its argument in _barButtonItemFlags.enabled. The lowest bit is one for odd numbers and zero for even numbers.
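A minimal sketch of that truncation, mirroring the _barButtonItemFlags layout:

struct {
    unsigned int enabled:1;
} flags;

flags.enabled = 2;               // only the lowest bit survives: 2 % 2 == 0
NSLog(@"%u", flags.enabled);     // prints "0"
flags.enabled = 3;               // 3 % 2 == 1
NSLog(@"%u", flags.enabled);     // prints "1"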

Conclusion

All of the above behavior is within the semantics provided by the standard. There is no undefined behavior involved. It's just an unfortunate fact that the expected behavior of BOOL differs from what you actually get on 32 bit architectures.

BOOL property KVC: is this behavior a bug?

Yes, that's the cause.

Yes, that is intended behavior. (Well, the way it serializes to JSON on 32-bit isn't particularly "intended" but it is expected. The fact that 64-bit uses a proper Bool type is intended.)

The JSON serializer has no way to tell the difference between a one-byte signed integer and a boolean on 32-bit, because they are in fact the same thing.

Is there any difference between bool, Boolean, and BOOL in Objective-C?

Boolean is an old Carbon keyword (historic Mac type), defined as an unsigned char. BOOL is an Objective-C type defined as signed char. bool is a macro (from stdbool.h) for the standard C99 _Bool type, a genuine boolean that holds only 0 or 1. Use BOOL.

Edit (2019): Apple talks about the underlying implementation of BOOL in some newer documentation. Basically, on macOS, BOOL is still ultimately a signed char, but on iOS and related platforms, it is a native C bool underneath.

Xcode compile error on bool when device not connected

BOOL is a different type depending on whether you compile for 32-bit or 64-bit. There are several similar-looking types in use, like Bool, bool, and Boolean, plus probably others; make sure you use the same type everywhere.

Plugging in your device means that code will be compiled for your device, and not for the simulator, so this can change between 32 and 64 bit and trigger the problem.

I'd also check if there is a typedef or #define for BOOL somewhere in your code. Double-click on BOOL in your code, right-click and "Show definition".

BOOL property from a calculation returns NSNumber with incorrect value using valueForKey:

valueForKey: always returns an Objective-C object, even if the property has a scalar type.

From the documentation (emphasis mine):

The default implementations of valueForKey: and setValue:forKey:
provide support for automatic object wrapping of the non-object data
types, both scalars and structs.

Once valueForKey: has determined the specific accessor method or
instance variable that is used to supply the value for the specified
key, it examines the return type or the data type. If the value to be
returned is not an object, an NSNumber or NSValue object is created
for that value and returned in its place.

The return value of your method is BOOL, which is defined as

typedef signed char BOOL;

on OS X and on the 32-bit iOS platform. So what valueForKey: returns is an NSNumber
containing the result of

signed char val = [self.flags integerValue] & SomeConstantFlag;

and that can be in the range -128 .. 127.
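For example, assuming a hypothetical SomeConstantFlag of 0x10 and a flags value where that bit is set:

signed char val = 0x35 & 0x10;   // val == 16, so valueForKey: hands back @16, not @YES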

To ensure that you get only YES or NO (i.e. 1 or 0), write your custom getter as:

- (BOOL)someConstantFlag
{
    return ([self.flags integerValue] & SomeConstantFlag) != 0;
}

Remark: On the 64-bit iOS platform (but not on 64-bit OS X), BOOL is defined as the C99 _Bool, which is a "proper" boolean type and can take only the value 0 or 1.

Different values of BOOL variable in debug and release version

Actually, I found the solution for this. A local BOOL variable holds a garbage value if it isn't explicitly initialized, which was causing the problem in my case. Once I wrote BOOL addNotification = NO; it worked fine.
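A minimal sketch of the difference (names are illustrative):

BOOL garbage;                    // local variable: undefined contents until assigned,
                                 // which can differ between debug and release builds
BOOL addNotification = NO;       // explicit initialization: defined everywhere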

Found the answer here: Default value of BOOL.

Thanks All.

JSONModel incorrectly converting 'T' to '0' on 32-bit devices

In the project that you have linked, the BOOLFromNSString method is as follows:

- (NSNumber *)BOOLFromNSString:(NSString *)string
{
    if (string != nil &&
        ([string caseInsensitiveCompare:@"true"] == NSOrderedSame ||
         [string caseInsensitiveCompare:@"yes"] == NSOrderedSame)) {
        return [NSNumber numberWithBool:YES];
    }
    return [NSNumber numberWithBool:([string intValue] == 0) ? NO : YES];
}

This means that it is expected to return YES for the following case-insensitive values: true, yes, [any number that isn't 0].

The fact that it returns YES for T on any platform is magic, not "correct". You should use one of the expected values.


Edit: a subclass that handles "t" explicitly could look like this:

#import "JSONModelTransformations/JSONValueTransformer.h"

@interface MyParser : JSONValueTransformer
@end

@implementation MyParser
- (NSNumber *)BOOLFromNSString:(NSString *)string {
    if (string != nil && [string caseInsensitiveCompare:@"t"] == NSOrderedSame) {
        return [NSNumber numberWithBool:YES];
    }
    return [super BOOLFromNSString:string];
}
@end

