Convert emoji to hex value using Swift
This is a "pure Swift" method, without using Foundation:
let smiley = "😊"
let uni = smiley.unicodeScalars // Unicode scalar values of the string
let unicode = uni[uni.startIndex].value // First element as a UInt32
print(String(unicode, radix: 16, uppercase: true))
// Output: 1F60A
Note that a Swift Character
represents a "Unicode grapheme cluster"
(compare Strings in Swift 2 from the Swift blog) which can
consist of several "Unicode scalar values". Taking the example
from @TomSawyer's comment below:
let zero = "0️⃣"
let uni = zero.unicodeScalars // Unicode scalar values of the string
let unicodes = uni.map { $0.value }
print(unicodes.map { String($0, radix: 16, uppercase: true) } )
// Output: ["30", "FE0F", "20E3"]
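The reverse direction, rebuilding a string from its scalar values, can be sketched like this (a minimal example of my own, not from the original answer):

```swift
// Rebuild the keycap-zero emoji from its three scalar values.
let scalarValues: [UInt32] = [0x30, 0xFE0F, 0x20E3]
var rebuilt = ""
for value in scalarValues {
    if let scalar = UnicodeScalar(value) {
        rebuilt.unicodeScalars.append(scalar)
    }
}
print(rebuilt) // "0️⃣"
```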
Emoji to Hex value in iOS
Is this what you are looking for?
NSString *smiley = @"😄";
NSData *data = [smiley dataUsingEncoding:NSUTF32LittleEndianStringEncoding];
uint32_t unicode;
[data getBytes:&unicode length:sizeof(unicode)];
NSLog(@"%x", unicode);
// Output: 1f604
Reverse direction:
uint32_t unicode = 0x1f604;
NSString *smiley = [[NSString alloc] initWithBytes:&unicode length:sizeof(unicode) encoding:NSUTF32LittleEndianStringEncoding];
NSLog(@"%@", smiley);
// Output: 😄
Remark: Both code examples assume that integers are stored in little-endian byte order (which is the case for all current platforms running OS X or iOS).
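For comparison, a Swift sketch of the same round trip through UTF-32 little-endian data (assuming Foundation is available, and the same little-endian caveat applies) might look like:

```swift
import Foundation

let smiley = "😄"
// String -> UTF-32LE bytes -> UInt32 code point
let data = smiley.data(using: .utf32LittleEndian)!
let unicode = data.withUnsafeBytes { $0.load(as: UInt32.self) }
print(String(unicode, radix: 16)) // "1f604"

// Reverse: UInt32 code point -> UTF-32LE bytes -> String
let value: UInt32 = 0x1F604
let back = withUnsafeBytes(of: value) { Data($0) }
print(String(data: back, encoding: .utf32LittleEndian)!) // "😄"
```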
Convert emoji unicode to hex codepoint
- (NSArray<NSNumber *> *)unicodeCodePoints:(NSString *)unicodeChar
{
    NSMutableArray *codePoints = [[NSMutableArray alloc] init];
    NSData *data = [unicodeChar dataUsingEncoding:NSUTF32LittleEndianStringEncoding];
    for (NSUInteger i = 0; i < data.length / sizeof(UInt32); i++)
    {
        const UInt32 *arr = (const UInt32 *)(data.bytes);
        [codePoints addObject:@(arr[i])];
    }
    return codePoints;
}
Then you could call it like this:
for (NSNumber *num in [self unicodeCodePoints:@"♀️"])
{
    NSLog(@"%0*x", (int)(2 * sizeof(UInt32)), (UInt32)[num unsignedIntegerValue]);
}
Please note this assumes that the NSString argument represents a single Unicode character.
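A Swift counterpart of that helper (the function name codePoints is my own, not from the answer) can go straight through unicodeScalars without any encoding round trip:

```swift
// Hex code points of every Unicode scalar in a string.
func codePoints(of string: String) -> [String] {
    return string.unicodeScalars.map { String($0.value, radix: 16, uppercase: true) }
}

print(codePoints(of: "0️⃣")) // ["30", "FE0F", "20E3"]
```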
Converting emoji from hex code to unicode
It seems that your question is really one of "how do I display a character, knowing its code point?"
This question turns out to be rather language-dependent! Modern languages have little trouble with this. In Swift, we do this:
$ swift
Welcome to Apple Swift version 3.0.2 (swiftlang-800.0.63 clang-800.0.42.1). Type :help for assistance.
1> "\u{1f600}"
$R0: String = "😀"
In JavaScript, it is the same:
$ node
> "\u{1f600}"
'😀'
In Java, you have to do a little more work. If you want to use the code point directly you can say:
new StringBuilder().appendCodePoint(0x1f600).toString();
The sequence "\uD83D\uDE00" also works in all three languages. This is because those "characters" are actually what Unicode calls surrogates, and when they are combined in a certain way they stand for a single character. The details of how this all works can be found on the web in many places (look for UTF-16 encoding). In a nutshell, you take the code point, subtract 10000 hex, and spread out the 20 bits of that difference like this: 110110xxxxxxxxxx 110111xxxxxxxxxx.
But rather than worrying about this translation, you should use the code point directly if your language supports it well. You might also be able to copy-paste the emoji character into a good text editor (make sure the encoding is set to UTF-8). If you need to use the surrogates, your best bet is to look up a Unicode chart that shows you something called the "UTF-16 encoding."
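The surrogate computation described above can be sketched in Swift (the function name is mine; it assumes a supplementary code point, i.e. one at or above 0x10000):

```swift
// Split a supplementary code point into a UTF-16 surrogate pair:
// subtract 0x10000, then put the high 10 bits after 0xD800 and
// the low 10 bits after 0xDC00.
func surrogatePair(for codePoint: UInt32) -> (high: UInt16, low: UInt16) {
    let offset = codePoint - 0x10000
    let high = UInt16(0xD800 + (offset >> 10))
    let low = UInt16(0xDC00 + (offset & 0x3FF))
    return (high, low)
}

let pair = surrogatePair(for: 0x1F600)
print(String(pair.high, radix: 16, uppercase: true),
      String(pair.low, radix: 16, uppercase: true)) // "D83D DE00"
```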
How to get the Unicode of an emoji in iOS Swift
Use the Swift Unicode escape sequence:
let emojiString = "\u{1F4C4}"
And if you want to enumerate whole ranges of emoji code points, try this:
let emojiRanges = [
    0x1F601...0x1F64F,
    0x2702...0x27B0,
    0x1F680...0x1F6C0,
    0x1F170...0x1F251
]

for range in emojiRanges {
    for i in range {
        let c = String(UnicodeScalar(i)!)
        print(c)
    }
}
Convert Unicode (e.g. 1f564) to Emoji in Swift
This works. I don't know if there is anything more direct:
let myStr = "1f564"
let str = String(UnicodeScalar(Int(myStr, radix: 16)!)!)
Translating unicode representation of an emoji to an actual emoji
The "1F44D" in this table is the unicode value in hex. Convert this to an integer, that to a UnicodeScalar, and that to a String or Character:
let unified = "1F44D"
if let value = Int(unified, radix: 16),
   let scalar = UnicodeScalar(value) {
    let string = String(scalar)
    print(string)
}
Convert unicode to emoji
You can use arbitrary Unicode characters directly in your source code:
let string = "📄"
or use the Swift Unicode escape sequence:
let string = "\u{1F4C4}"
More information in the section about "String Literals" in the Swift reference.