Hex String to Character in PURE Swift
A couple of things about your code:
var charArray = [Character]()
charArray = map(hexArray) { charArray.append(Character($0)) }
You don't need to create an array and then assign the result of the map; you can just assign the result directly and avoid creating an unnecessary array.
charArray = map(hexArray) { charArray.append(Character($0)) }
Here you can use hexArray.map instead of map(hexArray). Also, when you use a map function, what you are conceptually doing is mapping the elements of the receiver array to a new set of values, and the result of the mapping is the new "mapped" array, which means that you don't need to call charArray.append inside the map closure.
Anyway, here is a working example:
let hexArray = ["2F", "24", "40", "2A"]
var charArray = hexArray.map { char -> Character in
let code = Int(strtoul(char, nil, 16))
return Character(UnicodeScalar(code))
}
println(charArray) // -> [/, $, @, *]
EDIT: This is another implementation that doesn't need Foundation:
func hexToScalar(char: String) -> UnicodeScalar {
var total = 0
for scalar in char.uppercaseString.unicodeScalars {
if !(scalar >= "A" && scalar <= "F" || scalar >= "0" && scalar <= "9") {
assertionFailure("Input is wrong")
}
if scalar >= "A" {
total = 16 * total + 10 + scalar.value - 65 /* 'A' */
} else {
total = 16 * total + scalar.value - 48 /* '0' */
}
}
return UnicodeScalar(total)
}
let hexArray = ["2F", "24", "40", "2A"]
var charArray = hexArray.map { Character(hexToScalar($0)) }
println(charArray)
EDIT2: Yet another option:
func hexToScalar(char: String) -> UnicodeScalar {
let map = [ "0": 0, "1": 1, "2": 2, "3": 3, "4": 4, "5": 5, "6": 6, "7": 7, "8": 8, "9": 9,
"A": 10, "B": 11, "C": 12, "D": 13, "E": 14, "F": 15 ]
let total = reduce(char.uppercaseString.unicodeScalars, 0, { $0 * 16 + (map[String($1)] ?? 0xff) })
if total > 0xFF {
assertionFailure("Input char was wrong")
}
return UnicodeScalar(total)
}
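For reference, in current Swift the standard library's radix-aware integer initializer can replace the manual digit arithmetic entirely. A minimal sketch (the helper name hexToCharacter is made up for this example; it returns nil for invalid input instead of asserting):

```swift
// Convert a two-digit hex string to a Character using the standard
// library's radix-aware integer initializer.
func hexToCharacter(_ hex: String) -> Character? {
    guard let code = UInt32(hex, radix: 16),
          let scalar = UnicodeScalar(code) else { return nil }
    return Character(scalar)
}

let hexArray = ["2F", "24", "40", "2A"]
let charArray = hexArray.compactMap { hexToCharacter($0) }
print(charArray) // ["/", "$", "@", "*"]
```

Both the integer initializer and the failable UnicodeScalar initializer return nil on bad input, so compactMap silently drops invalid entries.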
Final edit: explanation
Given that the ASCII table has all the digits together (0123456789), we can convert any digit character (base 10) to an integer by knowing the ASCII value of '0'.
Because:
'0': 48
'1': 49
...
'9': 57
Then, if for example you need to convert '9' to 9, you could do:
asciiValue('9') - asciiValue('0') => 57 - 48 = 9
And you can do the same from 'A' to 'F':
'A': 65
'B': 66
...
'F': 70
Now we can do the same as before; for example, for 'F' we'd do:
asciiValue('F') - asciiValue('A') => 70 - 65 = 5
Note that we need to add 10 to this number to get the decimal value. Then (going back to the code), if the scalar is between 'A' and 'F' we need to do:
10 + asciiValue(<letter>) - asciiValue('A')
which is the same as: 10 + scalar.value - 65
And if it's between 0-9:
asciiValue(<letter>) - asciiValue('0')
which is the same as: scalar.value - 48
For example, take '2F':
'2' is 2 and 'F' is 15 (from the previous example). Since hex is base 16, we need to do:
((16 ^ 1) * 2) + ((16 ^ 0) * 15) = 47
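The digit-by-digit arithmetic described above can be sketched in pure Swift (no Foundation); the helper name hexValue is made up for this example, and invalid characters are simply skipped rather than asserted on:

```swift
// Digit-by-digit hex parsing, exactly as explained above:
// letters map to 10...15 via their distance from 'A' (65),
// digits map to 0...9 via their distance from '0' (48).
func hexValue(_ s: String) -> Int {
    var total = 0
    for scalar in s.uppercased().unicodeScalars {
        let v = scalar.value
        if v >= 65 && v <= 70 {          // 'A'...'F'
            total = 16 * total + Int(v - 65) + 10
        } else if v >= 48 && v <= 57 {   // '0'...'9'
            total = 16 * total + Int(v - 48)
        }
    }
    return total
}

print(hexValue("2F")) // 47  (16 * 2 + 15)
```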
Given a hexadecimal string in Swift, convert to hex value
A possible solution:
let string = "D7C17A4F"
let chars = Array(string)
let numbers = map (stride(from: 0, to: chars.count, by: 2)) {
strtoul(String(chars[$0 ..< $0+2]), nil, 16)
}
Using the approach from https://stackoverflow.com/a/29306523/1187415, the string is split into substrings of two characters. Each substring is interpreted as a sequence of digits in base 16 and converted to a number with strtoul().
Verify the result:
println(numbers)
// [215, 193, 122, 79]
println(map(numbers, { String(format: "%02X", $0) } ))
// [D7, C1, 7A, 4F]
Update for Swift 2 (Xcode 7):
let string = "D7C17A4F"
let chars = Array(string.characters)
let numbers = 0.stride(to: chars.count, by: 2).map {
UInt8(String(chars[$0 ..< $0+2]), radix: 16) ?? 0
}
print(numbers)
or
let string = "D7C17A4F"
var numbers = [UInt8]()
var from = string.startIndex
while from != string.endIndex {
let to = from.advancedBy(2, limit: string.endIndex)
numbers.append(UInt8(string[from ..< to], radix: 16) ?? 0)
from = to
}
print(numbers)
The second solution looks a bit more complicated but has the small advantage that no additional chars array is needed.
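For reference, in current Swift (5+) the index-based variant can be written with index(_:offsetBy:limitedBy:), which replaced advancedBy(_:limit:). A sketch (the helper name is made up for this example; malformed pairs become 0, just as in the answers above):

```swift
// Split a hex string into bytes by stepping through the string
// two characters at a time using String.Index.
func bytes(fromHex string: String) -> [UInt8] {
    var numbers = [UInt8]()
    var from = string.startIndex
    while from != string.endIndex {
        let to = string.index(from, offsetBy: 2, limitedBy: string.endIndex)
            ?? string.endIndex
        numbers.append(UInt8(string[from ..< to], radix: 16) ?? 0)
        from = to
    }
    return numbers
}

print(bytes(fromHex: "D7C17A4F")) // [215, 193, 122, 79]
```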
How to convert Data to hex string in swift
A simple implementation (taken from How to hash NSString with SHA1 in Swift?, with an additional option for uppercase output) would be
extension Data {
struct HexEncodingOptions: OptionSet {
let rawValue: Int
static let upperCase = HexEncodingOptions(rawValue: 1 << 0)
}
func hexEncodedString(options: HexEncodingOptions = []) -> String {
let format = options.contains(.upperCase) ? "%02hhX" : "%02hhx"
return self.map { String(format: format, $0) }.joined()
}
}
I chose a hexEncodedString(options:) method in the style of the existing method base64EncodedString(options:).
Data conforms to the Collection protocol, therefore one can use map() to map each byte to the corresponding hex string.
The %02x format prints the argument in base 16, filled up to two digits with a leading zero if necessary. The hh modifier causes the argument (which is passed as an integer on the stack) to be treated as a one-byte quantity. One could omit the modifier here because $0 is an unsigned number (UInt8) and no sign extension will occur, but it does no harm to leave it in.
The result is then joined to a single string.
Example:
let data = Data([0, 1, 127, 128, 255])
// For Swift < 4.2 use:
// let data = Data(bytes: [0, 1, 127, 128, 255])
print(data.hexEncodedString()) // 00017f80ff
print(data.hexEncodedString(options: .upperCase)) // 00017F80FF
The following implementation is faster by a factor of about 50 (tested with 1000 random bytes). It is inspired by RenniePet's solution and Nick Moore's solution, but takes advantage of String(unsafeUninitializedCapacity:initializingUTF8With:), which was introduced with Swift 5.3/Xcode 12 and is available on macOS 11 and iOS 14 or newer. This method makes it possible to create a Swift string from UTF-8 units efficiently, without unnecessary copying or reallocations.
An alternative implementation for older macOS/iOS versions is also provided.
extension Data {
struct HexEncodingOptions: OptionSet {
let rawValue: Int
static let upperCase = HexEncodingOptions(rawValue: 1 << 0)
}
func hexEncodedString(options: HexEncodingOptions = []) -> String {
let hexDigits = options.contains(.upperCase) ? "0123456789ABCDEF" : "0123456789abcdef"
if #available(macOS 11.0, iOS 14.0, watchOS 7.0, tvOS 14.0, *) {
let utf8Digits = Array(hexDigits.utf8)
return String(unsafeUninitializedCapacity: 2 * self.count) { (ptr) -> Int in
var p = ptr.baseAddress!
for byte in self {
p[0] = utf8Digits[Int(byte / 16)]
p[1] = utf8Digits[Int(byte % 16)]
p += 2
}
return 2 * self.count
}
} else {
let utf16Digits = Array(hexDigits.utf16)
var chars: [unichar] = []
chars.reserveCapacity(2 * self.count)
for byte in self {
chars.append(utf16Digits[Int(byte / 16)])
chars.append(utf16Digits[Int(byte % 16)])
}
return String(utf16CodeUnits: chars, count: chars.count)
}
}
}
How to convert hex number to bin in Swift?
You can use NSScanner() from the Foundation framework:
let scanner = NSScanner(string: str)
var result : UInt32 = 0
if scanner.scanHexInt(&result) {
println(result) // 37331519
}
Or the BSD library function strtoul():
let num = strtoul(str, nil, 16)
println(num) // 37331519
As of Swift 2 (Xcode 7), all integer types have a
public init?(_ text: String, radix: Int = default)
initializer, so a pure Swift solution is available:
let str = "239A23F"
let num = Int(str, radix: 16)
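The question asked for binary; the same radix-aware APIs convert in both directions, so the parsed number can be rendered in base 2 with String(_:radix:):

```swift
// Parse a hex string, then format the resulting value in binary.
let str = "239A23F"
if let num = Int(str, radix: 16) {
    print(num)                   // 37331519
    print(String(num, radix: 2)) // 10001110011010001000111111
}
```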
How to express Strings in Swift using Unicode hexadecimal values (UTF-16)
Character
The Swift syntax for forming a hexadecimal code point is
\u{n}
where n is a hexadecimal number up to 8 digits long. The valid range for a Unicode scalar is U+0 to U+D7FF and U+E000 to U+10FFFF inclusive. (The U+D800 to U+DFFF range is for surrogate pairs, which are not scalars themselves, but are used in UTF-16 for encoding the higher value scalars.)
Examples:
// The following forms are equivalent. They all produce "C".
let char1: Character = "\u{43}"
let char2: Character = "\u{0043}"
let char3: Character = "\u{00000043}"
// Higher value Unicode scalars are done similarly
let char4: Character = "\u{203C}" // ‼ (DOUBLE EXCLAMATION MARK character)
let char5: Character = "\u{1F431}" // 🐱 (cat emoji)
// Characters can be made up of multiple scalars
let char7: Character = "\u{65}\u{301}" // é = "e" + accent mark
let char8: Character = "\u{65}\u{301}\u{20DD}" // é⃝ = "e" + accent mark + circle
Notes:
- Leading zeros can be added or omitted
- Characters are known as extended grapheme clusters. Even when they are composed of multiple scalars, they are still considered a single character. What is key is that they appear to be a single character (grapheme) to the user.
- TODO: How to convert surrogate pair to Unicode scalar in Swift
String
Strings are composed of characters. See the following examples for some ways to form them using hexadecimal code points.
Examples:
var string1 = "\u{0043}\u{0061}\u{0074}\u{203C}\u{1F431}" // Cat‼🐱
// pass an array of characters to a String initializer
let catCharacters: [Character] = ["\u{0043}", "\u{0061}", "\u{0074}", "\u{203C}", "\u{1F431}"] // ["C", "a", "t", "‼", "🐱"]
let string2 = String(catCharacters) // Cat‼🐱
Converting Hex Values at Runtime
At runtime you can convert hexadecimal or Int values into a Character or String by first converting them to a UnicodeScalar.
Examples:
// hex values
let value0: UInt8 = 0x43 // 67
let value1: UInt16 = 0x203C // 8252
let value2: UInt32 = 0x1F431 // 128049
// convert hex to UnicodeScalar
let scalar0 = UnicodeScalar(value0)
// make sure that UInt16 and UInt32 form valid Unicode values
guard
let scalar1 = UnicodeScalar(value1),
let scalar2 = UnicodeScalar(value2) else {
return
}
// convert to Character
let character0 = Character(scalar0) // C
let character1 = Character(scalar1) // ‼
let character2 = Character(scalar2) // 🐱
// convert to String
let string0 = String(scalar0) // C
let string1 = String(scalar1) // ‼
let string2 = String(scalar2) // 🐱
// convert hex array to String
let myHexArray = [0x43, 0x61, 0x74, 0x203C, 0x1F431] // an Int array
var myString = ""
for hexValue in myHexArray {
if let scalar = UnicodeScalar(hexValue) {
myString.append(Character(scalar))
}
}
print(myString) // Cat‼🐱
Further reading
- Strings and Characters docs
- Glossary of Unicode Terms
- Strings in Swift
- Working with Unicode code points in Swift
Defining a custom PURE Swift Character Set
You can already test for membership in a character set by initializing a String and using the contains global function:
let vowels = "aeiou"
let isVowel = contains(vowels, "i") // isVowel == true
As far as your encode and decode functions go, are you just trying to get the 8-bit or 16-bit encodings for the Character? If that is the case, then just convert them to a String and access their utf8 or utf16 properties:
let char = Character("c")
let a = Array(String(char).utf8)
println(a) // This prints [99]
Decode would take a little more work, but I know there's a function for it...
Edit: This will replace a character from a characterSet with '%' followed by the character's hex value:
let encode: String -> String = { s in
reduce(String(s).unicodeScalars, "") { x, y in
switch contains(charSet, Character(y)) {
case true:
return x + "%" + String(y.value, radix: 16)
default:
return x + String(y)
}
}
}
let badURL = "http://why won't this work.com"
let encoded = encode(badURL)
println(encoded) // prints "http://why%20won%27t%20this%20work.com"
Decoding, again, is a bit more challenging, but I'm sure it can be done...
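A decode sketch in the same spirit, written in current Swift (the function name is made up for this example; it assumes every '%' is followed by exactly two hex digits and passes the characters through unchanged otherwise):

```swift
// Replace each "%XX" escape with the character whose scalar value is XX.
func decode(_ s: String) -> String {
    var result = ""
    let chars = Array(s)
    var i = 0
    while i < chars.count {
        if chars[i] == "%", i + 2 < chars.count,
           let value = UInt32(String(chars[i + 1 ... i + 2]), radix: 16),
           let scalar = UnicodeScalar(value) {
            result.append(Character(scalar))
            i += 3                       // skip "%XX"
        } else {
            result.append(chars[i])      // ordinary character, copy as-is
            i += 1
        }
    }
    return result
}

print(decode("http://why%20won%27t%20this%20work.com"))
// http://why won't this work.com
```

This is the inverse of the reduce-based encode above for single-byte scalars; scalars above U+FF would need multi-byte escapes, which this sketch does not handle.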
How can I convert RGB hex value into UIColor in Swift?
You just need to change your hex parameter type from Any to an unsigned integer. By the way, there is no need to convert the values to Float.
func hexColor(_ hex: UInt) -> UIColor {
UIColor(red: .init((hex & 0xff0000) >> 16) / 255,
green: .init((hex & 0xff00) >> 8) / 255,
blue: .init( hex & 0xff) / 255,
alpha: 1)
}
hexColor(0x24B491) // r 0.141 g 0.706 b 0.569 a 1.0
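The masking and shifting can be checked independently of UIKit; each component is isolated with a mask and shifted down to a 0...255 value before dividing by 255. A sketch (the helper name rgbComponents is made up for this example):

```swift
// Extract the three 8-bit components from a 0xRRGGBB value.
func rgbComponents(_ hex: UInt) -> (red: UInt, green: UInt, blue: UInt) {
    ((hex & 0xff0000) >> 16, (hex & 0xff00) >> 8, hex & 0xff)
}

let (r, g, b) = rgbComponents(0x24B491)
print(r, g, b) // 36 180 145
```

Dividing those by 255 gives the 0.141 / 0.706 / 0.569 values shown in the comment above.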
Convert emoji to hex value using Swift
This is a "pure Swift" method, without using Foundation:
let smiley = "😊"
let uni = smiley.unicodeScalars // Unicode scalar values of the string
let unicode = uni[uni.startIndex].value // First element as an UInt32
print(String(unicode, radix: 16, uppercase: true))
// Output: 1F60A
Note that a Swift Character
represents a "Unicode grapheme cluster"
(compare Strings in Swift 2 from the Swift blog) which can
consist of several "Unicode scalar values". Taking the example
from @TomSawyer's comment below:
let zero = "0️⃣"
let uni = zero.unicodeScalars // Unicode scalar values of the string
let unicodes = uni.map { $0.value }
print(unicodes.map { String($0, radix: 16, uppercase: true) } )
// Output: ["30", "FE0F", "20E3"]
How can I convert a String to an MD5 hash in iOS using Swift?
There are two steps:
1. Create md5 data from a string
2. Convert the md5 data to a hex string
Swift 2.0:
func md5(string string: String) -> String {
var digest = [UInt8](count: Int(CC_MD5_DIGEST_LENGTH), repeatedValue: 0)
if let data = string.dataUsingEncoding(NSUTF8StringEncoding) {
CC_MD5(data.bytes, CC_LONG(data.length), &digest)
}
var digestHex = ""
for index in 0..<Int(CC_MD5_DIGEST_LENGTH) {
digestHex += String(format: "%02x", digest[index])
}
return digestHex
}
//Test:
let digest = md5(string:"Hello")
print("digest: \(digest)")
Output:
digest: 8b1a9953c4611296a827abf8c47804d7
Swift 3.0:
func MD5(string: String) -> Data {
let messageData = string.data(using:.utf8)!
var digestData = Data(count: Int(CC_MD5_DIGEST_LENGTH))
_ = digestData.withUnsafeMutableBytes {digestBytes in
messageData.withUnsafeBytes {messageBytes in
CC_MD5(messageBytes, CC_LONG(messageData.count), digestBytes)
}
}
return digestData
}
//Test:
let md5Data = MD5(string:"Hello")
let md5Hex = md5Data.map { String(format: "%02hhx", $0) }.joined()
print("md5Hex: \(md5Hex)")
let md5Base64 = md5Data.base64EncodedString()
print("md5Base64: \(md5Base64)")
Output:
md5Hex: 8b1a9953c4611296a827abf8c47804d7
md5Base64: ixqZU8RhEpaoJ6v4xHgE1w==
Swift 5.0:
import Foundation
import var CommonCrypto.CC_MD5_DIGEST_LENGTH
import func CommonCrypto.CC_MD5
import typealias CommonCrypto.CC_LONG
func MD5(string: String) -> Data {
let length = Int(CC_MD5_DIGEST_LENGTH)
let messageData = string.data(using:.utf8)!
var digestData = Data(count: length)
_ = digestData.withUnsafeMutableBytes { digestBytes -> UInt8 in
messageData.withUnsafeBytes { messageBytes -> UInt8 in
if let messageBytesBaseAddress = messageBytes.baseAddress, let digestBytesBlindMemory = digestBytes.bindMemory(to: UInt8.self).baseAddress {
let messageLength = CC_LONG(messageData.count)
CC_MD5(messageBytesBaseAddress, messageLength, digestBytesBlindMemory)
}
return 0
}
}
return digestData
}
//Test:
let md5Data = MD5(string:"Hello")
let md5Hex = md5Data.map { String(format: "%02hhx", $0) }.joined()
print("md5Hex: \(md5Hex)")
let md5Base64 = md5Data.base64EncodedString()
print("md5Base64: \(md5Base64)")
Output:
md5Hex: 8b1a9953c4611296a827abf8c47804d7
md5Base64: ixqZU8RhEpaoJ6v4xHgE1w==
Notes: #import <CommonCrypto/CommonCrypto.h> must be added to a bridging header file.
For how to create a bridging header, see this SO answer.
In general, MD5 should not be used for new work; SHA-256 is the current best practice.
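On Apple platforms that have CryptoKit (iOS 13/macOS 10.15 and newer), MD5 is also available without a bridging header through the deliberately named Insecure namespace, which signals that the algorithm is unsuitable for security-sensitive work. A minimal sketch:

```swift
import CryptoKit
import Foundation

// MD5 via CryptoKit; the digest is a Sequence of bytes,
// so it can be hex-encoded the same way as the Data results above.
let digest = Insecure.MD5.hash(data: Data("Hello".utf8))
let md5Hex = digest.map { String(format: "%02hhx", $0) }.joined()
print(md5Hex) // 8b1a9953c4611296a827abf8c47804d7
```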
Example from deprecated documentation section:
MD2, MD4, MD5, SHA1, SHA224, SHA256, SHA384, SHA512 (Swift 3+)
These functions will hash either String or Data input with one of eight cryptographic hash algorithms.
The name parameter specifies the hash function name as a String.
Supported functions are MD2, MD4, MD5, SHA1, SHA224, SHA256, SHA384 and SHA512.
This example requires Common Crypto.
It is necessary to have a bridging header in the project: #import <CommonCrypto/CommonCrypto.h>
Add the Security.framework to the project.
This function takes a hash name and a String to be hashed, and returns the result as Data:
name: A name of a hash function as a String
string: The String to be hashed
returns: the hashed result as Data
func hash(name:String, string:String) -> Data? {
let data = string.data(using:.utf8)!
// Forwards to the Data-based hash(name:data:) variant from the same
// answer (used in the examples below, not reproduced here).
return hash(name:name, data:data)
}
Examples:
let clearString = "clearData0123456"
let clearData = clearString.data(using:.utf8)!
print("clearString: \(clearString)")
print("clearData: \(clearData as NSData)")
let hashSHA256 = hash(name:"SHA256", string:clearString)
print("hashSHA256: \(hashSHA256! as NSData)")
let hashMD5 = hash(name:"MD5", data:clearData)
print("hashMD5: \(hashMD5! as NSData)")
Output:
clearString: clearData0123456
clearData: <636c6561 72446174 61303132 33343536>
hashSHA256: <aabc766b 6b357564 e41f4f91 2d494bcc bfa16924 b574abbd ba9e3e9d a0c8920a>
hashMD5: <4df665f7 b94aea69 695b0e7b baf9e9d6>