How to Get String from ASCII Code in Swift

How to get string from ASCII code in Swift?

As a character:

let c = Character(UnicodeScalar(65))

Or as a string:

let s = String(UnicodeScalar(UInt8(65)))

Or another way as a string:

let s = "\u{41}"

(Note that the \u escape sequence is in hexadecimal, not decimal)
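
For example, a small sketch (values chosen only for illustration) showing that the decimal and hexadecimal forms name the same character, and how to get the hex digits for a decimal code:

let fromScalar = String(UnicodeScalar(UInt8(65)))   // "A" from the decimal code 65
let fromEscape = "\u{41}"                           // "A" from the hex escape 0x41
fromScalar == fromEscape                            // true

String(65, radix: 16)                               // "41", the digits to put inside \u{...}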

How to convert a sequence of ASCII codes into a string in Swift 4?

As already mentioned, decimal ASCII codes are not fixed-width: they range from 0 to 127 (0 to 255 for extended character sets) and can have more than two digits.

Based on Sulthan's answer, and assuming there are no characters < 32 (0x20) or > 199 (0xC7) in the text, this approach checks the first digit of the cropped string: if it is "1", the character is represented by three digits, otherwise by two.

func convertAscii(asciiStr: String) {
    var source = asciiStr
    var result = ""

    while source.count >= 2 {
        // Codes 100...199 start with "1" and take three digits;
        // the remaining codes in the assumed range (32...99) take two.
        let digitsPerCharacter = source.hasPrefix("1") ? 3 : 2
        let charBytes = source.prefix(digitsPerCharacter)
        source = String(source.dropFirst(digitsPerCharacter))
        let number = Int(charBytes)!
        let character = UnicodeScalar(number)!
        result += String(character)
    }

    print(result) // "Happy Birthday"
}

convertAscii(asciiStr: "7297112112121326610511411610410097121")

Converting to Char/String from ASCII Int in Swift

It may not be as clean as Java, but you can do it like this:

var string = ""
string.append(Character(UnicodeScalar(50))) // appends "2", the character for ASCII code 50

You can also modify the syntax to look more similar if you like:

// extend Character so it can be created from an integer literal
extension Character: ExpressibleByIntegerLiteral {
    public init(integerLiteral value: Int) {
        self = Character(UnicodeScalar(value)!)
    }
}

// append a character to a string with the += operator
func += (left: inout String, right: Character) {
    left.append(right)
}

var string = ""
string += (50 as Character)

Or using dasblinkenlight's method:

func += (left: inout String, right: Int) {
    left += "\(UnicodeScalar(right)!)"
}
var string = ""
string += 50   // string is now "2"

Swift: How to get an ASCII value from a character

You can't get it from the String itself, but you can get it from a Character in the string, for instance:

str.first?.asciiValue
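
A minimal sketch of how that looks in context (Swift 5 or later, since asciiValue was introduced there); the sample string is just an assumption for illustration:

let str = "Swift"

// Optional: nil if the string is empty or the first character is not ASCII
if let value = str.first?.asciiValue {
    print(value)   // 83, the code for "S"
}

// ASCII values of all characters, skipping any non-ASCII ones
let values = str.compactMap { $0.asciiValue }
print(values)      // [83, 119, 105, 102, 116]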

How to write all ASCII characters in Swift String

In general, you would use \u{x} where x is the hex value. In your case \u{0} through \u{1f} and \u{7f}.

As in C-based languages, Swift strings also support \0 for "null", \t for "tab", \n for "newline", and \r for "carriage return". Unlike C, Swift does not support \b or \f.
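
For instance, a small sketch combining the supported escapes (the string contents are only illustrative):

let line = "name\tvalue\n"   // tab and newline via single-letter escapes
let nul  = "\0"              // same character as "\u{0}"
let bell = "\u{7}"           // BEL has no single-letter escape, so \u{...} is needed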

If you want to create a single String with all 128 ASCII characters, then you can do:

let ascii = String(Array(0...127).map { Character(Unicode.Scalar($0)) })
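
As a quick sanity check (most of these characters are unprintable control codes, so counting is more useful than printing):

ascii.count                                     // 128
ascii.unicodeScalars.allSatisfy { $0.isASCII }  // true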

Getting character ASCII value as an Integer in Swift

In Swift, a Character is not necessarily an ASCII one. It would, for example, make no sense to return the ASCII value of a character that needs a larger Unicode encoding, such as an emoji. This is why the asciiValue property is an optional UInt8, annotated UInt8?.

The simplest solution

Since you checked yourself that the character is ASCII, you can safely go for an unconditional (forced) unwrapping with !:

var tempo: Int = Int(n.asciiValue!)     // <--- just change this line

A more elegant alternative

You could also take advantage of optional binding, which relies on the fact that the optional is nil when there is no ASCII value (i.e. n was not an ASCII character):

if let tempo = n.asciiValue   // succeeds only if there is an ASCII value
{
    temp += (Int(tempo) | key)
}

What's the simplest way to convert from a single character String to an ASCII value in Swift?

Edit/update: Swift 5.2 or later

extension StringProtocol {
    var asciiValues: [UInt8] { compactMap(\.asciiValue) }
}

"abc".asciiValues  // [97, 98, 99]

In Swift 5, you can use the new Character properties isASCII and asciiValue:

Character("a").isASCII       // true
Character("a").asciiValue // 97

Character("á").isASCII // false
Character("á").asciiValue // nil

Old answer

You can create an extension:

Swift 4.2 or later

extension Character {
    var isAscii: Bool {
        return unicodeScalars.allSatisfy { $0.isASCII }
    }
    var ascii: UInt32? {
        return isAscii ? unicodeScalars.first?.value : nil
    }
}

extension StringProtocol {
    var asciiValues: [UInt32] {
        return compactMap { $0.ascii }
    }
}

Character("a").isAscii  // true
Character("a").ascii // 97

Character("á").isAscii // false
Character("á").ascii // nil

"abc".asciiValues // [97, 98, 99]
"abc".asciiValues[0] // 97
"abc".asciiValues[1] // 98
"abc".asciiValues[2] // 99

Inserting ASCII symbols into a String (Swift)

ASCII defines 128 characters from 0x00 to 0x7F. 0xFF (255) is not included.

In Unicode, U+00FF (in Swift, "\u{ff}") represents "ÿ" (LATIN SMALL LETTER Y WITH DIAERESIS).
Its UTF-8 representation is the two-byte sequence 0xC3 0xBF: in UTF-8, characters with code points from U+0080 to U+07FF are encoded with two bytes.
You also need to know that 0xFF is not a valid byte in a UTF-8 byte sequence, which means you will never find a 0xFF byte in a UTF-8 text file.
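
A short sketch illustrating this (playground-style, just for verification):

Array("\u{ff}".utf8)                     // [195, 191], i.e. 0xC3 0xBF
"\u{ff}".unicodeScalars.first!.isASCII   // false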

If you want to output "\u{ff}" as a single-byte 0xFF, use ISO-8859-1 (aka ISO-Latin-1) instead:

try! s.write(toFile: "output.txt", atomically: true, encoding: .isoLatin1)
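
To check the bytes without writing a file, assuming s is the single-character string "\u{ff}" (Foundation is needed for data(using:)):

import Foundation

let s = "\u{ff}"
Array(s.data(using: .isoLatin1)!)   // [255]: a single 0xFF byte
Array(s.data(using: .utf8)!)        // [195, 191]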

