Convert Character to Int in Swift 2.0
In Swift 2.0, toInt(), etc., have been replaced with initializers (in this case, Int(someString)). Because not all strings can be converted to ints, this initializer is failable, which means it returns an optional Int (Int?) instead of just an Int. The best thing to do is unwrap this optional using if let.
I'm not sure exactly what you're going for, but this code works in Swift 2, and accomplishes what I think you're trying to do:
let unsolved = "123abc"
var fileLines = [Int]()
for i in unsolved.characters {
    let someString = String(i)
    if let someInt = Int(someString) {
        fileLines += [someInt]
    }
    print(i)
}
Or, for a Swiftier solution:
let unsolved = "123abc"
let fileLines = unsolved.characters.filter({ Int(String($0)) != nil }).map({ Int(String($0))! })
// fileLines = [1, 2, 3]
You can shorten this more with flatMap:
let fileLines = unsolved.characters.flatMap { Int(String($0)) }
flatMap returns "an Array containing the non-nil results of mapping transform over self"… so when Int(String($0)) is nil, the result is discarded.
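Note that in Swift 4.1 and later this optional-discarding variant of flatMap was renamed compactMap, and String is itself a collection of Character, so the characters property is no longer needed. A modern sketch of the same idea:

```swift
let unsolved = "123abc"
// compactMap keeps only the non-nil results of the transform (Swift 4.1+):
let fileLines = unsolved.compactMap { Int(String($0)) }
// fileLines == [1, 2, 3]
```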
How to convert Characters to Int in Swift 4?
for number in numbers!
This line of code extracts each character in the string ("223 345 567") and tries to convert it to an int. That's why it converts each valid digit in the string and skips the spaces. You need to first split the string into an array of number strings, then iterate through that array to convert them. Split it, then iterate through strNumArray and convert the elements to integers:
var array = [Int]()
if let strNumArray = numbers?.components(separatedBy: " ") {
    for number in strNumArray {
        if let integer = Int(number) {
            array.append(integer)
        }
    }
}
print(array)
And in a more Swifty way (compactMap drops the nils, so you get [Int] rather than [Int?]):
let arrayInt = numbers.split(separator: " ").compactMap { Int($0) }
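For example, assuming numbers holds the space-separated string from the question:

```swift
let numbers = "223 345 567"
// split(separator:) yields Substrings; Int's StringProtocol initializer accepts those directly:
let arrayInt = numbers.split(separator: " ").compactMap { Int($0) }
// arrayInt == [223, 345, 567]
```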
How to convert a character to an integer in Swift 3
It's as simple as this.
let myString = "5"
let sum = 3 + Int(myString)! // Produces 8
print(Int(myString)!)
Indexing
let str = "My test String Index"
let index = str.index(str.startIndex, offsetBy: 4)
str[index] // Returns e from test
let endIndex = str.index(str.endIndex, offsetBy:-2)
str[index ..< endIndex] // returns String "est String Ind"
str.substring(from: index) // returns String "est String Index"
str.substring(to: index) // returns String "My t"
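Note that substring(from:) and substring(to:) were deprecated in Swift 4; a sketch of the one-sided-range equivalents:

```swift
let str = "My test String Index"
let index = str.index(str.startIndex, offsetBy: 4)
// One-sided ranges replace the deprecated substring APIs (Swift 4+):
let tail = String(str[index...]) // "est String Index"
let head = String(str[..<index]) // "My t"
```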
Convert String characters to Int
You have to convert the Character to a String first:
for c in numbers.text.characters {
    if let digit = Int(String(c)) {
        print(digit)
    } else {
        // Invalid input
    }
}
or
let allDigits = numbers.text.characters.flatMap { Int(String($0)) }
(which silently drops non-digit characters).
How to convert 'String.Element' to 'Int'?
What's the issue in sum += i? As the error says, i is a Character and sum an Int. Can you add bananas and apples? It's the same logic here. So you want its Int equivalent, via Int(i). That returns an optional value, because there is no guarantee that i is a valid digit. You check isNumber beforehand, but the line itself doesn't know that. So you can soft-unwrap, or, if you are sure, force-unwrap:
sum += Int(String(i))! //Char -> String -> Int
Because there is a String.init(someChar) and an Int.init(someString), but no Int.init(someChar), you need the double init().
BUT, keeping your logic, you are iterating character by character... So, in the end, you compute 1 + 2 + 3 + 0 + 6 + 7 (i.e. 19), not 1 + 2 + 30 + 67 (i.e. 100) as expected. So if you want to iterate, you need to "group" the consecutive digits...
With basic for loops, you can do this (a possible solution; maybe not the best one, but a working one):
let numsInStr = "1abc2x30yz67"
var lastWasNumber = false
var intStrings: [String] = []
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if !lastWasNumber {
            intStrings.append(String(aCharacter))
        } else {
            intStrings[intStrings.count - 1] += String(aCharacter)
        }
        lastWasNumber = true
    } else {
        lastWasNumber = false
    }
    print("After processing: \(aCharacter) - got: \(intStrings)")
}
print(intStrings)
var sum = 0
for anIntString in intStrings {
    sum += Int(anIntString)!
}
print("Sum: \(sum)")
At your level, never hesitate to add print() calls (but never print just the variable; always add some text for context so you know where the call came from).
The output being:
$>After processing: 1 - got: ["1"]
$>After processing: a - got: ["1"]
$>After processing: b - got: ["1"]
$>After processing: c - got: ["1"]
$>After processing: 2 - got: ["1", "2"]
$>After processing: x - got: ["1", "2"]
$>After processing: 3 - got: ["1", "2", "3"]
$>After processing: 0 - got: ["1", "2", "30"]
$>After processing: y - got: ["1", "2", "30"]
$>After processing: z - got: ["1", "2", "30"]
$>After processing: 6 - got: ["1", "2", "30", "6"]
$>After processing: 7 - got: ["1", "2", "30", "67"]
$>["1", "2", "30", "67"]
$>100
We rely on Int(someString) (and force-unwrapping), but sum += Int(anIntString) ?? 0 would be safer: for very big values, say "a1234567890123456789123456789123456789", Int will not be big enough to hold the number and the conversion returns nil. These are edge cases you need to be aware of.
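To see that overflow edge case concretely, a quick check (the values assume a 64-bit platform, where Int is Int64):

```swift
// Int(_:) returns nil when the digits don't fit in Int:
let tooBig = Int("1234567890123456789123456789123456789") // nil
// Eighteen nines still fit comfortably under Int64.max (19 digits):
let fits = Int(String(repeating: "9", count: 18))
```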
With high-level methods, you can use components(separatedBy:) to split the string into digit-only and letter-only parts. Then you can filter() (if needed) or compactMap() to transform to Int where possible, and finally sum with reduce.
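A sketch of that high-level approach, splitting on everything that is not a decimal digit and summing with reduce (Foundation is assumed for CharacterSet):

```swift
import Foundation

let numsInStr = "1abc2x30yz67"
// Split on non-digit characters, convert the numeric runs, and sum them:
let sum = numsInStr
    .components(separatedBy: CharacterSet.decimalDigits.inverted)
    .compactMap { Int($0) }
    .reduce(0, +)
// sum == 100
```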
As suggested, another solution that avoids keeping a list of String could be:
var sum = 0
var lastWasNumber = false
var currentIntString = ""
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if !lastWasNumber {
            sum += Int(currentIntString) ?? 0
            currentIntString = "" // reset; not strictly needed, since the next line overwrites it anyway
            currentIntString = String(aCharacter)
        } else {
            currentIntString += String(aCharacter)
        }
        lastWasNumber = true
    } else {
        lastWasNumber = false
    }
    print("After processing: \(aCharacter) - got: \(currentIntString) - current sum: \(sum)")
}
sum += Int(currentIntString) ?? 0
print("Sum: \(sum)")
Here, we keep currentIntString as a "buffer".
This could be simplified further by removing lastWasNumber and checking currentIntString instead:
var sum = 0
var currentIntString = ""
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if currentIntString.isEmpty {
            currentIntString = String(aCharacter)
        } else {
            currentIntString += String(aCharacter)
        }
    } else {
        sum += Int(currentIntString) ?? 0
        currentIntString = ""
    }
    print("After processing: \(aCharacter) - got: \(currentIntString) - current sum: \(sum)")
}
sum += Int(currentIntString) ?? 0
print("Sum: \(sum)")
How to convert Unicode Character to Int in Swift
Hex to Int
If you are starting with \u{0D85} and you know the hex value of the Unicode character, then you might as well write it in the following format, because it is an Int already.
let myInt = 0x0D85 // Int: 3461
String to Int
I assume, though, that you have "\u{0D85}" (in quotes), which makes it a String by default. And since it is a String, you can't be certain that you will only have a single Int value for the general case.
let myString = "\u{0D85}"
for element in myString.unicodeScalars {
    let myInt = element.value // UInt32: 3461
}
I could have also used myString.utf16 and let myInt = Int(element), but I find it easier to deal with Unicode scalar values (UTF-32) when there is a possibility of things like emoji. (See this answer for more details.)
Character to Int
A Swift Character, which is an extended grapheme cluster, does not have a utf16 or unicodeScalars property, so if you are starting with a Character, convert it to a String first and then follow the directions in the String to Int section above.
let myChar: Character = "\u{0D85}"
let myString = String(myChar)
Convert Char to Int in Swift
Here is a solution using asciiValue:
func scanNumber(_ strArray: [Character], _ index: Int, _ flag: Int) -> Int {
    var result = 0
    var arrayIndex = index
    while arrayIndex < strArray.count && strArray[arrayIndex] >= "0" && strArray[arrayIndex] <= "9" {
        guard let ascii = strArray[arrayIndex].asciiValue else {
            return result
        }
        result *= 10
        if flag != 2 {
            result += Int(ascii) - 48
        } else {
            result -= Int(ascii) - 48
        }
        arrayIndex += 1
    }
    return result
}
I also fixed the loop
How to convert an Int to a Character in Swift
You can't convert an integer directly to a Character instance, but you can go from integer to UnicodeScalar to Character and back again:
let startingValue = Int(("A" as UnicodeScalar).value) // 65
for i in 0 ..< 26 {
    print(Character(UnicodeScalar(i + startingValue)))
}
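Note that in Swift 3 and later, UnicodeScalar's integer initializers are failable (an arbitrary UInt32 is not necessarily a valid scalar), so the loop above needs unwrapping. A sketch of the modern form:

```swift
let startingValue = Int(("A" as UnicodeScalar).value) // 65
// UnicodeScalar(UInt32) is failable in Swift 3+, hence the force-unwrap:
let alphabet = (0..<26).map { Character(UnicodeScalar(UInt32($0 + startingValue))!) }
// String(alphabet) == "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
```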