Strange Swift Numbers Type Casting

Yes, I also found this quite surprising. Double conforms to both FloatLiteralConvertible and IntegerLiteralConvertible (ExpressibleByFloatLiteral and ExpressibleByIntegerLiteral in Swift 3). Therefore a Double can be initialized with a floating point literal:

let a = 3.0

or with an integer literal:

let b : Double = 10

(The same is true for other floating point types like Float and
CGFloat.)
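
For example (a quick sketch; on Apple platforms CGFloat comes from CoreGraphics):

import CoreGraphics

let f : Float = 10      // integer literal in a Float context
let g : CGFloat = 10    // integer literal in a CGFloat context
let h : Float = 2.5     // float literal works as well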

Now it might be unexpected for all of us with an (Objective-)C background
that both statements

let x : Double = 10/4     // x = 2.5 .  Really? Yes!
let y = 10/4 as Double // Same here ...

assign the value 2.5 to the variable. From the context, the result of the
division must be a Double, and Swift does not implicitly convert types.
Therefore / must be the floating point division operator

func /(lhs: Double, rhs: Double) -> Double

so the compiler creates both arguments as Doubles from the literals
"10" and "4". (If 10/4 were treated as the division of two integers
then the result would also be an integer, and that cannot be assigned
to a Double.)

Note that this is different from

let z = Double(10/4)   // z = 2.0 . (I just thought that I understood it &%$!?)

which does an integer division and converts the result to Double.
Double has an init(_ v: Int) constructor, and therefore 10/4
can be treated as the division of two integers here.
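
In fact, with no type context at all, the literals default to Int, and 10/4 is plain integer division:

let q = 10/4            // q = 2, inferred as Int
print(type(of: q))      // Int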

It really looks a bit strange if we summarize these results:

let x : Double = 10/4     // x = 2.5
let y = 10/4 as Double    // y = 2.5
let z = Double(10/4)      // z = 2.0

Now we can apply these results to your expression

(10 / 3.0) - (10 / 3)

The first part (10 / 3.0) can only be a Double, therefore -
must be the floating point subtraction operator

func -(lhs: Double, rhs: Double) -> Double

and thus (10 / 3) must also be a Double. Again, / must be the floating point division operator, so 10 and 3 are treated as Double constants.

Therefore the expression is equivalent to

(Double(10) / 3.0) - (Double(10) / Double(3))

and evaluates to 0.0. If you change the expression to

(10 / 3.0) - Double(10 / 3)

then the result is 0.333... because in this context, 10 / 3
is the division of two integer constants, as explained above.
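
Both variants side by side, e.g. in a playground:

let d1 = (10 / 3.0) - (10 / 3)          // every operand becomes a Double
let d2 = (10 / 3.0) - Double(10 / 3)    // 10 / 3 is integer division here
print(d1)   // 0.0
print(d2)   // 0.333...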

Why does Swift conversion work for floating point division?

1 and 2 are literals. They have no type unless you give them a type from context.

let n6 = (1 / 2) as Double

is essentially the same as

let n6: Double = 1 / 2

That is, you tell the compiler that the result is a Double. The compiler therefore searches for an overload of the / operator that returns a Double, finds the one that takes two Double operands, and consequently treats both literals as being of type Double.

On the other hand,

let n5 = Double(1 / 2)

is a conversion (or, more precisely, the initialization of a Double). That means the expression 1 / 2 is evaluated first, as integer division, and the result is then converted to Double.
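
The difference is easy to verify:

let n5 = Double(1 / 2)      // Int division first: 1 / 2 == 0, then converted
let n6 = (1 / 2) as Double  // both literals typed as Double
print(n5)   // 0.0
print(n6)   // 0.5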

Need help on type inference in Swift

When you write var v3 = 2+2.5 Swift has to infer the type of 2, a numeric literal compatible with both Int and Double. The compiler is able to do so because there's 2.5 in the same expression, which is a Double. Hence, the compiler concludes that 2 must also be a Double. It computes the sum and sets v3 to 4.5; no addition is performed at runtime.

When you write var v2 = 2 the compiler treats 2 as an Int, making v2 an Int as well. Now the addition in var v3 : Double = v1 + v2 fails, because v1 (a Double) and v2 (an Int) have mismatched types.

If you declare var v2:Double = 2 instead, the problem will be fixed.
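
A minimal sketch of the situation (the question's exact declarations aren't shown above, so v1 is assumed to be a Double):

var v1 = 2.5                  // inferred as Double
var v2 = 2                    // inferred as Int
// var v3 : Double = v1 + v2  // error: '+' cannot be applied to Double and Int

var v4 : Double = 2           // the literal now has a Double context
var v5 : Double = v1 + v4     // 4.5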

Understanding the Doubles and Ints in Swift

As you know, Swift is very strict with types, but there's one area where it's not so strict: literals. Double conforms to ExpressibleByIntegerLiteral, so you could do:

let x: Double = 1 // "1" is "magically" converted to a Double!?

and have it compile. The same applies to arrays: the compiler thinks that the array literal you have:

[50, 5.0, 10]

is a [Double], because it can convert both 50 and 10 to Double. It can't be an [Int], because 5.0 can't be converted to an Int (Int does not conform to ExpressibleByFloatLiteral).

The line:

total += amounts[i]

only works when both sides are of the same type. Note that here, the compiler will not try to convert from Int to Double because the expressions involved (total and amounts[i]) are not literals!

If you change the array literal to [50, 10, 10], all elements are Ints, so the compiler infers the array to be [Int], and amounts[i] becomes an Int, causing the line to fail compilation.
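
Putting it together (a small sketch, assuming names like those in the question):

let amounts = [50, 5.0, 10]     // inferred as [Double] because of 5.0
var total = 0.0                 // Double

for i in 0..<amounts.count {
    total += amounts[i]         // Double += Double, compiles
}
print(total)                    // 65.0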

How does Swift's int literal to float inference work?

Literals don't have a type as such. The docs say:

If there isn’t suitable type information available, Swift infers that the
literal’s type is one of the default literal
types defined in the Swift standard library. The default types are Int
for integer literals, Double for floating-point literals, String for
string literals, and Bool for Boolean literals.

So unless the context explicitly requires a type other than Int, Swift will infer integer literals as Int.
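
For instance:

let i = 1                 // no other context: defaults to Int
let d = 1.5               // defaults to Double
print(type(of: i))        // Int
print(type(of: d))        // Double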


For more information, refer to Lexical Structure - Literals.

CGFloat variable returns 0.0

If you leave out the explicit conversion, this works as expected:

let progress: CGFloat = 2 / 3
print(progress) //0.666666666666667

The reason this does not work with an explicit conversion is that Swift treats a whole-number literal as an Int when it has no other type context.

That is exactly what happens inside the conversion's parentheses: 2 / 3 is evaluated as integer division, yielding 0, and only then is the result converted to CGFloat.
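
Both cases side by side (a short sketch; CGFloat needs an import on Apple platforms):

import CoreGraphics

let works : CGFloat = 2 / 3     // literals typed as CGFloat
let broken = CGFloat(2 / 3)     // 2 / 3 is Int division first: CGFloat(0)
print(works)    // 0.666...
print(broken)   // 0.0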


