Swift 3 type inference confusion

Updated Answer - For macOS

With Xcode 8 beta 6, Swift no longer implicitly bridges Swift value types to Foundation class types. This means that if a function expects an NSNumber and you pass it an Int variable, you have to cast it to NSNumber explicitly. This is not necessary for an integer literal, because Swift can still infer the literal's type as NSNumber.
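A minimal sketch of the difference (the function name is illustrative):

import Foundation

func takesNumber(_ n: NSNumber) { }

takesNumber(20)                  // OK: the literal is inferred as NSNumber
let a = 20                       // a is inferred as Int
// takesNumber(a)                // error: cannot convert 'Int' to 'NSNumber'
takesNumber(a as NSNumber)       // OK: explicit bridging cast
takesNumber(NSNumber(value: a))  // OK: explicit initializer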

Why does 1 compile, but 2 does not?

1 compiles because Swift is able to infer the type of 20 to be NSNumber, so ["a": 20] works as a [String: NSNumber].

2 doesn't compile, because the type of a is already established as Int, so you need to explicitly convert that to NSNumber. Xcode's fix-it suggests NSNumber(a), but sadly that doesn't compile. Use NSNumber(value: a) or a as NSNumber.

Why do 2 and 3 have different error messages?

For 2, you are providing a dictionary literal ["a": a], so Swift examines the type of each key and value to see whether it matches the types the dictionary expects. Since a is an Int and the expected value type is NSNumber, you get the error Cannot convert value of type 'Int' to expected dictionary value type 'NSNumber'. It wants you to provide the conversion.

For 3, you are providing a variable of type [String: Int]. Swift tells you that it can't convert that to [String: NSNumber]. It can, but not without an explicit cast, due to the change in Xcode 8 beta 6.

Why does 4 compile, but 5 does not?

4 compiles because you are now providing the explicit cast to [String: NSNumber] that 3 lacked.

5 does not compile because again you are providing a dictionary literal and Swift examines each of the keys and values to make sure they are the right types. It will not convert the Int to an NSNumber without an explicit cast, so the error here is Cannot convert value of type 'Int' to expected dictionary value type 'NSNumber'. The point is that Swift will not cast the individual keys and values of a dictionary literal when you cast it to a dictionary type. You have to provide that cast directly for each one.
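Reconstructed, the five statements look roughly like this (the original numbered code isn't shown in this excerpt):

import Foundation

let a = 20
let met = ["a": a]                           // inferred as [String: Int]

let d1: [String: NSNumber] = ["a": 20]       // 1: OK, the literal is inferred as NSNumber
// let d2: [String: NSNumber] = ["a": a]     // 2: error, a is already an Int
// let d3: [String: NSNumber] = met          // 3: error, needs an explicit cast
let d4 = met as [String: NSNumber]           // 4: OK, whole-dictionary cast
// let d5 = ["a": a] as [String: NSNumber]   // 5: error, literal values aren't cast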



Previous Answer - For iOS

With Xcode 8 beta 6, the type of the argument metrics has changed to [String: Any]?. Now, the first 4 examples compile, and the 5th does not. Your first two questions are no longer valid. The only question left is:

Why does 4 compile, but 5 does not?

Statement 4 (met as [String: NSNumber]) compiles because met has type [String: Int] and Swift can cast [String: Int] to [String: NSNumber]. In this case, it looks at the dictionary as a whole. Swift knows how to convert an Int to an NSNumber, but it won't do so unless you ask it to explicitly, and by casting the whole dictionary to [String: NSNumber] you are asking it to convert each Int value to an NSNumber.

In statement 5, you are casting a dictionary literal ["a": a] to a dictionary type as [String: NSNumber]. The error message is:

Cannot convert value of type 'Int' to expected dictionary value type 'NSNumber'

In this case, Swift is looking at the individual types, checking that "a" is a String and a is an NSNumber. Casting a dictionary literal to a type does not explicitly cast each key and value to the corresponding type; you are merely presenting them and asserting that they are already those types. Due to the change in Xcode 8 beta 6, Swift no longer implicitly converts Swift value types to bridged Foundation types, so it wants you to explicitly convert the Int a to an NSNumber.

There are two ways to make Swift happy:

["a": NSNumber(value: a)] as [String: NSNumber]
["a": a as NSNumber] as [String: NSNumber]

Of course, in both cases the dictionary literal can now be inferred to be [String: NSNumber], so the cast is unnecessary.

Also, since metrics is now [String: Any], it makes no sense to convert ["a": a] to [String: NSNumber] when [String: Int] would do.
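For instance, with the Auto Layout API whose metrics parameter this refers to, a plain [String: Int] dictionary now passes straight through (a minimal sketch; the view names are illustrative):

import UIKit

let container = UIView()
let subview = UIView()
subview.translatesAutoresizingMaskIntoConstraints = false
container.addSubview(subview)

let a = 20
let constraints = NSLayoutConstraint.constraints(
    withVisualFormat: "V:|-a-[subview]-|",
    options: [],
    metrics: ["a": a],               // [String: Int] is fine now that metrics is [String: Any]?
    views: ["subview": subview])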

Swift type inference and basic addition

The + operator does not support adding a Double and an Int together in this way.

If you change your code to make sure wholeNumber is a Double, then it'll work:

let partNumber = 3.2
let wholeNumber: Double = 2
let result = partNumber + wholeNumber
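
If wholeNumber has to stay an Int (say it comes from elsewhere), convert it explicitly at the call site instead:

let partNumber = 3.2
let wholeNumber = 2                           // inferred as Int
let result = partNumber + Double(wholeNumber) // explicit conversion to Double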

Swift type inference in methods that can throw and cannot

Obviously you should never, ever write code like this. It has way too many ways it can bite you, and as you've seen, it does. But let's see why.

First, try is just a decoration in Swift. It's not for the compiler. It's for you. The compiler works out all the types, and then determines whether a try was necessary. It doesn't use try to figure out the types. You can see this in practice here:

class X {
    func x() throws -> X {
        return self
    }
}

let y = try X().x().x()

You only need try one time, even though there are multiple throwing calls in the chain. Imagine how this would work if you'd created overloads on x() based on throws vs non-throws. The answer is "it doesn't matter" because the compiler doesn't care about the try.
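
The examples below refer to a myClass whose definition isn't shown in this excerpt. Judging from the error messages, the overloads presumably look something like this (a reconstruction, not the asker's exact code):

class MyClass {
    func fun1() throws -> Any { return "any" }  // throwing overload returning Any
    func fun1() -> Int { return 1 }             // non-throwing overload returning Int
}

let myClass = MyClass()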

Next there's the issue of type inference vs type coercion. This is type inference:

let result1 = myClass.fun1() // error: ambiguous use of 'fun1()'

Swift will never infer an ambiguous type. This could be Any or it could be Int, so it gives up.

This is not type inference (the type is known):

let result1: Int = myClass.fun1() // OK

This also has a known, unambiguous type (note no ?):

let x : Any = try myClass.fun1()

But this requires type coercion (much like your print example):

let x : Any = try? myClass.fun1() // Expression implicitly coerced from `Int?` to `Any`
// No calls to throwing function occur within 'try' expression

Why does this call the Int version? try? returns an Optional (which can itself be coerced to Any). So Swift has a choice here: treat the expression as returning Int? and coerce that to Any, or treat it as returning Any? and coerce that to Any. Swift pretty much always prefers real types to Any (and it properly hates Any?). This is one of the many reasons to avoid Any in your code; it interacts with Optional in bizarre ways. It's arguable that this should be an error instead, but Any is such a squirrelly type that it's very hard to nail down all its corner cases.

So how does this apply to print? The parameter of print is Any, so this is like the let x: Any = ... example rather than the let x = ... example.

A few automatic coercions to keep in mind when thinking about these things:

  • Every T can be trivially coerced to T?
  • Every T can be explicitly coerced to Any
  • Every T? can also be explicitly coerced to Any

    • Any can be trivially coerced to Any? (also Any??, Any???, and Any????, etc)
    • Any? (Any??, Any???, etc) can be explicitly coerced to Any
  • Every non-throwing function can be trivially coerced to a throwing version

    • So overloading purely on "throws" is dangerous (see the sketch below)
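
A minimal sketch of those last two points:

func nonThrowing() -> Int { return 1 }

// A non-throwing function coerces trivially to a throwing function type...
let f: () throws -> Int = nonThrowing

// ...and once the type is throwing, try? wraps the result in an Optional,
// even though nothing here can actually throw.
let g = try? f()   // g: Int? = 1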

By mixing throws/non-throws conversions with Any/Any? conversions, and throwing try? into the mix (which promotes everything into an Optional), you've created a perfect storm of confusion.

Obviously you should never, ever write code like this.

Ambiguous type inference

I'm going with this being a compiler bug and have opened a bug report. For anyone else experiencing a similar issue, I've managed to work around it by removing the Self requirement from the Generic extension:

class Root { }

protocol Generic {
    associatedtype RootType: Root
    var root: RootType { get }
}

extension Generic where Self: Root {
    // ===================
    // don't use Self here
    // ===================
    var root: Root {
        return self
    }
}

class GenericWrapper<T: Generic>: Root {
    var generic: T

    init(generic: T) {
        self.generic = generic
    }
}

protocol Specialised { }

extension Specialised where Self: Generic {
    var root: GenericWrapper<Self> {
        return GenericWrapper(generic: self)
    }
}

class SpecialisedImplementation: Generic, Specialised {
    // no errors!
}

What is the difference between Type Safety and Type Inference?

From Swift's own documentation:

Type Safety

Swift is a type-safe language. A type safe language encourages you to be clear about the types of values your code can work with. If part of your code expects a String, you can’t pass it an Int by mistake.

var welcomeMessage: String
welcomeMessage = 22 // this would create an error because you
                    // already specified that it's going to be a String

Type Inference

If you don’t specify the type of value you need, Swift uses type inference to work out the appropriate type. Type inference enables a compiler to deduce the type of a particular expression automatically when it compiles your code, simply by examining the values you provide.

var meaningOfLife = 42 // meaningOfLife is inferred to be of type Int
meaningOfLife = 55 // it works, because 55 is an Int

Type Safety & Type Inference together

var meaningOfLife = 42 // 'Type inference' happened here; we didn't specify that this is an Int, the compiler itself found out
meaningOfLife = 55 // it works, because 55 is an Int
meaningOfLife = "SomeString" // Because of 'Type Safety' you will get an
                             // error message: cannot assign value of type 'String' to type 'Int'


Tricky example for protocols with associated types:

Imagine the following protocol

protocol Identifiable {
    associatedtype ID
    var id: ID { get set }
}

You would adopt it like this:

struct Person: Identifiable {
    typealias ID = String
    var id: String
}

However you can also adopt it like this:

struct Website: Identifiable {
    var id: URL
}

You can remove the typealias; the compiler infers that ID is URL from the declared type of id.

For more see Generics - Associated Types

Thanks to Swift’s type inference, you don’t actually need to declare a concrete Item of Int as part of the definition of IntStack. Because IntStack conforms to all of the requirements of the Container protocol, Swift can infer the appropriate Item to use, simply by looking at the type of the append(_:) method’s item parameter and the return type of the subscript. Indeed, if you delete the typealias Item = Int line from the code above, everything still works, because it’s clear what type should be used for Item.
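
For context, the Container and IntStack types that quote refers to look roughly like this (adapted from the Swift book's generics chapter):

protocol Container {
    associatedtype Item
    mutating func append(_ item: Item)
    var count: Int { get }
    subscript(i: Int) -> Item { get }
}

struct IntStack: Container {
    var items = [Int]()
    // Item is inferred to be Int from append(_:) and the subscript
    mutating func append(_ item: Int) {
        items.append(item)
    }
    var count: Int {
        return items.count
    }
    subscript(i: Int) -> Int {
        return items[i]
    }
}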

Type-safety and Generics

Suppose you have the following code:

struct Helper<T: Numeric> {
    func adder(_ num1: T, _ num2: T) -> T {
        return num1 + num2
    }
    var num: T
}

T can be anything that's numeric e.g. Int, Double, Int64, etc.

However, as soon as you write let h = Helper(num: 10), the compiler will assume that T is an Int. It won't accept Double or Int64 for its adder function anymore; it will only accept Int.

This is again because of type-inference and type-safety.

  • type-inference: because it has to infer that the generic is of type Int.
  • type-safety: because once T is set to be of type Int, it will no longer accept Int64, Double...

The signature is now effectively specialized to accept only parameters of type Int, as the sketch below shows:
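
let h = Helper(num: 10)   // T is inferred to be Int here
let sum = h.adder(3, 5)   // OK: adder is now effectively (Int, Int) -> Int
// h.adder(3.0, 5.0)      // error: cannot convert value of type 'Double'
//                        // to expected argument type 'Int' (wording varies by Swift version)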

Pro tip to optimize compiler performance:

The less type inference your code has to do, the faster it compiles. Hence it's recommended to annotate the types of collection literals rather than leaving them to inference. And the longer a collection literal gets, the slower its type inference becomes...

not bad

let names = ["John", "Ali", "Jane", " Taika"]

good

let names : [String] = ["John", "Ali", "Jane", " Taika"]

For more see this answer.

Also see Why is Swift compile time so slow?

The solution helped his compilation time go down from 10-15 seconds to a single second.

Confusion due to Swift lacking implicit conversion of CGFloat

This is a problem with Double-to-Float conversion.

On a 64-bit machine, CGFloat is defined as Double, and the code compiles without problems because M_PI and x are both Doubles.

On a 32-bit machine, CGFloat is defined as Float, but M_PI is still a Double. Unfortunately, there are no implicit casts in Swift, so you have to cast explicitly:

return (CGFloat(M_PI) * (x) / 180.0)

The type of the 180.0 literal is inferred.

In Swift 3

M_PI is deprecated, use CGFloat.pi instead:

return (x * .pi / 180.0)
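
Wrapped up as a function, a minimal sketch (the helper name is illustrative, not from the original question):

import CoreGraphics

// Converts degrees to radians; .pi is inferred as CGFloat.pi from context,
// so this compiles on both 32-bit and 64-bit platforms.
func degreesToRadians(_ x: CGFloat) -> CGFloat {
    return x * .pi / 180.0
}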

Type Inference of IIFE in Swift

I believe I've seen something referring to Swift's implementation of this construct as "called closures", but I'm failing to find it now.

Note that your first example:

let one = { 1 }

... is not a called closure. It defines a closure, but does not invoke it. Because it has no parameters, you call it at the same time you define it by placing an empty parameter list after the braces:

let one = { 1 }()

You should be able to do this with type inference, too:

let one = { $0 }(1)  // one: (Int) = 1

let i = 1 // i: Int = 1
let one = { $0 }(i) // one: (Int) = 1

These work for me on beta 5, though not if i is declared with var instead of let. (Any time you see the compiler or SourceKit crash it's probably good to file a bug.)


Called closures can be great for setting up lazily-initialized stored properties -- the first get against the property runs the closure. For example, you'll notice them in the Xcode project templates for setting up a Core Data stack:

lazy var managedObjectContext: NSManagedObjectContext? = {
    // Returns the managed object context for the application (which is already
    // bound to the persistent store coordinator for the application). This
    // property is optional since there are legitimate error conditions that
    // could cause the creation of the context to fail.
    let coordinator = self.persistentStoreCoordinator
    if coordinator == nil {
        return nil
    }
    var managedObjectContext = NSManagedObjectContext()
    managedObjectContext.persistentStoreCoordinator = coordinator
    return managedObjectContext
}()

In this case the type has to be explicitly given -- because the closure can return nil, its return type has to be an optional, but the type checker can't tell what kind of optional.
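
The same issue in miniature, as a sketch:

// let x = { return nil }()       // error: the closure's return type can't be inferred
let y: Int? = { return nil }()    // OK: the annotation pins down which Optional it is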


