Could Not Find an Overload for "Init" That Accepts the Supplied Arguments in Swift

UIFont(name:size:) is now a failable initializer: it returns nil if it can't find the font, and force-unwrapping that nil return value will crash your app. Use optional binding to safely get the font and use it:

if let font = UIFont(name: "HelveticaNeue-Light", size: 20) {
    self.navigationController?.navigationBar.titleTextAttributes =
        [NSFontAttributeName: font,
         NSForegroundColorAttributeName: UIColor.whiteColor()]
}
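
If a fallback font is acceptable, the nil-coalescing operator is an alternative to optional binding. A minimal sketch, assuming the system font is an acceptable fallback:

let font = UIFont(name: "HelveticaNeue-Light", size: 20)
    ?? UIFont.systemFontOfSize(20)

This gives you a non-optional UIFont either way, so no if let is needed.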

Could not find an overload for 'init' that accepts the supplied arguments

resultLabel.text = "\(area)"

The string interpolation syntax is \(expression): the backslash goes before the opening parenthesis. This creates a string representation of area, which is what the text property expects.

Could not find an overload for '/' that accepts the supplied arguments

There are no such implicit conversions in Swift, so you'll have to explicitly convert that yourself:

average = Double(total) / Double(numbers.count)

From The Swift Programming Language: “Values are never implicitly converted to another type.” (Section: A Swift Tour)

But you're now using Swift, not Objective-C, so try to think in a more functional style. Your function can be written like this:

func getAverage(numbers: Int...) -> Double {
    let total = numbers.reduce(0, combine: { $0 + $1 })
    return Double(total) / Double(numbers.count)
}

reduce takes an initial value for an accumulator as its first parameter, then applies the combine function to the accumulator and each element of the array in turn. Here, we pass an anonymous function that uses $0 and $1 to denote the first and second parameters it gets passed, and adds them up.

Even more concisely, you can write this: numbers.reduce(0, +).

Note how type inference does a nice job of still finding out that total is an Int.
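
To see what reduce is doing, here is a hypothetical trace for the input 1, 2, 3 (the intermediate accumulator values are shown as comments):

let numbers = [1, 2, 3]
// reduce starts with the accumulator at 0, then folds in each element:
// step 1: 0 + 1 = 1
// step 2: 1 + 2 = 3
// step 3: 3 + 3 = 6
let total = numbers.reduce(0, combine: { $0 + $1 })  // inferred as Int
let average = Double(total) / Double(numbers.count)  // 2.0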

Could not find an overload for '-' that accepts the supplied arguments

Use Int, not Integer.

Integer is a protocol, not a concrete type, so you cannot use it directly as the type of a variable or parameter.
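
A minimal sketch of the difference (the variable names are illustrative):

// This would not compile: the protocol cannot be used as a
// concrete variable type here, and '-' is not defined on it.
// var count: Integer = 5

// Use the concrete type instead:
var count: Int = 5
let remaining = count - 2  // 3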

Could not find an overload for 'init' that accepts the supplied arguments

The error message is misleading. This should work:

var w = Int(self.bounds.size.width / CGFloat(worldSize.width))
var h = Int(self.bounds.size.height / CGFloat(worldSize.height))

The width and height elements of CGSize are declared as CGFloat. On 64-bit platforms, CGFloat is the same as Double and is 64 bits wide, whereas Float is only 32 bits.

So the problem is the division operator, which requires two operands of the same type. In contrast to (Objective-)C, Swift never implicitly converts values to a different type.

If worldSize is also a CGSize then you do not need a cast at all:

var w = Int(self.bounds.size.width / worldSize.width)
var h = Int(self.bounds.size.height / worldSize.height)

Swift: Could not find an overload for '|' that accepts the supplied arguments

Use the .value property of each field, combine the results with the bitwise OR operator, then use that to create a PFLogInFields instance:

logInViewController.fields = PFLogInFields(
    PFLogInFieldsUsernameAndPassword.value | PFLogInFieldsLogInButton.value)
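
In later versions of the Parse SDK, PFLogInFields is imported into Swift as an option set, so (assuming such a version) you can combine fields with array literal syntax instead of bitwise OR:

// Assumes a Parse SDK version where PFLogInFields is bridged
// as a Swift option set (case names may differ by SDK version).
logInViewController.fields = [.UsernameAndPassword, .LogInButton]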

