Could Not Find an Overload for "Init" That Accepts The Supplied Arguments (Swift)

Could not find an overload for 'init' that accepts the supplied arguments

Array subscript takes an Int, not a UInt32. You need to convert back:

let x = jokes[Int(arc4random_uniform(UInt32(jokes.count)))]

If this gets a bit noisy for you, you may want to create a function to handle it:

func randomValueLessThan(x: Int) -> Int {
    return Int(arc4random_uniform(UInt32(x)))
}

Or you could extend Array to help out:

extension Array {
    func uniformSelection() -> T {
        return self[Int(arc4random_uniform(UInt32(self.count)))]
    }
}

EDIT: It is worth digging a little deeper into the Int(arc4random()%jokes.count) case because you might be tempted to fix it incorrectly, and it demonstrates why Swift works the way it does.

Let's start with your version:

let n = Int(arc4random() % jokes.count)
// => Could not find an overload for 'init' that accepts the supplied arguments

That's a little confusing. Let's simplify to see the problem:

let n = arc4random() % jokes.count
// => Cannot invoke '%' with an argument list of type '(UInt32, Int)'

That should be clearer. arc4random returns a UInt32 and jokes.count returns an Int. You can't take the modulus of two different types. You need to get them to the same type. Well, we want an Int, right? Seems easy:

let n = Int(arc4random()) % jokes.count   // WARNING!!!! Never ever do this!!!!

Why is Apple so pedantic and forces us to do that by hand? Couldn't the compiler just cast it automatically? Well, the above code will work fine on a 64-bit processor and crash about half the time on a 32-bit processor. That's because by calling Int(), you're promising that the value will always be in the Int range. And on a 32-bit processor, that's the 32-bit signed range. But arc4random returns a value in the whole 32-bit unsigned range, which includes lots of numbers that won't fit in an Int. Blam! (Or if you turn off bounds checking, then it just trashes your numbers like it does in C, which is no better.)

That's why Swift is picky about integer conversions. You need to be absolutely certain when you convert that it's a safe conversion. You shouldn't just sprinkle them around until it compiles.
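You can see that boundary without crashing by using the failable `exactly:` initializers (added in Swift 3), which return nil instead of trapping when a value doesn't fit:

```swift
// 3,000,000,000 fits in a UInt32 but is larger than Int32.max (2,147,483,647).
let big: UInt32 = 3_000_000_000
let asInt32 = Int32(exactly: big)  // nil -- the value would overflow
let asInt64 = Int64(exactly: big)  // Optional(3000000000) -- plenty of room
```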

That said, of course you should never use modulus on arc4random. But that's a different question.
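The modulus problem is worth a quick illustration. Suppose a generator returned a uniform 8-bit value (0 through 255) and you took it mod 6: 256 inputs can't divide evenly among 6 remainders, so some remainders come up more often than others. A sketch:

```swift
// Count how often each remainder 0...5 occurs when every
// uniform 8-bit value (0...255) is reduced mod 6.
var counts = [Int](repeating: 0, count: 6)
for value in 0...255 {
    counts[value % 6] += 1
}
print(counts)  // [43, 43, 43, 43, 42, 42] -- remainders 0...3 are slightly more likely
```

arc4random_uniform avoids this bias internally, which is another reason to prefer it over arc4random() % n.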

As one more side note, you'll notice that the numeric casting in randomValueLessThan() creates many possible invalid situations. If you pass a number less than 0 you'll crash (that shouldn't be surprising, though). If you pass a number greater than UInt32.max, you'll also crash, which is slightly more surprising, but pretty unlikely in most code. The point of this is that by adding these casts, we've made randomValueLessThan a partial function. It's not defined over all of its range of inputs (its "domain"). In "real life" programming, we do that all the time and we just hope it never bites us. But Swift is trying to help us get bitten even less by making it more obvious when you're breaking type safety.

uniformSelection has a similar problem. It's only defined for arrays with fewer than UInt32.Max elements. These sometimes feel like meaningless corner cases, and they are, until suddenly they aren't and your program crashes. (Array.subscript is also a partial function, since it is undefined for values outside the array's range. I actually suggested to Apple that Array.subscript return an optional to account for that. They were probably wise to ignore me.)
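If you want a total version, one option is to return an optional. This is only a sketch: the method name is mine, and it uses `Element` (Array's element type in modern Swift, where this answer's `T` used to be) plus `Int.random(in:)` from Swift 4.2, which sidesteps the UInt32 casts entirely:

```swift
extension Array {
    // Total variant of uniformSelection: returns nil for an empty array
    // instead of trapping, and Int.random(in:) avoids the UInt32 casts.
    func safeUniformSelection() -> Element? {
        guard !isEmpty else { return nil }
        return self[Int.random(in: 0..<count)]
    }
}

let jokes = ["knock knock", "a horse walks into a bar"]
let pick = jokes.safeUniformSelection()  // some element, or nil if the array were empty
```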

Could not find an overload for 'init' that accepts the supplied arguments

The error message is misleading. This should work:

var w = Int(self.bounds.size.width / CGFloat(worldSize.width))
var h = Int(self.bounds.size.height / CGFloat(worldSize.height))

The width and height elements of CGSize are declared as CGFloat. On 64-bit platforms, CGFloat is the same size as Double (64 bits), whereas Float is only 32 bits.

So the problem is the division operator, which requires two operands of the same type. In contrast to (Objective-)C, Swift never implicitly converts values to a different type.

If worldSize is also a CGSize then you do not need a cast at all:

var w = Int(self.bounds.size.width / worldSize.width)
var h = Int(self.bounds.size.height / worldSize.height)
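One thing to keep in mind: Int(_:) on a floating-point value truncates toward zero rather than rounding, so w and h here are the quotients with their fractional parts dropped. For example:

```swift
let w = Int(7.9)    // 7 -- Int(_:) truncates toward zero, it does not round
let h = Int(-7.9)   // -7, not -8
```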

Could not find an overload for “init” that accepts the supplied arguments SWIFT

UIFont(name:size:) is now a failable initializer -- it returns nil if it can't find that font, and force-unwrapping that nil return value will crash your app. Use optional binding to safely get the font and use it:

if let font = UIFont(name: "HelveticaNeue-Light", size: 20) {
    self.navigationController?.navigationBar.titleTextAttributes =
        [NSFontAttributeName: font,
         NSForegroundColorAttributeName: UIColor.whiteColor()]
}

Could not find an overload for “init” that accepts the supplied arguments (swift)

NSString has an initializer overload that takes a format string and a Double:

let s = NSString(format: "%.2f", counter)
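If you'd rather keep a Swift String, the same format initializer is available on String via Foundation:

```swift
import Foundation

let counter = 3.14159
let s = String(format: "%.2f", counter)
print(s)  // "3.14"
```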

Could not find an overload for 'init' that accepts the supplied arguments

The backslash belongs before the opening parenthesis in string interpolation. This:

resultLabel.text = "(\area)"

should be:

resultLabel.text = "\(area)"

String interpolation creates the string representation of area for you.

Could not find an overload for '/' that accepts the supplied arguments

There are no such implicit conversions in Swift, so you'll have to explicitly convert that yourself:

average = Double(total) / Double(numbers.count)

From The Swift Programming Language: “Values are never implicitly converted to another type.” (Section: A Swift Tour)

But you're now using Swift, not Objective-C, so try to think in a more functional way. Your function can be written like this:

func getAverage(numbers: Int...) -> Double {
    let total = numbers.reduce(0, combine: { $0 + $1 })
    return Double(total) / Double(numbers.count)
}

reduce takes a first parameter as an initial value for an accumulator variable, then applies the combine function to the accumulator variable and each element in the array. Here, we pass an anonymous function that uses $0 and $1 to denote the first and second parameters it gets passed and adds them up.

Even more concisely, you can write this: numbers.reduce(0, +).

Note how type inference does a nice job of still finding out that total is an Int.
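Putting it together with the concise spelling (note that Swift 3 dropped the combine: label and requires the numbers: argument label at the call site):

```swift
func getAverage(numbers: Int...) -> Double {
    let total = numbers.reduce(0, +)  // sum all the variadic arguments
    return Double(total) / Double(numbers.count)
}

let avg = getAverage(numbers: 1, 2, 3, 4)
print(avg)  // 2.5
```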

Could not find an overload for 'init' that accepts the supplied arguments

Because UIImage(named:) returns an Optional, and Optional&lt;UIImage&gt; is not convertible to AnyObject.

You can force unwrap them:

var segmentedControlImages: [AnyObject] = [
    UIImage(named: "likeIcon")!,
    UIImage(named: "dislikeIcon")!
]
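Force-unwrapping will crash at runtime if either asset is missing from the bundle. A safer sketch using compactMap (Swift 4+; earlier versions spelled it flatMap) that simply drops missing images:

```swift
import UIKit

let names = ["likeIcon", "dislikeIcon"]
// compactMap discards nil results, so a missing asset shrinks
// the array instead of crashing the app.
let segmentedControlImages: [UIImage] = names.compactMap { UIImage(named: $0) }
```

Whether silently dropping a missing image is better than crashing depends on your app; a crash during development at least makes the missing asset obvious.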

