Could not find an overload for 'init' that accepts the supplied arguments
Array subscript takes an Int, not a UInt32. You need to convert back:
let x = jokes[Int(arc4random_uniform(UInt32(jokes.count)))]
If this gets a bit noisy for you, you may want to create a function to handle it:
func randomValueLessThan(x: Int) -> Int {
    return Int(arc4random_uniform(UInt32(x)))
}
Or you could extend Array to help out:
extension Array {
    func uniformSelection() -> Element {
        return self[Int(arc4random_uniform(UInt32(self.count)))]
    }
}
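As a side note, if you're on Swift 4.2 or later, the standard library already provides cross-platform equivalents of both helpers, with no UInt32 casts at all. A sketch (the jokes array here is just sample data):

```swift
let jokes = ["knock knock", "two peanuts", "a horse walks in"]

// Int.random(in:) replaces the arc4random_uniform dance and works on
// any platform; it traps only if the range itself is empty.
let n = Int.random(in: 0..<jokes.count)
let x = jokes[n]

// randomElement() returns nil for an empty array instead of crashing,
// i.e. it is a total function, unlike the subscript-based version above.
if let joke = jokes.randomElement() {
    print(joke)
}
```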
EDIT: It is worth digging a little deeper into the Int(arc4random() % jokes.count) case, because you might be tempted to fix it incorrectly, and it demonstrates why Swift works the way it does.
Let's start with your version:
let n = Int(arc4random() % jokes.count)
// => Could not find an overload for 'init' that accepts the supplied arguments
That's a little confusing. Let's simplify to see the problem:
let n = arc4random() % jokes.count
// => Cannot invoke '%' with an argument list of type '(UInt32, Int)'
That should be clearer. arc4random returns a UInt32 and jokes.count returns an Int. You can't take the modulus of two different types; you need to get them into the same type. Well, we want an Int, right? Seems easy:
let n = Int(arc4random()) % jokes.count // WARNING!!!! Never ever do this!!!!
Why is Apple so pedantic, forcing us to do that by hand? Couldn't the compiler just cast it automatically? Well, the above code will work fine on a 64-bit processor and crash about half the time on a 32-bit processor. That's because by calling Int(), you're promising that the value will always be in the Int range. And on a 32-bit processor, that's the 32-bit signed range. But arc4random returns a value in the whole 32-bit unsigned range, which includes lots of numbers that won't fit in an Int. Blam! (Or if you turn off bounds checking, then it just trashes your numbers like it does in C, which is no better.)
That's why Swift is picky about integer conversions. You need to be absolutely certain when you convert that it's a safe conversion. You shouldn't just sprinkle them around until it compiles.
That said, of course you should never use modulus on arc4random in the first place, because it introduces modulo bias. But that's a different question.
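The problem with modulus is modulo bias: when the generator's range is not an exact multiple of the divisor, the smaller remainders come up more often. A deterministic miniature of the effect, shrinking the generator's range from 2^32 down to 10 for illustration:

```swift
// A "generator" that produces 0..<10 uniformly, reduced mod 3.
// 10 is not a multiple of 3, so remainder 0 occurs four times
// while remainders 1 and 2 occur only three times each.
var counts = [0, 0, 0]
for value in 0..<10 {
    counts[value % 3] += 1
}
print(counts)  // [4, 3, 3]
```

arc4random_uniform (and Int.random(in:) in modern Swift) avoid this skew by rejecting values from the uneven tail of the generator's range.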
As one more side note, you'll notice that the numeric casting in randomValueLessThan() creates many possible invalid situations. If you pass a number less than 0, you'll crash (that shouldn't be surprising, though). If you pass a number greater than UInt32.max, you'll also crash, which is slightly more surprising, but pretty unlikely in most code. The point is that by adding these casts, we've made randomValueLessThan a partial function: it's not defined over its whole range of inputs (its "domain"). In "real life" programming we do that all the time and just hope it never bites us. But Swift is trying to help us get bitten even less by making it more obvious when you're breaking type safety.
uniformSelection has a similar problem: it's only defined for arrays with fewer than UInt32.max elements. These sometimes feel like meaningless corner cases, and they are, until suddenly they aren't and your program crashes. (Array.subscript is also a partial function, since it is undefined for indices outside the array's range. I actually suggested to Apple that Array.subscript return an optional to account for that. They were probably wise to ignore me.)
Swift error: Could not find an overload for '==' that accepts the supplied arguments?
You can do if (status == .NotReachable) or if (status == NetworkStatus.NotReachable).
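A minimal sketch of both spellings, using a stand-in NetworkStatus enum (the real one comes from Apple's Reachability sample code):

```swift
enum NetworkStatus {
    case NotReachable
    case ReachableViaWiFi
    case ReachableViaWWAN
}

let status = NetworkStatus.NotReachable

// The type of `status` is known, so the enum name can be omitted...
if status == .NotReachable {
    print("offline")
}

// ...or spelled out in full.
if status == NetworkStatus.NotReachable {
    print("still offline")
}
```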
Could not find an overload for '*' that accepts the supplied argument
Swift is fairly picky about implicit type conversion. In your example you're multiplying str (an Int) by 0.01 (a Double), so to resolve the error you need to cast it like this:
var str: Int = 0
var pennyCount = 0.00
str = pennyTextField.text.toInt()!
pennyCount = Double(str) * 0.01
Could not find an overload for '-' that accepts the supplied arguments
Use Int, not Integer. Integer is a protocol, not a type.
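A minimal sketch of the fix; the values are placeholders:

```swift
// Integer (renamed BinaryInteger in Swift 4) is a protocol describing
// integer-like types; Int is the concrete type to declare.
let a: Int = 10
let b: Int = 3
let difference = a - b  // '-' finds its overload now
print(difference)  // 7
```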
Simple math operations in Swift not working as they should
It's all a matter of where you apply the cast:
let timeInterval = -now.timeIntervalSinceNow
let seconds = Int(timeInterval) % 60
let minutes = (Int(timeInterval) - seconds) / 60
let centiSeconds = (timeInterval - floor(timeInterval)) * 100
The underlying issue is that now.timeIntervalSinceNow returns an NSTimeInterval, which is a floating-point type. As a result, the following is a compilation error:
let minutes = Int(timeInterval - seconds) / 60
However, the error Xcode reports, Could not find an overload for 'init' that accepts the supplied arguments, is misleading and is not the root cause here.
Swift error: Could not find an overload for '&&' that accepts the supplied arguments
If you look at the full compiler output in the Report Navigator, you will see the message
note: expression was too complex to be solved in reasonable time;
consider breaking up the expression into distinct sub-expressions
which tells you how to solve the problem.
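A sketch of what breaking up an expression looks like in practice, with hypothetical values and conditions standing in for the original ones:

```swift
let a = 1, b = 2, c = 3, d = 20

// A long chained condition can exceed the type checker's budget, because
// every +, -, <, >, and && is an overloaded operator the solver must
// resolve in one go:
//     if a + 1 > b - 2 && c + 3 < d - 4 && a < d && b < c { ... }

// Breaking it into named sub-expressions gives the checker several small,
// independent problems, and names the intent as a bonus:
let firstCheck = (a + 1) > (b - 2)
let secondCheck = (c + 3) < (d - 4)
let ordering = a < d && b < c

if firstCheck && secondCheck && ordering {
    print("all conditions hold")
}
```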
error - could not find an overload for '-' that accepts the supplied arguments
Change it to this:
var lastContentOffset = scrollView.contentOffset.y
if (lastContentOffset * -1) > 64 {
//do something
}
Function returns error about overloading operator
It is OK to use 230/255.0, because the type of the number is determined after the calculation. If you assign the two numbers to variables (or constants) first, you have to convert them before calculating. So there is no problem with 230/255.0. (And in fact, on 64-bit platforms CGFloat is a Double, not a Float.)
It seems there is a fatal bug when I define the AlphaLevel enum in an extension of UIColor: the Swift compiler crashes if I do so (although it should be possible). Anyway...
The problem in your code is that the alpha you pass into the hazeColor method is an AlphaLevel enum value, not a CGFloat, so the type check fails. Just change UIColor(red: 230/255.0, green: 235/255.0, blue: 245/255.0, alpha: alpha) to UIColor(red: 230/255.0, green: 235/255.0, blue: 245/255.0, alpha: alpha.toRaw()) and you can work around it. (In later Swift versions, toRaw() was renamed rawValue.)
Confusion due to Swift lacking implicit conversion of CGFloat
This is a problem with Double to Float conversion. On a 64-bit machine, CGFloat is defined as Double and the code compiles without problems, because M_PI and x are both Doubles.
On a 32-bit machine, CGFloat is a Float but M_PI is still a Double. Unfortunately, there are no implicit casts in Swift, so you have to cast explicitly:
return (CGFloat(M_PI) * (x) / 180.0)
The type of the 180.0 literal is inferred.
In Swift 3, M_PI is deprecated; use CGFloat.pi instead:
return (x * .pi / 180.0)
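Wrapped up as a small helper (a sketch; the function name is mine):

```swift
import Foundation  // provides CGFloat on all platforms

// Degrees to radians with no implicit conversions: every term on the
// right-hand side is a CGFloat, so this compiles on 32- and 64-bit alike.
func degreesToRadians(_ x: CGFloat) -> CGFloat {
    return x * .pi / 180.0
}

print(degreesToRadians(180))  // ≈ 3.14159
```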