Could not find an overload for 'init' that accepts the supplied arguments (Swift)
UIFont(name:size:) is now a failable initializer -- it returns nil if it can't find that font, and force-unwrapping that nil return value will crash your app. Use this code to safely get the font and use it:
if let font = UIFont(name: "HelveticaNeue-Light", size: 20) {
    self.navigationController?.navigationBar.titleTextAttributes =
        [NSFontAttributeName: font,
         NSForegroundColorAttributeName: UIColor.whiteColor()]
}
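The same pattern applies to any failable initializer, not just UIFont. As a minimal runnable sketch (using Int's failable string initializer instead of UIKit, so it works outside iOS), both if-let and the nil-coalescing operator ?? avoid the crash:

```swift
// Int(_:) is a failable initializer, just like UIFont(name:size:)
let parsed = Int("42")       // Optional(42)
let missing = Int("oops")    // nil -- force-unwrapping this would crash

// Option 1: unwrap safely with if-let
if let n = parsed {
    print(n)                 // prints 42
}

// Option 2: supply a fallback with nil-coalescing
let fallback = missing ?? 0  // 0
print(fallback)
```

For UIFont specifically, the same ?? pattern lets you fall back to a system font instead of skipping the assignment entirely.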
Could not find an overload for 'init' that accepts the supplied arguments
Create a string representation instead:

resultLabel.text = "\(area)"
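The suggestion can be sketched as runnable Swift, assuming area is a numeric value such as a Double (the variable name and value here are illustrative):

```swift
// Assumption: area is a number; a UILabel's text property needs a String
let area = 12.5

// String interpolation: the backslash goes before the opening parenthesis
let viaInterpolation = "\(area)"

// Equivalent: build the string representation explicitly
let viaInitializer = String(area)

print(viaInterpolation)  // prints 12.5
```

Either form works; interpolation is more common when embedding the value in a longer string.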
Could not find an overload for '/' that accepts the supplied arguments
There are no such implicit conversions in Swift, so you'll have to explicitly convert that yourself:
average = Double(total) / Double(numbers.count)
From The Swift Programming Language: “Values are never implicitly converted to another type.” (Section: A Swift Tour)
But you're now using Swift, not Objective-C, so try to think in a more functional style. Your function can be written like this:
func getAverage(numbers: Int...) -> Double {
    let total = numbers.reduce(0, combine: {$0 + $1})
    return Double(total) / Double(numbers.count)
}
reduce takes an initial value for an accumulator as its first parameter, then applies the combine function to the accumulator and each element in the array. Here, we pass an anonymous closure that uses $0 and $1 to denote the first and second parameters it gets passed and adds them up. Even more concisely, you can write numbers.reduce(0, +). Note how type inference still does a nice job of working out that total is an Int.
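The function above can be run end to end. A runnable sketch in current Swift syntax (the combine: argument label has since been dropped, and the variadic parameter now takes an underscore to stay callable as shown):

```swift
// Variadic Int parameters, reduced with + starting from 0
func getAverage(_ numbers: Int...) -> Double {
    let total = numbers.reduce(0, +)  // type inference: total is an Int
    return Double(total) / Double(numbers.count)
}

print(getAverage(1, 2, 3, 4))  // prints 2.5
```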
Could not find an overload for '-' that accepts the supplied arguments
Use Int, not Integer. Integer is a protocol, not the type.
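A minimal sketch of the distinction: Int is the concrete type you declare variables with, while the integer protocol only serves as a generic constraint (it was named IntegerType in Swift 1-2; current Swift uses BinaryInteger, which this sketch assumes):

```swift
// Int is a concrete type: fine for variables and arithmetic like '-'
let a: Int = 10
let b: Int = 3
let diff = a - b  // 7

// BinaryInteger (the integer protocol in current Swift) constrains generics
func doubled<T: BinaryInteger>(_ x: T) -> T {
    return x + x
}

print(diff)          // prints 7
print(doubled(21))   // prints 42
```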
Could not find an overload for 'init' that accepts the supplied arguments
The error message is misleading. This should work:
var w = Int(self.bounds.size.width / CGFloat(worldSize.width))
var h = Int(self.bounds.size.height / CGFloat(worldSize.height))
The width and height elements of CGSize are declared as CGFloat. On the 64-bit platform, CGFloat is the same as Double and has 64 bits, whereas Float has only 32 bits. So the problem is the division operator, which requires two operands of the same type. In contrast to (Objective-)C, Swift never implicitly converts values to a different type.
If worldSize is also a CGSize then you do not need a cast at all:
var w = Int(self.bounds.size.width / worldSize.width)
var h = Int(self.bounds.size.height / worldSize.height)
Swift: Could not find an overload for '|' that accepts the supplied arguments
Use the .value property of each option, combine the raw values with |, then use the result to create a PFLogInFields instance:

logInViewController.fields = PFLogInFields(PFLogInFieldsUsernameAndPassword.value | PFLogInFieldsLogInButton.value)