Swift UIColor initializer - compiler error only when targeting iPhone5s
It probably has to do with a mismatch between Float and Double. Basically, if the arguments are expected to be CGFloats, you need to pass CGFloats.
let color = UIColor(red: CGFloat(red), green: CGFloat(green), blue: CGFloat(blue), alpha: CGFloat(alpha))
What's important to understand here is that CGFloat is defined as Float on 32-bit platforms and Double on 64-bit platforms. So, if you're explicitly using Floats, Swift won't have a problem with this on 32-bit platforms, but will produce an error on 64-bit.
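A minimal sketch of why the explicit conversion compiles everywhere (the variable names here are illustrative, not from the question):

```swift
import Foundation // CGFloat is available here on platforms without CoreGraphics

// Values computed as Float elsewhere in the program.
let redComponent: Float = 0.25
let greenComponent: Float = 0.5

// Wrapping each value in CGFloat(...) compiles on both 32-bit platforms
// (where CGFloat is Float) and 64-bit platforms (where it is Double),
// because CGFloat has initializers accepting either source type.
let r = CGFloat(redComponent)
let g = CGFloat(greenComponent)
```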
Extra argument in call when using var
The answer is the same as for "Swift UIColor initializer - compiler error only when targeting iPhone5s" above.
Use floating-point values instead of integers.
UIColor(red: 1.0, green: 0.0, blue: 0.0, alpha: alpha)
UIColor initializer (cannot find an initializer of type 'UIColor' that accepts ....) error
Use CGFloat, not Float. Check the API docs when you get an error like this.
Issue with String initializer in Swift when using format parameter
Foundation has to be imported before using String(format:).
As Matt pointed out in the comments below, String itself is built into Swift; it is only the String(format:) initializer that comes from Foundation, which is presumably why the docs didn't list it. Thank you for the correction.
arc4random and arm64 with swift
The parameters for UIColor(hue:saturation:brightness:alpha:) are all of type CGFloat. The problem you're seeing is that CGFloat is aliased to different types depending on the architecture. For 32-bit ARM (iPhone 4S and 5), it's a Float internally, but for arm64 it's actually a Double. If you just use CGFloat instead of Float for your type casts it will work fine:
let hue = CGFloat(arc4random() % 256) / 256.0
let saturation = CGFloat(arc4random() % 128) / 256.0 + 0.5
let brightness = CGFloat(arc4random() % 128) / 256.0 + 0.5
self.color = UIColor(hue: hue, saturation: saturation, brightness: brightness, alpha: 1)
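As an aside, on Swift 4.2 and later the same components can be generated without arc4random or casts at all, since CGFloat conforms to BinaryFloatingPoint and gains a typed random API (a sketch, not part of the original answer):

```swift
import Foundation // for CGFloat on platforms without UIKit/CoreGraphics

// Typed random values in the same ranges as the arc4random version above.
let hue = CGFloat.random(in: 0..<1)          // 0.0 ..< 1.0
let saturation = CGFloat.random(in: 0.5..<1) // 0.5 ..< 1.0
let brightness = CGFloat.random(in: 0.5..<1) // 0.5 ..< 1.0
```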
Swift - Could not find an overload... & Cannot convert the expression's type
For both errors, try wrapping the values in new CGFloats:
CGFloat(0.5)
CGFloat(scaledWidth)
How to use hex color values
#ffffff is actually 3 color components in hexadecimal notation: red ff, green ff and blue ff. You can write hexadecimal literals in Swift using the 0x prefix, e.g. 0xFF.
To simplify the conversion, let's create an initializer that takes integer (0 - 255) values:
extension UIColor {
    convenience init(red: Int, green: Int, blue: Int) {
        assert(red >= 0 && red <= 255, "Invalid red component")
        assert(green >= 0 && green <= 255, "Invalid green component")
        assert(blue >= 0 && blue <= 255, "Invalid blue component")
        self.init(red: CGFloat(red) / 255.0, green: CGFloat(green) / 255.0, blue: CGFloat(blue) / 255.0, alpha: 1.0)
    }

    convenience init(rgb: Int) {
        self.init(
            red: (rgb >> 16) & 0xFF,
            green: (rgb >> 8) & 0xFF,
            blue: rgb & 0xFF
        )
    }
}
Usage:
let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF)
let color2 = UIColor(rgb: 0xFFFFFF)
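To see what the bit shifts in init(rgb:) are doing, take an arbitrary color such as 0x3A6EA5 (a value chosen here purely for illustration): each shift-and-mask peels off one byte per channel.

```swift
let rgb = 0x3A6EA5

// Shift the wanted byte down to the low 8 bits, then mask off the rest.
let red   = (rgb >> 16) & 0xFF // 0x3A = 58
let green = (rgb >> 8) & 0xFF  // 0x6E = 110
let blue  = rgb & 0xFF         // 0xA5 = 165
```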
How to get alpha?
Depending on your use case, you can simply use the native UIColor.withAlphaComponent method, e.g.
let semitransparentBlack = UIColor(rgb: 0x000000).withAlphaComponent(0.5)
Or you can add an additional (optional) parameter to the above methods:
convenience init(red: Int, green: Int, blue: Int, a: CGFloat = 1.0) {
    self.init(
        red: CGFloat(red) / 255.0,
        green: CGFloat(green) / 255.0,
        blue: CGFloat(blue) / 255.0,
        alpha: a
    )
}

convenience init(rgb: Int, a: CGFloat = 1.0) {
    self.init(
        red: (rgb >> 16) & 0xFF,
        green: (rgb >> 8) & 0xFF,
        blue: rgb & 0xFF,
        a: a
    )
}
(We cannot name the parameter alpha because of a name collision with the existing initializer.)
Called as:
let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF, a: 0.5)
let color2 = UIColor(rgb: 0xFFFFFF, a: 0.5)
To specify the alpha as an integer 0-255, we can add:
convenience init(red: Int, green: Int, blue: Int, a: Int = 0xFF) {
    self.init(
        red: CGFloat(red) / 255.0,
        green: CGFloat(green) / 255.0,
        blue: CGFloat(blue) / 255.0,
        alpha: CGFloat(a) / 255.0
    )
}

// Let's suppose alpha is the first component (ARGB).
convenience init(argb: Int) {
    self.init(
        red: (argb >> 16) & 0xFF,
        green: (argb >> 8) & 0xFF,
        blue: argb & 0xFF,
        a: (argb >> 24) & 0xFF
    )
}
Called as:
let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF, a: 0xFF)
let color2 = UIColor(argb: 0xFFFFFFFF)
Or a combination of the previous methods. There is absolutely no need to use strings.