How to Use Hex Color Values

How to use hex color values

A hex color like #ffffff is actually three color components in hexadecimal notation: red ff, green ff, and blue ff. You can write hexadecimal literals in Swift using the 0x prefix, e.g. 0xFF.

To simplify the conversion, let's create an initializer that takes integer (0-255) component values:

import UIKit

extension UIColor {
    convenience init(red: Int, green: Int, blue: Int) {
        assert(red >= 0 && red <= 255, "Invalid red component")
        assert(green >= 0 && green <= 255, "Invalid green component")
        assert(blue >= 0 && blue <= 255, "Invalid blue component")

        self.init(red: CGFloat(red) / 255.0, green: CGFloat(green) / 255.0, blue: CGFloat(blue) / 255.0, alpha: 1.0)
    }

    convenience init(rgb: Int) {
        self.init(
            red: (rgb >> 16) & 0xFF,
            green: (rgb >> 8) & 0xFF,
            blue: rgb & 0xFF
        )
    }
}

Usage:

let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF)
let color2 = UIColor(rgb: 0xFFFFFF)

How to get alpha?

Depending on your use case, you can simply use the native UIColor.withAlphaComponent method, e.g.

let semitransparentBlack = UIColor(rgb: 0x000000).withAlphaComponent(0.5)

Or you can add an additional (optional) parameter to the above methods:

convenience init(red: Int, green: Int, blue: Int, a: CGFloat = 1.0) {
    self.init(
        red: CGFloat(red) / 255.0,
        green: CGFloat(green) / 255.0,
        blue: CGFloat(blue) / 255.0,
        alpha: a
    )
}

convenience init(rgb: Int, a: CGFloat = 1.0) {
    self.init(
        red: (rgb >> 16) & 0xFF,
        green: (rgb >> 8) & 0xFF,
        blue: rgb & 0xFF,
        a: a
    )
}

(We cannot name the parameter alpha because of a name collision with the existing initializer.)

Called as:

let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF, a: 0.5)
let color2 = UIColor(rgb: 0xFFFFFF, a: 0.5)

To accept the alpha as an integer in the range 0-255, we can write:

convenience init(red: Int, green: Int, blue: Int, a: Int = 0xFF) {
    self.init(
        red: CGFloat(red) / 255.0,
        green: CGFloat(green) / 255.0,
        blue: CGFloat(blue) / 255.0,
        alpha: CGFloat(a) / 255.0
    )
}

// Let's suppose alpha is the first component (ARGB).
convenience init(argb: Int) {
    self.init(
        red: (argb >> 16) & 0xFF,
        green: (argb >> 8) & 0xFF,
        blue: argb & 0xFF,
        a: (argb >> 24) & 0xFF
    )
}

Called as:

let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF, a: 0xFF)
let color2 = UIColor(argb: 0xFFFFFFFF)

Or a combination of the previous methods. There is absolutely no need to use strings.
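The same bit arithmetic also works in reverse: shifting each component into place and OR-ing them together reassembles the single 0xRRGGBB value. A minimal sketch (the packRGB helper is a hypothetical name, not part of the extension above):

// Hypothetical helper: packs 0-255 components into one 0xRRGGBB integer,
// mirroring the shifts that init(rgb:) uses to unpack it.
func packRGB(red: Int, green: Int, blue: Int) -> Int {
    (red << 16) | (green << 8) | blue
}

let white = packRGB(red: 0xFF, green: 0xFF, blue: 0xFF) // 0xFFFFFF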

How does hexadecimal color work?

Hexadecimal uses sixteen distinct symbols: in the case of CSS colors, the symbols 0-9 represent the values zero to nine, and A, B, C, D, E, F represent the values ten to fifteen. So one hexadecimal digit can represent 16 values, and two hexadecimal digits can represent 256 (16 × 16) values.

In RGB, a color is represented by red, green, and blue components (R = 0-255, G = 0-255, B = 0-255), so we use three pairs of hexadecimal digits. When you see an RGB color, you can do the calculation below.

Example:

Hex #4C8ED5 is RGB 76, 142, 213,

because 4C = 76 (red), 8E = 142 (green), and D5 = 213 (blue)!
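If you want to verify the arithmetic yourself, here is a quick Swift sanity check (purely illustrative, using the standard Int(_:radix:) initializer):

// Parse each hex pair of #4C8ED5 as a base-16 integer.
let red = Int("4C", radix: 16)!   // 76
let green = Int("8E", radix: 16)! // 142
let blue = Int("D5", radix: 16)!  // 213
print(red, green, blue)           // 76 142 213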

Hope it helps your understanding!

More to read: Hexadecimal on Wikipedia and a nice RGB to Hexadecimal converter

How do I use hexadecimal color strings in Flutter?

In Flutter, the Color class expects an integer as its parameter, or you can use the named constructors fromARGB and fromRGBO.

So we only need to convert the string #b74093 to an integer value. We also need to remember that the opacity always has to be specified.

255 (full) opacity is represented by the hexadecimal value FF. This already gives us 0xFF. Now we just need to append our color string, like this:

const color = const Color(0xffb74093); // Second `const` is optional in assignments.

The letters can be either uppercase or lowercase:

const color = const Color(0xFFB74093);

If you want to use percentage opacity values, you can replace the first FF with the corresponding hexadecimal value; for example, 50% opacity is 80, because round(0.5 × 255) = 128 = 0x80. The same conversion works for the other color channels.

Extension class

Starting with Dart 2.6.0, you can create an extension on the Color class that lets you use hexadecimal color strings to create a Color object:

import 'package:flutter/material.dart';

extension HexColor on Color {
  /// String is in the format "aabbcc" or "ffaabbcc" with an optional leading "#".
  static Color fromHex(String hexString) {
    final buffer = StringBuffer();
    if (hexString.length == 6 || hexString.length == 7) buffer.write('ff');
    buffer.write(hexString.replaceFirst('#', ''));
    return Color(int.parse(buffer.toString(), radix: 16));
  }

  /// Prefixes a hash sign if [leadingHashSign] is set to `true` (default is `true`).
  String toHex({bool leadingHashSign = true}) => '${leadingHashSign ? '#' : ''}'
      '${alpha.toRadixString(16).padLeft(2, '0')}'
      '${red.toRadixString(16).padLeft(2, '0')}'
      '${green.toRadixString(16).padLeft(2, '0')}'
      '${blue.toRadixString(16).padLeft(2, '0')}';
}

The fromHex method could also be declared in a mixin or class, since the HexColor name has to be written out explicitly in order to use it; the extension is mainly useful for the toHex method, which can be used implicitly. Here is an example:

void main() {
  final Color color = HexColor.fromHex('#aabbcc');

  print(color.toHex());
  print(const Color(0xffaabbcc).toHex());
}

Disadvantage of using hex strings

Many of the other answers here show how you can dynamically create a Color from a hex string, like I did above. However, doing this means that the color cannot be a const.

Ideally, you would define your colors the way I explained in the first part of this answer; that is more efficient when colors are instantiated frequently, which is usually the case for Flutter widgets.

Use Hex color in SwiftUI

You're almost there; you were just using the wrong initializer parameter:

import Foundation
import SwiftUI

extension Color {
    init(hex: String) {
        let hex = hex.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int: UInt64 = 0
        Scanner(string: hex).scanHexInt64(&int)
        let a, r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (1, 1, 1, 0)
        }

        self.init(
            .sRGB,
            red: Double(r) / 255,
            green: Double(g) / 255,
            blue: Double(b) / 255,
            opacity: Double(a) / 255
        )
    }
}
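With that extension in place, usage might look like this (ContentView is just a placeholder view for illustration):

// Hypothetical view demonstrating the Color(hex:) initializer above.
struct ContentView: View {
    var body: some View {
        Text("Hex colors")
            .foregroundColor(Color(hex: "#b74093"))  // 24-bit RGB with a leading #
            .background(Color(hex: "FF4C8ED5"))      // 32-bit ARGB without one
    }
}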

How do I get the color from a hexadecimal color code using .NET?

I'm assuming that's an ARGB code... Are you referring to System.Drawing.Color or System.Windows.Media.Color? The latter is used in WPF, for example. I haven't seen anyone mention it yet, so in case you were looking for it:

using System.Windows.Media;

Color color = (Color)ColorConverter.ConvertFromString("#FFDFD991");

How to identify whether a given string is in hex color format

Note: This is strictly for validation, i.e. accepting a valid hex color. For actual parsing, this won't give you the individual components.

^#(?:[0-9a-fA-F]{3}){1,2}$

For ARGB, spell the length alternatives out explicitly; a quantified group like ^#(?:[0-9a-fA-F]{3,4}){1,2}$ would also accept invalid seven-digit strings:

^#(?:[0-9a-fA-F]{3,4}|[0-9a-fA-F]{6}|[0-9a-fA-F]{8})$

Dissection:

^              anchor for start of string
#              the literal #
(              start of group
  ?:           indicate a non-capturing group that doesn't generate backreferences
  [0-9a-fA-F]  hexadecimal digit
  {3}          three times
)              end of group
{1,2}          repeat either once or twice
$              anchor for end of string

This will match an arbitrary hexadecimal color value that can be used in CSS, such as #91bf4a or #f13.
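Since most of this page uses Swift, here is one way to apply the pattern there (a sketch; range(of:options:) with the .regularExpression option is standard Foundation API):

import Foundation

// Returns true for a valid #RGB or #RRGGBB CSS hex color string.
func isHexColor(_ string: String) -> Bool {
    let pattern = "^#(?:[0-9a-fA-F]{3}){1,2}$"
    return string.range(of: pattern, options: .regularExpression) != nil
}

print(isHexColor("#91bf4a")) // true
print(isHexColor("#f13"))    // true
print(isHexColor("f13"))     // false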


