How to Add Custom Init for String Extension

How to add custom init for String extension?

Edit: Don't format currencies yourself.

However you might think currencies are formatted, you're almost certainly wrong. Just compare:

  • US/Canada: $3,490,000.89
  • French Canadian: 3 490 000,89 $
  • France: 3 490 000,89 €
  • Germany: 3.490.000,89 €

Instead, use NumberFormatter with numberStyle set to .currency, with a specified locale.

import Foundation

let currencyFormatter = NumberFormatter()
currencyFormatter.usesGroupingSeparator = true
currencyFormatter.numberStyle = .currency
currencyFormatter.locale = Locale.current
let priceString = currencyFormatter.string(from: 9999.99)!
print(priceString) // Displays $9,999.99 in the US locale
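Changing only the formatter's locale reproduces the variants listed above. A quick sketch (the exact output can differ slightly by OS version, e.g. some locales use narrow no-break spaces as grouping separators):

```swift
import Foundation

let formatter = NumberFormatter()
formatter.numberStyle = .currency

// French Canadian: space grouping, comma decimal separator, trailing "$"
formatter.locale = Locale(identifier: "fr_CA")
print(formatter.string(from: 3_490_000.89)!) // e.g. "3 490 000,89 $"

// Germany: period grouping, comma decimal separator, trailing "€"
formatter.locale = Locale(identifier: "de_DE")
print(formatter.string(from: 3_490_000.89)!) // e.g. "3.490.000,89 €"
```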

Original answer:

The initializers (and mutating methods) of value types can simply assign directly to self:

import Foundation

extension String {
    init(_ amount: Double, decimalPlaces: UInt) {
        // "%.2f"-style format: the dot goes before the precision
        let currencyAmount = String(format: "%.\(decimalPlaces)f", amount)
        let currencySign = NSLocalizedString("Defaults.CurrencySign", comment: "currency sign")
        self = "\(currencySign)\(currencyAmount)"
    }
}

let price = 155.15
let formattedPrice = String(price, decimalPlaces: 2) // "$155.15" when the localized currency sign is "$"

How to add initializers in extensions to existing UIKit classes such as UIColor?

You can't do it like this; you have to choose different parameter names to create your own initializers. You can also make them generic to accept any BinaryInteger or BinaryFloatingPoint type:

import UIKit

extension UIColor {
    convenience init<T: BinaryInteger>(r: T, g: T, b: T, a: T = 255) {
        self.init(red: .init(r)/255, green: .init(g)/255, blue: .init(b)/255, alpha: .init(a)/255)
    }
    convenience init<T: BinaryFloatingPoint>(r: T, g: T, b: T, a: T = 1.0) {
        self.init(red: .init(r), green: .init(g), blue: .init(b), alpha: .init(a))
    }
}


let green1 = UIColor(r: 0, g: 255, b: 0, a: 255) // r 0.0 g 1.0 b 0.0 a 1.0
let green2 = UIColor(r: 0, g: 1.0, b: 0, a: 1.0) // r 0.0 g 1.0 b 0.0 a 1.0

let red1 = UIColor(r: 255, g: 0, b: 0) // r 1.0 g 0.0 b 0.0 a 1.0
let red2 = UIColor(r: 1.0, g: 0, b: 0) // r 1.0 g 0.0 b 0.0 a 1.0

Custom Section initializer in SwiftUI

Here is a solution (tested with Xcode 12.1 / iOS 14.1):

import SwiftUI

extension Section where Parent == Text, Content: View, Footer == EmptyView {
    init(_ title: String, content: @escaping () -> Content) {
        self.init(header: Text(title), content: content)
    }
}
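With that extension in scope, a Section header can then be written as a plain string. A minimal usage sketch (the view and its contents are illustrative):

```swift
import SwiftUI

struct SettingsView: View {
    var body: some View {
        List {
            // Resolves to the custom init(_:content:) from the extension above
            Section("Profile") {
                Text("Name")
                Text("Email")
            }
        }
    }
}
```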


Adding failing optional initializer in Swift in extension

Swift initialisation is one of the most confusing parts of the language, especially if you're coming from Objective-C. The short answer is that:

Unlike Objective-C initializers, Swift initializers do not return a value.

(from https://developer.apple.com/library/content/documentation/Swift/Conceptual/Swift_Programming_Language/Initialization.html).

Therefore when you write:

public convenience init?(
    name fontName: String, size fontSize: CGFloat, weight fontWeight: String
) {
    let font = self.init(name: fontName + "-" + fontWeight, size: fontSize)

the type of font is Void. Yeah, I know, it does raise an eyebrow, since it's a different model than in Objective-C. In Swift you delegate the work of initialisation to another initialiser, but you don't manage the instance directly. So you could call it a missing feature.

The easy workaround is to make a factory method instead of an initialiser:

public static func from(
    name fontName: String, size fontSize: CGFloat, weight fontWeight: String
) -> UIFont? {
    // Note: UIFont(name:size:) rather than self.init, since init(name:size:)
    // is not a required initializer and can't be called on a metatype
    if let font = UIFont(name: fontName + "-" + fontWeight, size: fontSize) {
        return font
    }
    else if let font = UIFont(name: fontName, size: fontSize) {
        return font
    }
    else {
        //
        // Add some fallback code here
        //
        return nil
    }
}
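The two branches can also be collapsed with nil-coalescing. A sketch of a compact variant in a UIFont extension (the font names at the call site are illustrative):

```swift
import UIKit

extension UIFont {
    static func from(name: String, size: CGFloat, weight: String) -> UIFont? {
        // Try "Name-Weight" first, then fall back to the plain family name
        return UIFont(name: name + "-" + weight, size: size)
            ?? UIFont(name: name, size: size)
    }
}

let font = UIFont.from(name: "HelveticaNeue", size: 16, weight: "Bold")
    ?? UIFont.systemFont(ofSize: 16) // final fallback if neither name resolves
```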

How do I write a custom init for a UIView subclass in Swift?

The init(frame:) version is UIView's designated initializer. You must call it only after initializing your instance variables. If this view is being reconstituted from a nib, your custom initializer will not be called; the init?(coder:) version will be called instead. Since Swift now requires an implementation of the required init?(coder:), I have updated the example below and changed the let variable declarations to var and optional. In this case, you would initialize them in awakeFromNib() or at some later time.

class TestView: UIView {
    var s: String?
    var i: Int?

    init(s: String, i: Int) {
        self.s = s
        self.i = i
        super.init(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }
}

String initializer for optional primitive types: Int? Double? Float? etc

Do you mean something like this:

extension String {
    init?<T: CustomStringConvertible>(_ value: T?) {
        guard let value = value else { return nil }
        self.init(describing: value)
    }
}

or

extension String {
    init?<T: LosslessStringConvertible>(_ value: T?) {
        guard let value = value else { return nil }
        self.init(value)
    }
}
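Usage with the LosslessStringConvertible version looks like this (the extension is repeated so the snippet stands alone):

```swift
extension String {
    init?<T: LosslessStringConvertible>(_ value: T?) {
        guard let value = value else { return nil }
        self.init(value)
    }
}

let someInt: Int? = 42
let noDouble: Double? = nil
print(String(someInt) ?? "nil")  // "42"
print(String(noDouble) ?? "nil") // "nil"
```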

Array initializer for custom types

Swift 3 example with a custom type:

Let's assume that you have MyCustomType as a custom class:

class MyCustomType {
    var name = ""
    var code = ""

    convenience init(name: String, code: String) {
        self.init()
        self.name = name
        self.code = code
    }
}

You can extend any collection of that element type this way:

extension Collection where Iterator.Element == MyCustomType {
    func getNameFrom(code: String) -> String? {
        for custom in self {
            if custom.code == code {
                return custom.name
            }
        }
        return nil
    }
}

usage:

let myCustomArray = [MyCustomType]()

myCustomArray.getNameFrom(code: "code") // nil here, since the array is empty
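For what it's worth, the loop can also be collapsed to the standard library's first(where:). A self-contained sketch of that alternative (the class is repeated, and the sample data is illustrative):

```swift
class MyCustomType {
    var name = ""
    var code = ""

    convenience init(name: String, code: String) {
        self.init()
        self.name = name
        self.code = code
    }
}

extension Collection where Iterator.Element == MyCustomType {
    func getNameFrom(code: String) -> String? {
        // Same behavior as the for-loop version: the first match wins
        return first(where: { $0.code == code })?.name
    }
}

let items = [MyCustomType(name: "US Dollar", code: "USD")]
print(items.getNameFrom(code: "USD") ?? "not found") // "US Dollar"
```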

Hope it helps! :)


