How to Pass Int.Init to a Function in Swift

Is it possible to pass Int.init to a function in Swift?

The problem is that Int has no initialiser that takes only a String parameter.

There only exists:

init?(_ text: String, radix: Int = default)

Although the radix: parameter has a default value, this is only "filled in" for you by the compiler at the call-site of the initialiser.

Default parameter values aren't partially applied when getting a reference to the function itself, nor does the compiler generate additional overloads for all possible combinations of default parameter values. Not only would the latter add significant bloat to your code, but both options would break default parameter values that depend on being inserted at the call-site (e.g. #file and #line).

To make this work properly, function values themselves would have to support default parameter values, which brings with it quite a bit of complexity, though it could well be something a future version of the language supports.
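In the meantime, a closure works as a stand-in for the initialiser reference: the call inside the closure body is an ordinary call-site, so the compiler inserts the default radix: there as usual.

```swift
let strings = ["10", "20", "oops"]

// Each call inside the closure is a normal call-site, so the default
// radix: 10 is filled in for us -- no extension required.
let numbers = strings.map { Int($0) }
// [Optional(10), Optional(20), nil]
```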

One simple solution to get the behaviour you want is to just write an extension on Int which adds an init(_:) initialiser that then forwards onto init(_:radix:):

extension Int {
    /// Creates a new integer value from the given base-10 integer string,
    /// returning nil if the string is in an invalid format.
    init?(_ string: String) {
        // delegate to init?(_:radix:), passing the radix explicitly --
        // writing self.init(string) here would call this initialiser
        // recursively rather than fill in the default radix for us.
        self.init(string, radix: 10)
    }
}

Note that in Swift 4, this could be an extension on FixedWidthInteger, which provides the implementation of init?(_:radix:) for the standard library's integer types.

Now you can say:

let x = ["10", "20", "30"]
let y = x.map(Int.init) // [Optional(10), Optional(20), Optional(30)]

as Int.init now refers to our extension initialiser.

And this also works just as well with flatMap(_:):

let y = x.flatMap(Int.init) // [10, 20, 30]

which will also filter out the nil elements from the transformation (the strings that weren't convertible to integers). In Swift 4.1 and later, this overload of flatMap(_:) is named compactMap(_:).

How to initialize and use function pointer in Swift

To assign the function to the property, you have to remove the parentheses; with them, you would be assigning the function's return value instead:

self.funcPointer = self.func1

The subsequent error

'self' used in method call 'func1' before all stored properties are initialized

can be fixed by declaring funcPointer as an implicitly unwrapped optional:

internal final var funcPointer: (() -> Void)!
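Put together, a minimal sketch (using a placeholder class name, since the original class isn't shown) looks like this:

```swift
class Worker {
    // Implicitly unwrapped, so it defaults to nil and self counts as
    // fully initialised by the time we reference self.func1 below.
    internal final var funcPointer: (() -> Void)!

    var callCount = 0

    init() {
        // No parentheses: we store the method itself, not its result.
        self.funcPointer = self.func1
    }

    func func1() {
        callCount += 1
    }
}

let worker = Worker()
worker.funcPointer()
// worker.callCount is now 1
```

Be aware that self.func1 captures self strongly, so the instance retains a function value that retains the instance; if that retain cycle matters, store a closure with [unowned self] instead.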

Protocol and init argument

Since the parameter has to be of a type conforming to EnemyMovement, you have to pass an object implementing those methods. So you can create an example struct:

struct Movements: EnemyMovement {

    func forward(speedPercent: Int) {
        print(speedPercent)
    }

    func reverse(speedPercent: Int) {
        print(speedPercent)
    }

    func left(speedPercent: Int) {
        print(speedPercent)
    }

    func right(speedPercent: Int) {
        print(speedPercent)
    }
}

Now pass a new instance of Movements as the enemyMovement parameter of the EnemyInstance initializer:

var enemy = EnemyInstance(name: "zombie", enemyMovement: Movements())

Then you can call any of these methods on the enemyMovement property of your class, and the code inside that method gets executed (in this case, it prints speedPercent):

required init(name: String, enemyMovement: EnemyMovement) {
    self.name = name
    self.enemyMovement = enemyMovement
    enemyMovement.forward(speedPercent: 2) // prints 2
}
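Assembled into one runnable sketch (the EnemyMovement protocol and the EnemyInstance class are reconstructed from context here, so the details may differ from the original code):

```swift
// Reconstructed from the methods the struct implements.
protocol EnemyMovement {
    func forward(speedPercent: Int)
    func reverse(speedPercent: Int)
    func left(speedPercent: Int)
    func right(speedPercent: Int)
}

struct Movements: EnemyMovement {
    func forward(speedPercent: Int) { print(speedPercent) }
    func reverse(speedPercent: Int) { print(speedPercent) }
    func left(speedPercent: Int) { print(speedPercent) }
    func right(speedPercent: Int) { print(speedPercent) }
}

class EnemyInstance {
    let name: String
    let enemyMovement: EnemyMovement

    required init(name: String, enemyMovement: EnemyMovement) {
        self.name = name
        self.enemyMovement = enemyMovement
        enemyMovement.forward(speedPercent: 2) // prints 2
    }
}

let enemy = EnemyInstance(name: "zombie", enemyMovement: Movements())
```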

Pass self as argument within init method in Swift 1.2

The Problem

The problem is your use of let: optionals declared with let aren't given a default value of nil (those declared with var are). The following, introduced in Swift 1.2, wouldn't otherwise be valid, since you wouldn't be able to give myOptional a value after declaring it:

let myOptional: Int?

if myCondition {
    myOptional = 1
} else {
    myOptional = nil
}

Therefore, you're getting the error 'Property 'self.panGestureRecognizer' not initialized at super.init call' because, before calling super.init(coder: aDecoder), panGestureRecognizer isn't nil; it hasn't been initialised at all.

The Solutions:

1. Declare panGestureRecognizer as a var, meaning it will be given a default value of nil, which you could then change after calling super.init(coder: aDecoder).

2. In my opinion, the better solution: don't use an implicitly unwrapped optional and declare panGestureRecognizer with an initial value of UIPanGestureRecognizer(). Then set the target after super.init is called:

class SubView: UIView {
    let panGestureRecognizer = UIPanGestureRecognizer()

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)

        panGestureRecognizer.addTarget(self, action: Selector("panAction:"))
    }
}

How to send argument when init UIView

To fulfil all the initializer rules and requirements, you have to write:

class MyView: UIView {

    var number: Int

    override convenience init(frame: CGRect) {
        self.init(frame: frame, number: 0)
    }

    init(frame: CGRect, number: Int) {
        self.number = number
        super.init(frame: frame)
    }

    required init?(coder aDecoder: NSCoder) {
        number = 0
        super.init(coder: aDecoder)
    }
}

It might be easier to initialize the view the usual / designated way and set the property in an extra line.

Using init() in map()

Simplified repro:

String.init as Character -> String
// error: type of expression is ambiguous without more context

This is because String has two initializers that accept one Character:

init(_ c: Character)
init(stringInterpolationSegment expr: Character)

As far as I know, there is no way to disambiguate them when using the initializer as a value.

As for (1...100).map(String.init), String.init is inferred as Int -> String, even though there are two initializers that accept one Int:

init(stringInterpolationSegment expr: Int)
init<T : _SignedIntegerType>(_ v: T)

A generic overload is a weaker match than one with an explicit type, so the compiler chooses the stringInterpolationSegment: one in this case. You can confirm that by Command-clicking .init.
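When the function reference itself is ambiguous, a closure sidesteps the problem: an ordinary call inside the closure body resolves the overload as usual, here to init(_: Character).

```swift
let chars: [Character] = ["a", "b", "c"]

// String.init is ambiguous as a function value here, but the ordinary
// call inside the closure resolves to init(_: Character) as expected.
let strings = chars.map { String($0) }
// ["a", "b", "c"]
```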


