How to Require That a Protocol Can Only Be Adopted by a Specific Class

How to require that a protocol can only be adopted by a specific class

protocol AddsMoreCommands: class {
    // Code
}

extension AddsMoreCommands where Self: UIViewController {
    // Code
}

Swift protocol to be implemented only by specific classes

You could add a required property that you only provide, via an extension, for the appropriate classes.

For example:

protocol PeopleProtocol {
    var conformsToPeopleProtocol: Bool { get }
}

extension PeopleProtocol where Self: People {
    var conformsToPeopleProtocol: Bool { return true }
}

class People {}

class Neighbours: People {}

extension Neighbours: PeopleProtocol {} // this works

class Doctors: People, PeopleProtocol {} // this also works

class Dogs: PeopleProtocol {} // this will not compile

This could easily be circumvented by a programmer who wanted to, but at least the compiler will warn you if you try to apply the protocol to other classes.
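
For instance (a minimal sketch), the check is bypassed simply by implementing the requirement directly, since any type can satisfy the protocol on its own:

class Dogs: PeopleProtocol { // now compiles, even though Dogs is not a People
    var conformsToPeopleProtocol: Bool { return true }
}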

How to make a protocol conformable only by the required classes?

I am not sure if the code below is what you need; do let me know if that's what you were looking for.

import UIKit

protocol A: BaseViewController {
    func execute()
}

protocol B: A {
    func confirm()
}

class BaseViewController: UIViewController {

}

class AnotherVC: B {

}

With the above code, the compiler will give an error saying:

'A' requires that 'AnotherVC' inherit from 'BaseViewController'

Once you make AnotherVC inherit from BaseViewController, it will give another error saying:

Type 'AnotherVC' does not conform to protocol 'A'

Once you add the required implementations, the errors will be resolved:

class AnotherVC: BaseViewController, B {
    func confirm() {

    }

    func execute() {

    }
}

Adoption of Protocol in Swift

It's because the protocol X states that someA is of type A, so if, in class Y, you made someA of type B, then you couldn't assign just any A to it, which the protocol says you need to be able to do.

If the protocol said that you needed a variable to hold any Car, and you had a Porsche, so you just wanted to tell your protocol-conforming class that the variable could only hold a Porsche, then someone who came along and tried to put a Mazda into your Porsche variable would encounter an issue, even though the protocol says they should be able to.
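
A minimal sketch of the situation being described (the question's original declarations aren't shown here, so these are assumed):

class A {}
class B: A {} // B is a more specific kind of A

protocol X {
    var someA: A { get set }
}

class Y: X {
    // Does not satisfy the requirement: the protocol promises that
    // any A can be assigned to someA, but this only accepts a B.
    var someA: B = B()
}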

Class-Only Protocols in Swift

Swift 4 allows you to combine types, so you can have your protocol and then create, for example, a type alias to combine it with a specific class requirement.

For (a contrived) example:

typealias PresentableVC = UIViewController & Presentable
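
A minimal sketch of how the combined type might then be used (Presentable and its requirement are assumptions):

import UIKit

protocol Presentable {
    func preparePresentation()
}

typealias PresentableVC = UIViewController & Presentable

// Anything stored here must be a UIViewController that also conforms to Presentable.
var nextScreen: PresentableVC?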

For the presented code:

The problem is that you're trying to limit conformance to specific classes, and Swift can't do that (at the moment, anyway). You can only limit a protocol to class types in general and inherit from other protocols. Your syntax is the one for protocol inheritance, but you're trying to use it as a class restriction.

Note that the purpose of class protocols is:

Use a class-only protocol when the behavior defined by that protocol’s requirements assumes or requires that a conforming type has reference semantics rather than value semantics.
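
As an illustration of that note (a minimal sketch with hypothetical names), a class-only constraint is what makes weak references to the protocol type possible:

protocol Presenter: AnyObject { // AnyObject is the current spelling of the class-only constraint
    func present()
}

class Coordinator {
    weak var presenter: Presenter? // weak requires reference semantics, hence a class-only protocol
}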

Why do classes need to be final when adopting a protocol with a property of type Self?

I suspect the problem you've encountered stems from the fact that read-write properties are invariant in Swift. If you consider a more basic example without Self:

class A {}
class B: A {}

class Foo {
    var a = A()
}

class Bar: Foo {
    // error: Cannot override mutable property 'a' of type 'A' with covariant type 'B'
    override var a: B {
        get {
            return B() // can return a B (as B inherits from A)
        }
        set {} // can only accept a B (but not an A, therefore illegal!)
    }
}

We cannot override the read-write property of type A with a property of type B. This is because although reading is covariant, writing is contravariant. In other words, we can always make the getter return a more specific type from a given property as we override it – as that specific type will always conform/inherit from the base type of the property. However we cannot make the setter more type restrictive, as we're now preventing given types from being set that our original property declaration allowed to be set.

Therefore we reach this state where we cannot make the property any more or any less type specific as we override it, thus making it invariant.

This comes into play with a Self-typed property requirement, as Self refers to the runtime type of whatever is implementing the protocol; statically, it is treated as the type of the class declaration itself. So when you come to subclass a given class with this protocol requirement, the static type of Self in that subclass is the subclass's type – thus the property would need to be covariant in order to meet this requirement, which it isn't.

You may then be wondering why the compiler doesn't allow you to satisfy this protocol requirement with a read-only computed property, which is covariant. I suspect this is due to the fact that a read-only computed property can be overridden by a settable computed property, thus creating invariance. Although I'm not entirely sure why the compiler can't pick that error up at the point of overriding, rather than preventing read-only properties entirely.
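
A minimal sketch of the loophole referred to above: a read-only computed property can be overridden as read-write, which would reintroduce the invariance problem:

class Base {
    var value: Int { return 0 } // read-only in the base class
}

class Sub: Base {
    private var storage = 0
    override var value: Int { // overridden as read-write
        get { return storage }
        set { storage = newValue }
    }
}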

In any case, you can achieve the same result by using a method instead:

protocol Bar {
    func foo() -> Self
}

class Foo: Bar {

    required init() {}

    func foo() -> Self {
        return type(of: self).init() // dynamicType was replaced by type(of:) in Swift 3
    }
}

If you need to implement a property from a protocol with a Self requirement, then you will indeed have to make the class final in order to prevent subclassing and therefore violations of the protocol requirement for read-write properties.
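
A minimal sketch of that case (the names are assumptions):

protocol Duplicatable {
    var duplicate: Self { get set }
}

// Must be final: in a subclass, Self would be the subclass type, and the
// read-write property cannot become more specific (it is invariant).
final class Widget: Duplicatable {
    lazy var duplicate: Widget = Widget() // lazy avoids recursing during initialisation
}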

The reason that this all works fine with method inputs is overloading. When you implement a method with a Self input type in a subclass, you're not overriding the one in the superclass; you're simply adding another method that can be called. When you come to call the method, Swift will favour the one with the more type-specific signature (which you can see in your example).

(Note that if you try to use the override keyword on your method, you'll get a compiler error.)

Also, as you've discovered, when Self refers to a method input you don't even have to implement a new method in your subclass to take account of the change in Self's type. This is due to the contravariance of method inputs, which I explain further in this answer. Basically, the superclass method already satisfies the subclass requirement, as you can freely replace the type of a given method input with its supertype.
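
A minimal sketch of both points (the protocol and class names are hypothetical):

protocol Copyable {
    func copy(from other: Self)
}

class Base: Copyable {
    func copy(from other: Base) {} // satisfies the requirement for Base and for its subclasses
}

class Derived: Base {
    func copy(from other: Derived) {} // an overload, not an override; favoured for Derived arguments
}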


