Why May We Use "Internal Argument Labels" in Type Annotations of Closures, When They (Seemingly) Can Never Be Accessed

What you're observing is the ability to define "purely cosmetic" parameter labels for closure types. This was accepted as a part of SE-0111, as stated in the rationale:

In response to community feedback, the core team is accepting the proposal with a revision to allow “purely cosmetic” parameter labels in closure types for documentation (as outlined in the alternatives section).

The syntax for these cosmetic parameter labels changed after the proposal to require an argument label of _, in order to make it explicit that the cosmetic labels aren't used at the call-site. This was detailed in an additional commentary:

The specific revision requested by the core team to SE-0111 is that
all “cosmetic” labels should be required to include an API name of _.
For example, this would not be allowed:

var op : (lhs : Int, rhs : Int) -> Int

instead, it should be spelled as:

var op : (_ lhs : Int, _ rhs : Int) -> Int

Although really, in practice, this makes the cosmetic labels fairly useless, as they don't show up at the call-site or even in auto-completion – only at the actual declaration itself. Therefore their intent to be self-documenting is somewhat lost.

The Swift team are aware of this shortcoming, and will be looking to make a purely additive change post-Swift 3 in order to rectify the situation.

Here's the sketch that they proposed (again, in the additional commentary):

First, we extend declaration names for variables, properties, and parameters to allow parameter names as part of their declaration name. For example:

var op(lhs:,rhs:) : (Int, Int) -> Int    // variable or property.
x = op(lhs: 1, rhs: 2) // use of the variable or property.

// API name of parameter is "opToUse", internal name is "op(lhs:,rhs:)".
func foo(opToUse op(lhs:,rhs:) : (Int, Int) -> Int) {
    x = op(lhs: 1, rhs: 2) // use of the parameter
}
foo(opToUse: +) // call of the function

This will restore the ability to express the idea of a closure
parameter that carries labels as part of its declaration, without
requiring parameter labels to be part of the type system (allowing,
e.g. the operator + to be passed into something that requires
parameter labels).

Second, extend the rules for function types to allow parameter API
labels if and only if they are used as the type of a declaration
that allows parameter labels, and interpret them as a sugar form for
providing those labels on the underlying declaration. This means that
the example above could be spelled as:

var op : (lhs: Int, rhs: Int) -> Int    // Nice declaration syntax
x = op(lhs: 1, rhs: 2) // Same as above

// API name of parameter is "opToUse", internal name is "op(lhs:,rhs:)".
func foo(opToUse op : (lhs: Int, rhs: Int) -> Int) {
    x = op(lhs: 1, rhs: 2) // Same as above.
}
foo(opToUse: +) // Same as above.

This proposed solution quite nicely allows the labels to be used at the call-site, allowing for self-documenting parameter labels, while not complicating the type-system with them. Additionally (in most cases) it allows for the expressive syntax of writing the labels next to the parameter types of the closure – which we were used to doing pre-Swift 3.

Type '()' cannot conform to View (except it definitely is a View, no shenanigans this time)

Shenaniganically, you are trying to use ViewBuilder syntax in the trailing closure, but you didn't adorn content with the @ViewBuilder annotation. So Swift infers that the trailing closure returns () (also called Void).

Change the init declaration to mention @ViewBuilder:

struct CompatibilityPicker<blah blah blah>: View where blah blah blah {

    init(
        _ label : Label,
        selection : SelectionValue,
        @ViewBuilder content : @escaping () -> Content
        // ^^^^^^^^^^^^
    ) {
        blah blah blah

UPDATE

    @Binding private var _selection : SelectionValue

blah blah blah

init(_ label : Label, selection : SelectionValue, content : @escaping () -> Content) {
    self.label = label
    self._selection = selection
    self.content = content
}

The _selection variable is wrapped by the Binding wrapper, which means that it is really a computed property. The stored property is named __selection (note the two underscores) and has type Binding<SelectionValue>. Because _selection is a computed property, init cannot mention it until all stored properties are initialized. Probably you should change init to take a Binding<SelectionValue> argument instead of a SelectionValue argument:

init(
    _ label : Label,
    selection : Binding<SelectionValue>,
    @ViewBuilder content : @escaping () -> Content
    // ^^^^^^^^^^^^
) {
    self.label = label
    self.content = content
    __selection = selection
}

UPDATE 2

I looked at your other question and your code here and I think I know what you mean by “it doesn't work with anything but Int”.

The problem is that, when you say Text("Easy").tag(0), Swift treats 0 as an Int. If your Picker's SelectionValue is, say, Int16, then indeed the Picker will not be able to use the 0 tag because the types don't match.

You can make your tag work with Picker by giving 0 the correct type. For example: Text("Easy").tag(0 as Int16).

However, my recommendation is that you stop mucking about with CompatibilityPicker. It is a symptom of primitive obsession. The idiomatic solution is to use an enum for your tags:

enum Difficulty: Hashable {
    case easy
    case medium
    case hard
}

struct Demo1: View {
    @State var difficulty: Difficulty = .easy

    var body: some View {
        Picker("Difficulty", selection: $difficulty) {
            Text("Easy").tag(Difficulty.easy)
            Text("Medium").tag(Difficulty.medium)
            Text("Hard").tag(Difficulty.hard)
        }
    }
}

You could go even further and do this:

extension Difficulty: CaseIterable { }

extension Difficulty {
    var stringKey: LocalizedStringKey {
        switch self {
        case .easy: return "Easy"
        case .medium: return "Medium"
        case .hard: return "Hard"
        }
    }
}

struct Demo2: View {
    @State var difficulty: Difficulty = .easy

    var body: some View {
        Picker("Difficulty", selection: $difficulty) {
            ForEach(Difficulty.allCases, id: \.self) {
                Text($0.stringKey).tag($0)
            }
        }
    }
}

How can this instance seemingly outlive its own parameter lifetime?

Despite your best intentions, your hint function may not have the effect you expect. But we have quite a bit of ground to cover before we can understand what's going on.


Let's begin with this:

fn ensure_equal<'z>(a: &'z (), b: &'z ()) {}

fn main() {
    let a = ();
    let b = ();
    ensure_equal(&a, &b);
}

OK, so in main, we define two variables, a and b. They have distinct lifetimes, by virtue of being introduced by distinct let statements. ensure_equal requires two references with the same lifetime. And yet, this code compiles. Why?

That's because, given 'a: 'b (read: 'a outlives 'b), &'a T is a subtype of &'b T.

Let's say the lifetime of a is 'a and the lifetime of b is 'b. It's a fact that 'a: 'b, because a is introduced first. On the call to ensure_equal, the arguments are typed &'a () and &'b (), respectively¹. There's a type mismatch here, because 'a and 'b are not the same lifetime. But the compiler doesn't give up yet! It knows that &'a () is a subtype of &'b (). In other words, a &'a () is a &'b (). The compiler will therefore coerce the expression &a to type &'b (), so that both arguments are typed &'b (). This resolves the type mismatch.

If you're confused by the application of "subtypes" with lifetimes, then let me rephrase this example in Java terms. Let's replace &'a () with Programmer and &'b () with Person. Now let's say that Programmer is derived from Person: Programmer is therefore a subtype of Person. That means that we can take a variable of type Programmer and pass it as an argument to a function that expects a parameter of type Person. That's why the following code will successfully compile: the compiler will resolve T as Person for the call in main.

class Person {}
class Programmer extends Person {}

class Main {
    private static <T> void ensureSameType(T a, T b) {}

    public static void main(String[] args) {
        Programmer a = null;
        Person b = null;
        ensureSameType(a, b);
    }
}

Perhaps the non-intuitive aspect of this subtyping relation is that the longer lifetime is a subtype of the shorter lifetime. But think of it this way: in Java, it's safe to pretend that a Programmer is a Person, but you can't assume that a Person is a Programmer. Likewise, it's safe to pretend that a variable has a shorter lifetime, but you can't assume that a variable with some known lifetime actually has a longer lifetime. After all, the whole point of lifetimes in Rust is to ensure that you don't access objects beyond their actual lifetime.


Now, let's talk about variance. What's that?

Variance is a property that type constructors have with respect to their arguments. A type constructor in Rust is a generic type with unbound arguments. For instance Vec is a type constructor that takes a T and returns a Vec<T>. & and &mut are type constructors that take two inputs: a lifetime, and a type to point to.

Normally, you would expect all elements of a Vec<T> to have the same type (and we're not talking about trait objects here). But variance lets us cheat with that.

&'a T is covariant over 'a and T. That means that wherever we see &'a T in a type argument, we can substitute it with a subtype of &'a T. Let's see how it works out:

fn main() {
    let a = ();
    let b = ();
    let v = vec![&a, &b];
}

We've already established that a and b have different lifetimes, and that the expressions &a and &b don't have the same type¹. So why can we make a Vec out of these? The reasoning is the same as above, so I'll summarize: &a is coerced to &'b (), so that the type of v is Vec<&'b ()>.


fn(T) is a special case in Rust when it comes to variance. fn(T) is contravariant over T. Let's build a Vec of functions!

fn foo(_: &'static ()) {}
fn bar<'a>(_: &'a ()) {}

fn quux<'a>() {
    let v = vec![
        foo as fn(&'static ()),
        bar as fn(&'a ()),
    ];
}

fn main() {
    quux();
}

This compiles. But what's the type of v in quux? Is it Vec<fn(&'static ())> or Vec<fn(&'a ())>?

I'll give you a hint:

fn foo(_: &'static ()) {}
fn bar<'a>(_: &'a ()) {}

fn quux<'a>(a: &'a ()) {
    let v = vec![
        foo as fn(&'static ()),
        bar as fn(&'a ()),
    ];
    v[0](a);
}

fn main() {
    quux(&());
}

This doesn't compile. Here are the compiler messages:

error[E0495]: cannot infer an appropriate lifetime due to conflicting requirements
--> <anon>:5:13
|
5 | let v = vec![
| _____________^ starting here...
6 | | foo as fn(&'static ()),
7 | | bar as fn(&'a ()),
8 | | ];
| |_____^ ...ending here
|
note: first, the lifetime cannot outlive the lifetime 'a as defined on the body at 4:23...
--> <anon>:4:24
|
4 | fn quux<'a>(a: &'a ()) {
| ________________________^ starting here...
5 | | let v = vec![
6 | | foo as fn(&'static ()),
7 | | bar as fn(&'a ()),
8 | | ];
9 | | v[0](a);
10| | }
| |_^ ...ending here
note: ...so that reference does not outlive borrowed content
--> <anon>:9:10
|
9 | v[0](a);
| ^
= note: but, the lifetime must be valid for the static lifetime...
note: ...so that types are compatible (expected fn(&()), found fn(&'static ()))
--> <anon>:5:13
|
5 | let v = vec![
| _____________^ starting here...
6 | | foo as fn(&'static ()),
7 | | bar as fn(&'a ()),
8 | | ];
| |_____^ ...ending here
= note: this error originates in a macro outside of the current crate

error: aborting due to previous error

We're trying to call one of the functions in the vector with a &'a () argument. But v[0] expects a &'static (), and there's no guarantee that 'a is 'static, so this is invalid. We can therefore conclude that the type of v is Vec<fn(&'static ())>. As you can see, contravariance is the opposite of covariance: we can replace a short lifetime with a longer one.


Whew, now back to your question. First, let's see what the compiler makes out of the call to hint. hint has the following signature:

fn hint<'a, Arg>(_: &'a Arg) -> Foo<'a>

Foo is contravariant over 'a, because Foo wraps a fn (or rather, pretends to via PhantomData, which makes no difference as far as variance is concerned): fn(T) is contravariant over T, and T here is &'a ().

When the compiler tries to resolve the call to hint, it only considers shortlived's lifetime. Therefore, hint returns a Foo with shortlived's lifetime. But when we try to assign that to the variable foo, we have a problem: a lifetime parameter on a type always outlives the type itself, and shortlived's lifetime doesn't outlive foo's lifetime, so clearly, we can't use that type for foo. If Foo was covariant over 'a, that would be the end of it and you'd get an error. But Foo is contravariant over 'a, so we can replace shortlived's lifetime with a larger lifetime. That lifetime can be any lifetime that outlives foo's lifetime. Note that "outlives" is not the same as "strictly outlives": the difference is that 'a: 'a ('a outlives 'a) is true, but 'a strictly outlives 'a is false (i.e. a lifetime is said to outlive itself, but it doesn't strictly outlive itself). Therefore, we might end up with foo having type Foo<'a> where 'a is exactly the lifetime of foo itself.

Now let's look at check(&foo, &outlived); (that's the second one). This one compiles because &outlived is coerced so that the lifetime is shortened to match foo's lifetime. That's valid because outlived has a longer lifetime than foo, and check's second argument is covariant over 'a because it's a reference.

Why doesn't check(&foo, &shortlived); compile? foo has a longer lifetime than &shortlived. check's second argument is covariant over 'a, but its first argument is contravariant over 'a, because Foo<'a> is contravariant. That is, both arguments are trying to pull 'a in opposite directions for this call: &foo is trying to enlarge &shortlived's lifetime (which is illegal), while &shortlived is trying to shorten &foo's lifetime (which is also illegal). There is no lifetime that will unify these two variables, therefore the call is invalid.


¹ That might actually be a simplification. I believe that the lifetime parameter of a reference actually represents the region in which the borrow is active, rather than the lifetime of the reference. In this example, both borrows would be active for the statement that contains the call to ensure_equal, so they would have the same type. But if you split the borrows to separate let statements, the code still works, so the explanation is still valid. That said, for a borrow to be valid, the referent must outlive the borrow's region, so when I'm thinking of lifetime parameters, I only care about the referent's lifetime and I consider borrows separately.

How to fix groovy.lang.MissingMethodException: No signature of method:

Because you are passing three arguments to a four-argument method. Also, you are not using the passed closure.

If you want to specify the operations to be made on top of the source contents, then use a closure. It would be something like this:

def copyAndReplaceText(source, dest, closure) {
    dest.write(closure(source.text))
}

// And you can keep your usage as:
copyAndReplaceText(source, dest) {
    it.replaceAll('Visa', 'Passport!!!!')
}

If you will always swap strings, pass both, as your method signature already states:

def copyAndReplaceText(source, dest, targetText, replaceText) {
    dest.write(source.text.replaceAll(targetText, replaceText))
}

copyAndReplaceText(source, dest, 'Visa', 'Passport!!!!')

What is the difference between a 'closure' and a 'lambda'?

A lambda is just an anonymous function - a function defined with no name. In some languages, such as Scheme, they are equivalent to named functions. In fact, the function definition is re-written as binding a lambda to a variable internally. In other languages, like Python, there are some (rather needless) distinctions between them, but they behave the same way otherwise.
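To make the "anonymous function" point concrete, here is a minimal Python sketch (the names add_named and add_anonymous are invented for illustration):

```python
# A function defined with def and a lambda bound to a name behave the
# same way when called.
def add_named(a, b):
    return a + b

add_anonymous = lambda a, b: a + b

print(add_named(2, 3))      # 5
print(add_anonymous(2, 3))  # 5

# One of Python's distinctions: a lambda has no real name of its own.
print(add_anonymous.__name__)  # <lambda>
```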

A closure is any function which closes over the environment in which it was defined. This means that it can access variables not in its parameter list. Examples:

def func(): return h

def anotherfunc(h):
    return func()

This will cause an error, because func does not close over the environment in anotherfunc - h is undefined. func only closes over the global environment. This will work:

def anotherfunc(h):
    def func(): return h
    return func()

Because here, func is defined in anotherfunc, and from Python 2.2 onward (when nested scopes became the default, though mutating a captured variable still didn't work), this means that it closes over anotherfunc's environment and can access variables inside of it. In Python 3, mutation works too, using the nonlocal keyword.
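That last point about mutating a captured variable with nonlocal can be sketched in a few lines (make_counter is an invented name for this example):

```python
def make_counter():
    count = 0
    def increment():
        # Without the nonlocal declaration, 'count += 1' would raise
        # UnboundLocalError, because the assignment would make count
        # a local variable of increment.
        nonlocal count
        count += 1
        return count
    return increment

counter = make_counter()
print(counter())  # 1
print(counter())  # 2
```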

Another important point - func will continue to close over anotherfunc's environment even when it's no longer being evaluated in anotherfunc. This code will also work:

def anotherfunc(h):
    def func(): return h
    return func

print(anotherfunc(10)())

This will print 10.

This, as you notice, has nothing to do with lambdas - they are two different (although related) concepts.
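To see how the two concepts relate anyway: a lambda can itself be a closure, whenever it captures a variable from its surrounding scope. A minimal sketch (make_adder is an invented name):

```python
def make_adder(n):
    # This lambda is anonymous (a lambda) *and* closes over n (a closure).
    return lambda x: x + n

add10 = make_adder(10)
print(add10(5))  # 15
```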

How to create an instance of an annotation

Well, it's apparently nothing all that complicated. Really!

As pointed out by a colleague, you can simply create an anonymous instance of the annotation (like any interface) like this:

MyAnnotation:

public @interface MyAnnotation
{
    String foo();
}

Invoking code:

class MyApp
{
    MyAnnotation getInstanceOfAnnotation(final String foo)
    {
        MyAnnotation annotation = new MyAnnotation()
        {
            @Override
            public String foo()
            {
                return foo;
            }

            @Override
            public Class<? extends Annotation> annotationType()
            {
                return MyAnnotation.class;
            }
        };

        return annotation;
    }
}

Credits to Martin Grigorov.


