Unexpected Behaviour with Argument Defaults

Default arguments are evaluated inside the scope of the function. Your f2 is similar (almost equivalent) to the following code:

f2 = function(y) {
    if (missing(y)) y = y
    y^2
}

This makes the scoping clearer and explains why your code doesn’t work.

Note that this is only true for default arguments; arguments that are explicitly passed are (of course) evaluated in the scope of the caller.

Lazy evaluation, on the other hand, has nothing to do with this: all arguments are lazily evaluated, but calling f2(y) works without complaint. To show that lazy evaluation always happens, consider this:

f3 = function(x) {
    message("x has not been evaluated yet")
    x
}

f3(message("NOW x has been evaluated"))

This will print, in this order:

x has not been evaluated yet
NOW x has been evaluated

Python functions with default parameters show unexpected behaviour

The default value is an expression that is evaluated only once, at the moment the function is defined/compiled. It is stored on the function object, so when this expression evaluates to a mutable object like a list, you get the effect you described. I don't know the rationale for this, but it is a feature of Python.

In [11]: def f(x=[], y=123):
    ...:     pass

In [12]: f.func_defaults
Out[12]: ([], 123)
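In Python 3 the attribute is spelled `__defaults__` rather than `func_defaults` (which is Python 2). A small sketch showing that the default list is created once and then shared across calls:

```python
def f(x=[], y=123):
    # the list in __defaults__ was created once, at def time
    x.append(len(x))
    return x

print(f.__defaults__)  # ([], 123)
print(f())             # [0]
print(f())             # [0, 1] -- the same list object is reused
print(f.__defaults__)  # ([0, 1], 123)
```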

virtual function default arguments behaviour

Default arguments are an entirely compile-time feature: the substitution of default arguments in place of missing arguments is performed at compile time. For this reason there is obviously no way default argument selection for member functions can depend on the dynamic (i.e. run-time) type of the object. It always depends on the static (i.e. compile-time) type of the object.

The call you wrote in your code sample is immediately interpreted by the compiler as bp->print(10) regardless of anything else.

Python - unexpected behaviour when using a named constructor argument with empty list default value

Because the default argument alist=[] creates only one list, once, when the module is read. This single list becomes the default argument for that __init__, and is shared by all your Thingys.

Try using None as a dummy symbol meaning "make a new empty list here". E.g.

def __init__(self, alist=None):
    super(ThingyWithAList, self).__init__()
    self.thelist = [] if alist is None else alist
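To check the fix, a simplified sketch of the class (the base class from the question is dropped here, so there is no super call); each instance now gets its own list:

```python
class ThingyWithAList:
    def __init__(self, alist=None):
        # None is the sentinel for "make a new empty list here"
        self.thelist = [] if alist is None else alist

a = ThingyWithAList()
b = ThingyWithAList()
a.thelist.append(1)
print(a.thelist)  # [1]
print(b.thelist)  # [] -- no longer shared
```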

Recursive default argument reference

And when I call fun() without any parameters the local variable a becomes a copy of the global variable a

No: default arguments are evaluated inside the scope of the function. Your code is similar to the following code:

fun = function(a) {
    if (missing(a)) a = a
    a + 1
}

This makes the scoping clearer and explains why your code doesn’t work.

Note that this is only true for default arguments; arguments that are explicitly passed are (of course) evaluated in the scope of the caller.

Least Astonishment and the Mutable Default Argument

Actually, this is not a design flaw, and it is not because of internals or performance. It comes simply from the fact that functions in Python are first-class objects, and not only a piece of code.

As soon as you think of it this way, it completely makes sense: a function is an object that is evaluated at its definition; default parameters are a kind of "member data" and therefore their state may change from one call to the next - exactly as in any other object.

In any case, the effbot (Fredrik Lundh) has a very nice explanation of the reasons for this behavior in Default Parameter Values in Python.
I found it very clear, and I really suggest reading it for a better knowledge of how function objects work.
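One way to see the "member data" view in action: the default's state persists between calls, much like an attribute on the function object. A sketch (hypothetical `counter` function, not a recommended idiom):

```python
def counter(cache={}):
    # the dict default acts as per-function state across calls
    cache['n'] = cache.get('n', 0) + 1
    return cache['n']

print(counter())  # 1
print(counter())  # 2
print(counter.__defaults__)  # ({'n': 2},)
```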

Python Default Argument List -- Inconsistent behavior

Modify your print statements to also print id(L):

def j(L=[]):
    print('j, before if:', L, id(L))
    if L == []:
        L = []
    print('j, after if: ', L, id(L))
    L.append(5)
    return L

Now check your results:

>>> j()
j, before if: [] 2844163925576
j, after if: [] 2844163967688
[5]

Note the difference in the IDs. By the time you get to the portion of the function where you modify L, it no longer refers to the same object as the default argument. You've rebound L to a new list (with the L = [] in the body of the if statement), and as a result, you are never changing the default argument.
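For contrast, a variant (hypothetical `k`) that mutates the default in place instead of rebinding it; here the change does persist across calls:

```python
def k(L=[]):
    if L == []:
        L.append(0)   # mutates the default list; no rebinding happens
    L.append(5)
    return L

print(k())  # [0, 5]
print(k())  # [0, 5, 5] -- the mutated default carries over
```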

Surprising behaviour when using functools.partial in default argument inside a class

functools.partial is not involved in this behavior.

This is because the default arg is created at function definition time, during the class definition, which occurs in the scope of the class body, so you don't need any "self.".

This is why this works:

class Example:
    def A(self):
        pass

    def B(self, x=print(A, 1)):
        print(self.A, 2)

    print(A, 3)

Output:

<function Example.A at 0x7fcf8ebc5950> 1
<function Example.A at 0x7fcf8ebc5950> 3

Even without calling Example.B(), it prints the <function Example.A at 0x7fcf8ebc5950> 1 and 3 lines when the class is defined.

But you can't create a partial with a method of a not-yet-created class.

Or maybe you can, but I am not aware of any way to do it.

I tried to fix your class:

from functools import partial

class A:
    @staticmethod
    def _add(a, b):
        return a + b

    @classmethod
    def f_add(cls):  # valid
        return cls._add

    @classmethod
    def lambda_add(cls):  # valid
        return lambda x, y: cls._add(x, y)

    @staticmethod
    def lambda_arg_add(l=lambda x, y: A._add(x, y)):  # valid
        return l

    @classmethod
    def partial_add(cls):  # valid
        return partial(cls._add)

    @staticmethod
    def partial_arg_add(f=partial(_add)):  # TypeError: 'staticmethod' object is not callable
        return f

But partial_arg_add will fail because _add is not callable yet when the partial is created:

class A:
    @staticmethod
    def _add(a, b):
        return a + b

    print(_add(1, 3))  # TypeError: 'staticmethod' object is not callable

class A:
    @staticmethod
    def _add(a, b):
        return a + b

print(A._add(1, 3))  # 4
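For completeness: inside the class body `_add` is the staticmethod wrapper, but its underlying function is reachable via `__func__`, so the partial default can be built that way (on Python 3.10+ the wrapper itself also became callable). A sketch under that assumption:

```python
from functools import partial

class A:
    @staticmethod
    def _add(a, b):
        return a + b

    # _add is a staticmethod object here; __func__ is the plain function
    @staticmethod
    def partial_arg_add(f=partial(_add.__func__)):
        return f

print(A.partial_arg_add()(1, 3))  # 4
```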

