Differences between functools.partial and a similar lambda?
A lambda has the same type as a plain function, so when stored as a class attribute it is bound like an instance method on attribute access (the instance is passed as the first argument); a `partial` object is not a function and receives no such binding.
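This binding difference can be seen directly (a minimal sketch; `greet`, `A`, and the attribute names are made up for illustration):

```python
import functools

def greet(obj, name):
    return f"{type(obj).__name__} greets {name}"

class A:
    # a lambda is an ordinary function, so attribute access binds it:
    # the instance is passed as the first argument automatically
    as_lambda = lambda self, name: greet(self, name)
    # a partial object is not a function and is not a descriptor,
    # so no binding happens: the instance must be passed explicitly
    as_partial = functools.partial(greet)

a = A()
print(a.as_lambda("Bob"))      # bound -> greet(a, "Bob") -> "A greets Bob"
print(a.as_partial(a, "Bob"))  # not bound: pass the instance yourself
```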
The `partial` object in your example can be called like `g1(x, y, z)`, leading to this call (not valid Python syntax, but you get the idea): `f(*secondary_args, x, y, z, **secondary_kwargs)`
The lambda only accepts a single argument and uses a different argument order. (Of course both of these differences can be overcome – I'm just answering what the differences between the two versions you gave are.)
Execution of the `partial` object is slightly faster than execution of the equivalent `lambda`.
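A quick way to check the speed claim on your own machine (a rough sketch; absolute numbers vary by interpreter version and hardware):

```python
import functools
import timeit

def f(a, b, c):
    return a + b + c

g1 = functools.partial(f, 1)   # dispatches in C
g2 = lambda x, y: f(1, x, y)   # adds an extra Python-level frame

print(timeit.timeit('g1(2, 3)', globals=globals()))
print(timeit.timeit('g2(2, 3)', globals=globals()))
```

Both callables compute the same result; only the call overhead differs.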
How does functools partial do what it does?
Roughly, `partial` does something like this (apart from keyword args support etc.):
```python
def partial(func, *part_args):
    def wrapper(*extra_args):
        args = list(part_args)
        args.extend(extra_args)
        return func(*args)

    return wrapper
```
So, by calling `partial(sum2, 4)` you create a new function (a callable, to be precise) that behaves like `sum2`, but with one positional argument fewer. That missing argument is always substituted by `4`, so that `partial(sum2, 4)(2) == sum2(4, 2)`.
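For concreteness (assuming `sum2` is a plain two-argument function, as in the question):

```python
from functools import partial

def sum2(a, b):
    return a + b

add4 = partial(sum2, 4)   # freezes the first positional argument to 4
print(add4(2))            # sum2(4, 2) -> 6
assert add4(2) == sum2(4, 2)
```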
As for why it's needed, there's a variety of cases. Just for one, suppose you have to pass a function somewhere where it's expected to have 2 arguments:
```python
class EventNotifier(object):
    def __init__(self):
        self._listeners = []

    def add_listener(self, callback):
        ''' callback should accept two positional arguments, event and params '''
        self._listeners.append(callback)

    # ...

    def notify(self, event, *params):
        for f in self._listeners:
            f(event, params)
```
But a function you already have needs access to some third `context` object to do its job:

```python
def log_event(context, event, params):
    context.log_event("Something happened %s, %s", event, params)
```
So, there are several solutions:
A custom object:

```python
class Listener(object):
    def __init__(self, context):
        self._context = context

    def __call__(self, event, params):
        self._context.log_event("Something happened %s, %s", event, params)

notifier.add_listener(Listener(context))
```
A lambda:

```python
log_listener = lambda event, params: log_event(context, event, params)
notifier.add_listener(log_listener)
```
With partials:

```python
context = get_context()  # whatever
notifier.add_listener(partial(log_event, context))
```
Of those three, `partial` is the shortest and the fastest. (For a more complex case you might want a custom object, though.)
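Wiring the pieces together with a stub `Context` class (a hypothetical stand-in for whatever `get_context()` would return; the stub just records log lines so the example is self-contained):

```python
from functools import partial

class EventNotifier:
    def __init__(self):
        self._listeners = []

    def add_listener(self, callback):
        self._listeners.append(callback)

    def notify(self, event, *params):
        for f in self._listeners:
            f(event, params)

class Context:
    """Stub for whatever get_context() would return."""
    def __init__(self):
        self.lines = []

    def log_event(self, fmt, *args):
        self.lines.append(fmt % args)

def log_event(context, event, params):
    context.log_event("Something happened %s, %s", event, params)

context = Context()
notifier = EventNotifier()
notifier.add_listener(partial(log_event, context))  # context is pre-bound
notifier.notify("click", 10, 20)
print(context.lines)   # ['Something happened click, (10, 20)']
```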
functools.partial vs normal Python function
Functions do have information about what arguments they accept. The attribute names and data structures are aimed at the interpreter more than the developer, but the info is there. You'd have to introspect the `.func_defaults` and `.func_code.co_varnames` structures (and a few more besides) to find those details; those are the Python 2 names, in Python 3 they are `.__defaults__` and `.__code__.co_varnames`. Using the `inspect.getargspec()` function (superseded by `inspect.signature()` and removed in Python 3.11) makes extracting the info a little more straightforward:
```python
>>> import inspect
>>> h = lambda x: int(x, base=2)
>>> inspect.getargspec(h)
ArgSpec(args=['x'], varargs=None, keywords=None, defaults=None)
```
Note that a `lambda` produces the exact same object type as a `def funcname():` function statement produces.
What this doesn't give you is what arguments are going to be passed into the wrapped function. That's because a function has a more generic use, while `functools.partial()` is specialised and thus can easily provide you with that information. As such, the partial tells you that `base=2` will be passed into `int()`, but the lambda can only tell you it receives an argument `x`.
So, while a `functools.partial()` object can tell you what arguments are going to be passed into what function, a function object can only tell you what arguments it receives, as it is the job of the (potentially much more complex) expressions that make up the function body to do the call to a wrapped function. And that is ignoring all those functions that don't call other functions at all.
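You can see this by inspecting the attributes a `partial` object exposes:

```python
from functools import partial

p = partial(int, base=2)
print(p.func)       # the wrapped callable: <class 'int'>
print(p.args)       # frozen positional arguments: ()
print(p.keywords)   # frozen keyword arguments: {'base': 2}
print(p("101"))     # int("101", base=2) -> 5
```

The `.func`, `.args`, and `.keywords` attributes are part of the documented `functools.partial` API.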
Python functools partial efficiency
Why do the calls to the partial functions take longer?
The code with `partial` takes about twice as long because of the additional function call. Function calls are expensive:
Function call overhead in Python is relatively high, especially compared with the execution speed of a builtin function.
Is the partial function just forwarding the parameters to the original function or is it mapping the static arguments throughout?
As far as I know, yes: it just forwards the arguments to the original function.
And also, is there a function in Python to return the body of a function filled in given that all the parameters are predefined, like with function i?
No, I am not aware of such a built-in function in Python. But I think it's possible to do what you want, as functions are objects which can be copied and modified.
Here is a prototype:
```python
import timeit
import types

# http://stackoverflow.com/questions/6527633/how-can-i-make-a-deepcopy-of-a-function-in-python
def copy_func(f, name=None):
    # build a new function object sharing the original's code and globals
    return types.FunctionType(f.__code__, f.__globals__, name or f.__name__,
                              f.__defaults__, f.__closure__)

def f(a, b, c):
    return a + b + c

i = copy_func(f, 'i')
i.__defaults__ = (4, 5, 3)  # make all three parameters default to 4, 5, 3

print(timeit.timeit('f(4,5,3)', setup='from __main__ import f', number=100000))
print(timeit.timeit('i()', setup='from __main__ import i', number=100000))
```
which gives:
0.0257439613342
0.0221881866455
Lambda or functools.partial for deferred function evaluation?
A lambda does not capture the current value of a variable; it looks the variable up when the lambda is called (late binding). This can lead to surprising behaviour:
```python
def lambdas(*args):
    for arg in args:
        yield lambda: str(arg)

one, two, three, four = lambdas(1, 2, 3, 4)
print(one(), two(), three(), four())
```
Expected output
1 2 3 4
Output
4 4 4 4
This happens because the lambda didn't store the value of `arg`; by the time the lambdas are called, the loop has already gone through all the elements of `args`, so `arg` is always `4`.
The preferred way is to use `functools.partial`, which stores the argument values at the moment the partial is created:
```python
from functools import partial

def partials(*args):
    for arg in args:
        yield partial(str, arg)

one, two, three, four = partials(1, 2, 3, 4)
print(one(), two(), three(), four())
```
Expected output
1 2 3 4
Output
1 2 3 4
Are these `.` attribute bindings necessary in the implementation of `functools.partial`?
I think you can look at the `partial` class implementation to help you understand better. The following is from Python 3.9.5:
```python
class partial:
    """New function with partial application of the given arguments
    and keywords.
    """

    __slots__ = "func", "args", "keywords", "__dict__", "__weakref__"

    def __new__(cls, func, /, *args, **keywords):
        if not callable(func):
            raise TypeError("the first argument must be callable")

        if hasattr(func, "func"):
            args = func.args + args
            keywords = {**func.keywords, **keywords}
            func = func.func

        self = super(partial, cls).__new__(cls)

        self.func = func
        self.args = args
        self.keywords = keywords
        return self

    def __call__(self, /, *args, **keywords):
        keywords = {**self.keywords, **keywords}
        return self.func(*self.args, *args, **keywords)

    ...
```
When you replace `self` with `newfunc`, they're pretty much the same.
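One consequence of the `hasattr(func, "func")` branch in `__new__` above is that nested partials are flattened into a single one:

```python
from functools import partial

def f(a, b, c):
    return (a, b, c)

inner = partial(f, 1)
outer = partial(inner, 2)   # __new__ detects the inner partial and flattens it

print(outer.func)   # f itself, not the inner partial
print(outer.args)   # (1, 2)
print(outer(3))     # f(1, 2, 3) -> (1, 2, 3)
```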
Performant way to partially apply in Python?
The first way seems to be the most efficient. I tweaked your code so that all 4 functions compute exactly the same mathematical function:
```python
import functools, timeit

def multiplier(m):
    def inner(x):
        return m * x
    return inner

def mult(x, m):
    return m * x

def multer(m):
    return functools.partial(mult, m=m)

f1 = multiplier(2)
f2 = multer(2)
f3 = functools.partial(mult, m=2)
f4 = lambda x: mult(x, 2)

print(timeit.timeit('f1(10)', setup='from __main__ import f1'))
print(timeit.timeit('f2(10)', setup='from __main__ import f2'))
print(timeit.timeit('f3(10)', setup='from __main__ import f3'))
print(timeit.timeit('f4(10)', setup='from __main__ import f4'))
```
Typical output (on my machine):
0.08207898699999999
0.19439769299999998
0.20093803199999993
0.1442435820000001
The two `functools.partial` approaches are identical (since one of them is just a wrapper for the other), the closure is about twice as fast, and the lambda is somewhere in between (but closer to the closure). There is a clear overhead in using `functools.partial` over a straightforward closure. Since the closure approach is arguably more readable as well (and more flexible than the lambda, which doesn't extend well to more complicated functions), I would just go with it.
a simplified signature in functools.partial
The signature is how the function is called. If you have a function that needs five strings, its signature is

`foo(st1, st2, st3, st4, st5)`

If you now use `partial` to freeze three of them, it only needs two:

`foo(st1, st2)`

which is "simplified" because it takes fewer parameters.
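On Python 3, `inspect.signature()` reports exactly this reduced signature (a small sketch; the frozen values "c", "d", "e" are made up for illustration):

```python
from functools import partial
import inspect

def foo(st1, st2, st3, st4, st5):
    return " ".join((st1, st2, st3, st4, st5))

bar = partial(foo, st3="c", st4="d", st5="e")   # freeze three of the five

# inspect.signature() understands partial objects and shows
# the remaining required parameters
print(inspect.signature(foo))   # (st1, st2, st3, st4, st5)
print(inspect.signature(bar))   # (st1, st2, *, st3='c', st4='d', st5='e')
print(bar("a", "b"))            # a b c d e
```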
Hope it helps.