What Are Good Uses for Python3's "Function Annotations"

What are good uses for Python3's Function Annotations?

I think this is actually great.

Coming from an academic background, I can tell you that annotations have proved invaluable for enabling smart static analyzers for languages like Java. For instance, you could define semantics like state restrictions, which threads are allowed to access a method, architecture limitations, and so on, and there are quite a few tools that can then read these and process them to provide assurances beyond what you get from the compiler. You could even write things that check preconditions and postconditions.

I feel something like this is especially needed in Python because of its weaker typing, but there were really no constructs that made this straightforward and part of the official syntax.

There are other uses for annotations beyond assurance. I can see how I could apply my Java-based tools to Python. For instance, I have a tool that lets you attach special warnings to methods and flags calls to them, indicating that you should read their documentation (e.g., imagine a method that must not be invoked with a negative value, but where that isn't intuitive from the name). With annotations, I could technically write something like this for Python. Similarly, a tool that organizes the methods in a large class based on tags could be written, now that there is an official syntax.
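For illustration, here is a minimal sketch of that kind of tool, built on nothing but the __annotations__ dict and a decorator. The decorator name and the convention of treating plain strings as warnings are my own assumptions, not the tool described above.

import functools
import warnings

def usage_warning(func):
    """Hypothetical decorator: surface string annotations as usage warnings."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Any parameter annotated with a plain string is treated as a caller warning.
        for name, note in func.__annotations__.items():
            if isinstance(note, str):
                warnings.warn("%s(%s): %s" % (func.__name__, name, note))
        return func(*args, **kwargs)
    return wrapper

@usage_warning
def set_offset(value: "must not be called with a negative value"):
    return value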

What good are Python function annotations?

As you mentioned, the relevant PEP is 3107 (linked for easy reference in case others encountering this question haven't read it yet).

For now, annotations are kind of an experiment, and kind of a work in progress. There is actually a recent thread in the python-ideas mailing list on the topic which may be helpful. (The link provided is just for the monthly archive; I find that the URL for specific posts tends to change periodically. The thread in question is near the beginning of December, and titled "[Python-ideas] Conventions for function annotations". The first post is from Thomas Kluyver on Dec 1.)

Here's a bit from one of Guido van Rossum's posts in that thread:

On 12/4/2012 11:43 AM, Jasper St. Pierre wrote:

Indeed. I've looked at annotations before, but I never understood the
purpose. It seemed like a feature that was designed and implemented without
some goal in mind, and where the community was supposed to discover the goal
themselves.

Guido's response:

To the contrary. There were too many use cases that immediately looked
important, and we couldn't figure out which ones would be the most
important or how to combine them, so we decided to take a two-step
approach: in step 1, we designed the syntax, whereas in step 2, we
would design the semantics. The idea was very clear that once the
syntax was settled people would be free to experiment with different
semantics -- just not in the stdlib. The idea was also that
eventually, from all those experiments, one would emerge that would be
fit for the stdlib.

Jasper St. Pierre:

So, if I may ask, what was the original goal of annotations? The PEP gives
some suggestions, but doesn't leave anything concrete. Was it designed to be
an aid to IDEs, or static analysis tools that inspect source code? Something
for applications themselves to munge through to provide special behaviors,
like a command line parser, or runtime static checker?

Guido's response:

Pretty much all of the above to some extent. But for me personally,
the main goal was always to arrive at a notation to specify type
constraints (and maybe other constraints) for arguments and return
values. I've toyed at various times with specific ways of combining
types. E.g. list[int] might mean a list of integers, and dict[str,
tuple[float, float, float, bool]] might mean a dict mapping strings to
tuples of three floats and a bool. But I felt it was much harder to
get consensus about such a notation than about the syntax for argument
annotations (think about how many objections you can bring in to these
two examples :-) -- I've always had a strong desire to use "var: type
= default" and to make the type a runtime expression to be evaluated
at the same time as the default.

And a tiny bit of humor from Ned Batchelder:

A telling moment for me was during an early Py3k keynote at PyCon (perhaps
it was in Dallas or Chicago?), Guido couldn't remember the word
"annotation," and said, "you know, those things that aren't type
declarations?" :-)
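As an aside, the notation Guido sketches there did eventually become standard: the typing module (PEP 484) introduced List[int] and friends, and since Python 3.9 (PEP 585) the built-in types can be used directly. A small illustration (my own example, not from the thread):

# Requires Python 3.9+; on 3.5-3.8 use typing.List, typing.Dict and typing.Tuple instead.
def summarize(scores: list[int]) -> dict[str, tuple[float, float, float, bool]]:
    ...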

function annotations in python

If you are using Python 3.5 or later, the best way is to use typing.Union:

>>> from typing import Union
>>> import numpy as np
>>> def fun(data: Union[np.ndarray, list]):
...     pass
...

You could alternatively use typing.TypeVar if you find yourself writing the same Union[t1, t2, ...] repeatedly. (Plus, you can add and remove types from the TypeVar more easily than editing many Unions scattered through your code.)

>>> from typing import TypeVar
>>> import numpy as np
>>> import array
>>> Ar = TypeVar('Ar', np.ndarray, list, array.array)

This code would then associate Ar with lists, array.arrays, and numpy arrays.
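As a quick usage sketch (the function name is mine, purely for illustration), the same TypeVar can annotate several signatures and also ties the return type to the argument type:

>>> def first_half(data: Ar) -> Ar:
...     return data[:len(data) // 2]
...
>>> first_half([1, 2, 3, 4])
[1, 2]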

Python3 function annotations for type hinting versus Boo

Boo is a great Python-like statically typed language, but keep in mind that there are more differences than just static typing. In fact, you can also do duck typing in Boo.

Technically, I'd say the biggest difference is that Boo runs on Mono/.NET, so the libraries and framework are totally different.

SharpDevelop and MonoDevelop both have good support for Boo. There is also a Visual Studio 2010 plugin that adds Boo support; it is still in alpha, yet already usable.

PYTHON3: How to use a type defined later for Function Annotations?

I found a workaround that is similar to forward declarations in C/C++.

Tested in PyCharm 3.

# Forward-declare empty stubs so the names exist for the annotations below.
class A: pass
class B: pass

class A(object):

    def foo(self, b: B) -> B:
        # CAN auto complete
        b.bar()
        return B()

class B(object):

    def bar(self) -> A:
        return A()

# CAN auto complete
A().foo(B()).bar()
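As an aside beyond this workaround, the type-hinting PEPs provide cleaner alternatives: PEP 484 allows a string literal as a forward reference, and from __future__ import annotations (PEP 563, Python 3.7+) defers evaluation of all annotations, so no stub classes are needed. A minimal sketch:

from __future__ import annotations  # Python 3.7+: annotations are no longer evaluated eagerly

class A:
    def foo(self, b: B) -> B:   # B may also be written as the string 'B' (PEP 484)
        b.bar()
        return B()

class B:
    def bar(self) -> A:
        return A()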

Examples of open-source projects using Python 3 function annotations

I have never seen this feature used in the wild. However, one potential use of function annotations that I explored in an article on Python 3 that I wrote for USENIX ;Login: was for enforcing contracts. For example, you could do this:

from functools import wraps

def positive(x):
    'must be positive'
    return x > 0

def negative(x):
    'must be negative'
    return x < 0

def ensure(func):
    'Decorator that enforces contracts on function arguments (if present)'
    return_check = func.__annotations__.get('return', None)
    arg_checks = [(name, func.__annotations__.get(name))
                  for name in func.__code__.co_varnames]

    @wraps(func)
    def assert_call(*args):
        for (name, check), value in zip(arg_checks, args):
            if check:
                assert check(value), "%s %s" % (name, check.__doc__)
        result = func(*args)
        if return_check:
            assert return_check(result), "return %s" % (return_check.__doc__)
        return result
    return assert_call

# Example use
@ensure
def foo(a: positive, b: negative) -> positive:
    return a - b

If you do this, you'll see behavior like this:

>>> foo(2, -3)
5
>>> foo(2, 3)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "ensure.py", line 22, in assert_call
    assert check(value), "%s %s" % (name, check.__doc__)
AssertionError: b must be negative

I should note that the above example needs to be fleshed out more to work properly with default arguments, keyword arguments, and other details. It's only a sketch of an idea.

Now, whether or not this is a good idea, I just don't know. I'm inclined to agree with Brandon that the lack of composability is a problem--especially if annotations start to be used by different libraries for different purposes. I also wonder if something like this contract idea couldn't be accomplished through decorators instead. For example, making a decorator that could be used like this (implementation left as an exercise):

@ensure(a=positive, b=negative)
def foo(a, b):
    return a - b
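For what it's worth, here is one possible sketch of that decorator (my own filling-in of the exercise, not the author's implementation; it replaces the annotation-based ensure above, reuses the positive/negative checkers, and ignores keyword arguments for brevity):

from functools import wraps

def ensure(**checks):
    'Hypothetical decorator factory: map argument names to checker functions.'
    def decorate(func):
        arg_names = func.__code__.co_varnames[:func.__code__.co_argcount]
        @wraps(func)
        def wrapper(*args):
            for name, value in zip(arg_names, args):
                check = checks.get(name)
                if check:
                    assert check(value), "%s %s" % (name, check.__doc__)
            return func(*args)
        return wrapper
    return decorate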

As a historical note, I've always kind of felt that function annotations were an outgrowth of discussions about "optional static typing" that the Python community had more than 10 years ago. Whether or not that was the original motivation, I just don't know.

How to annotate a Python3 method that returns self?

As of Python 3.11, you can use typing.Self to annotate the return type:

from typing import Self

class X:
    def yaya(self, x: int):
        # Do stuff here
        pass

    def chained_yaya(self, x: int) -> Self:
        # Do stuff here
        return self
  • PEP 673
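On versions before 3.11, a common alternative (a sketch, not part of the original answer) is a TypeVar bound to the class, which also keeps the return type tied to the subclass the method is called on:

from typing import TypeVar

TX = TypeVar("TX", bound="X")  # hypothetical name, bound to the class below

class X:
    def chained_yaya(self: TX, x: int) -> TX:
        # Do stuff here
        return self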

What does -> mean in Python function definitions?

It's a function annotation: the expression after -> annotates the function's return value.

In more detail, Python 2.x has docstrings, which allow you to attach a metadata string to various types of object. This is amazingly handy, so Python 3 extends the feature by allowing you to attach metadata to functions describing their parameters and return values.

There's no preconceived use case, but the PEP suggests several. One very handy one is to allow you to annotate parameters with their expected types; it would then be easy to write a decorator that verifies the annotations or coerces the arguments to the right type. Another is to allow parameter-specific documentation instead of encoding it into the docstring.
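A minimal illustration (my own example, in the spirit of PEP 3107): the annotations carry no built-in meaning; they are simply stored on the function object:

def force(mass: "kg", acceleration: "m/s^2") -> "newtons":
    return mass * acceleration

print(force.__annotations__)
# {'mass': 'kg', 'acceleration': 'm/s^2', 'return': 'newtons'}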


