Dynamic Inheritance in Python

Dynamic inheritance in Python

Simply store the class object in a variable (in the example below it is named base) and use that variable in the base-class list of your class statement.

def get_my_code(base):

    class MyCode(base):
        def initialize(self):
            ...

    return MyCode

my_code = get_my_code(ParentA)
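
A minimal usage sketch (ParentA here is just a stand-in base class with one method):

class ParentA:
    def greet(self):
        return "hello from ParentA"

my_code = get_my_code(ParentA)
obj = my_code()
print(isinstance(obj, ParentA))   # True
print(obj.greet())                # "hello from ParentA", inherited from the chosen base
print(my_code.__mro__)            # (MyCode, ParentA, object)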

Python dynamic inheritance: How to choose base class upon instance creation?

What about defining the ImageZIP class at function level?

This will enable your dynamic inheritance.

def image_factory(path):
    # ...

    if format == ".gz":
        image = unpack_gz(path)
        format = os.path.splitext(image)[1][1:]
        if format == "jpg":
            return MakeImageZIP(ImageJPG, image)
        elif format == "png":
            return MakeImageZIP(ImagePNG, image)
        else:
            raise Exception('The format "' + format + '" is not supported.')

def MakeImageZIP(base, path):
    '''`base` is either ImageJPG or ImagePNG.'''

    class ImageZIP(base):

        # ...

    return ImageZIP(path)

Edit: without needing to change image_factory

def ImageZIP(path):

    path = unpack_gz(path)
    format = os.path.splitext(path)[1][1:]

    if format == "jpg": base = ImageJPG
    elif format == "png": base = ImagePNG
    else: raise_unsupported_format_error()

    class ImageZIP(base):  # would it be better to use ImageZip_.__name__ = "ImageZIP" ?
        # ...

    return ImageZIP(path)
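
A hypothetical usage sketch (ImagePNG, ImageJPG and unpack_gz come from the original question and are assumed to exist):

img = ImageZIP("holiday.png.gz")     # unpack_gz() is assumed to yield "holiday.png"
print(isinstance(img, ImagePNG))     # True: the returned object inherits from the chosen base
print(type(img).__mro__)             # (ImageZIP, ImagePNG, ..., object)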

dynamic inheritance with type and super

The parameterless form of super() relies on it being physically placed inside a class body - the Python machinery will then, under the hood, create a __class__ cell variable referring to that "physical" class (roughly equivalent to a non-local variable), and place it as the first parameter in the super() call.

For methods not written inside class statements, one has to resort to passing the parameters to super() explicitly: the child class and the instance (self).
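
A small sketch of the difference (Base and Child are stand-in names):

class Base:
    def hello(self):
        return "base"

class Child(Base):
    def hello(self):
        # Zero-argument super() works here: the method is compiled inside a
        # class body, so a __class__ cell is created for it automatically.
        return "child -> " + super().hello()

def hello_outside(self):
    # Compiled outside any class body: no __class__ cell, so the class and
    # the instance must be passed explicitly.
    return "patched -> " + super(Child, self).hello()

print(Child().hello())      # child -> base
Child.hello = hello_outside
print(Child().hello())      # patched -> base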

The easiest way to do that in your code is to define the methods inside your factory function, so they can share a non-local variable containing the newly created class in the super() call:


def class_B_factory(parent_class):

    def B_init(self, **kwargs):
        nonlocal newcls  # <- a bit redundant, but shows how it is used here
        super(newcls, self).__init__(**kwargs)

    def another_method(self):
        return 1

    newcls = type(
        'B',
        (parent_class, some_other_parent_class),
        {'__init__': B_init,
         'another_method': another_method
         }
    )
    return newcls
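
A hypothetical usage sketch. A and SomeMixin are stand-ins; some_other_parent_class has to exist at module level because the factory body refers to it by name:

class SomeMixin:
    def mixin_method(self):
        return "mixin"

some_other_parent_class = SomeMixin   # global name looked up inside the factory

class A:
    def __init__(self, **kwargs):
        print("A.__init__ got", kwargs)

B = class_B_factory(A)
b = B(color="red")          # prints: A.__init__ got {'color': 'red'}
print(b.another_method())   # 1
print(B.__mro__)            # (B, A, SomeMixin, object)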

If you have to define the methods outside of the factory function (which is likely), you have to pass the parent class into them in some form. The most straightforward way is to add a named parameter (say __class__ or "parent_class") and use functools.partial inside the factory to bind parent_class into all methods in a lazy way:


from functools import partial
from inspect import signature

class A:
    ...

# the "parent_class" argument name is given special treatment in the factory function:
def B_init(self, *, parent_class=None, **kwargs):
    super(parent_class, self).__init__(**kwargs)

def another_method(self):
    return 1

def class_B_factory(parent_class, additional_methods):
    methods = {}
    for name, method in additional_methods.items():
        if "parent_class" in signature(method).parameters:
            # (functools.partialmethod can be used here instead if the wrapped
            # callable must still bind self when looked up as a method)
            method = partial(method, parent_class=parent_class)
        # we populate another dict instead of replacing entries in place,
        # so that the dict at the calling site is not modified.
        methods[name] = method

    newcls = type(
        'B',
        (parent_class, some_other_parent_class),
        methods
    )
    return newcls

new_cls = class_B_factory(A, {"__init__": B_init, "another_method": another_method})

Enum inheriting from collections.abc.Set

Actually, what you are calling "2", that is, registering your enum as a virtual subclass, is not a solution for your problem at all: registering a class as a virtual subclass does not change any of its behaviors, nor does it add methods or attributes - it simply makes isinstance() return True for its instances, and likewise the corresponding issubclass() call.
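
A quick illustration (Bag is a throwaway class used only to show what register() does and does not do):

from collections.abc import Set

class Bag:                 # a plain class with none of Set's methods
    pass

Set.register(Bag)          # register Bag as a *virtual* subclass of Set

print(issubclass(Bag, Set))            # True
print(isinstance(Bag(), Set))          # True
print(hasattr(Bag(), '__contains__'))  # False - no behavior was added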

What you have is a classic metaclass conflict - and since both metaclasses are well behaved and built to be used cooperatively, and do not explicitly conflict anywhere, all you have to do is create a new metaclass that inherits from both metaclasses:
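
To see the conflict itself, here is a minimal reproduction (the exact wording of the message may vary across Python versions):

from collections.abc import Set
from enum import Enum

try:
    class Broken(Set, Enum):   # ABCMeta and EnumMeta compete for the class
        pass
except TypeError as exc:
    print(exc)   # metaclass conflict: the metaclass of a derived class must be
                 # a (non-strict) subclass of the metaclasses of all its bases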

In [29]: from collections.abc import Set

In [30]: from enum import Enum, EnumMeta

In [31]: from abc import ABCMeta

In [33]: class CustomMeta(ABCMeta, EnumMeta): pass

In [34]: class MyClass(Set, Enum, metaclass=CustomMeta):
    ...:     AB = {1}
    ...:     CD = {1, 2}
    ...:
    ...:     def __iter__(self):
    ...:         return iter(self.value)
    ...:
    ...:     def __contains__(self, key):
    ...:         return key in self.value
    ...:
    ...:     def __len__(self):
    ...:         return len(self.value)
    ...:

In [35]: list(MyClass.AB)
Out[35]: [1]

In [36]: list(MyClass.CD)
Out[36]: [1, 2]
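
A short follow-up sketch (plain Python, continuing from the session above): with the return statements in place, the Set mixin methods work as well.

print(1 in MyClass.AB)           # True
print(len(MyClass.CD))           # 2
print(MyClass.AB <= MyClass.CD)  # True - subset test from collections.abc.Set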

Dynamically derive a class in python

I assume you mean base_type instead of parent type. But the following should work:

class C():
    def __init__(self, base_type):
        if base_type == 'A':
            self.__class__ = A
        else:
            self.__class__ = B

Some more details on this approach can be found here: http://harkablog.com/dynamic-state-machines.html
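
A self-contained sketch of the same idea, with stand-in A and B classes, to show the effect of reassigning __class__:

class A:
    def speak(self):
        return "I behave like A"

class B:
    def speak(self):
        return "I behave like B"

class C:
    def __init__(self, base_type):
        # swap the instance's class at construction time
        self.__class__ = A if base_type == 'A' else B

c = C('A')
print(type(c).__name__)   # A
print(c.speak())          # I behave like A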

dynamic class inheritance using super

Solution 1: Using cls = type('ClassName', ...)

Note that the solution of sadmicrowave creates an infinite loop if the dynamically created class is itself inherited from, because self.__class__ will then correspond to the child class.

An alternative that does not have this issue is to assign __init__ after creating the class, so that the class can be referenced explicitly in the super call. Example:

# Base class
class A():
    def __init__(self):
        print('A')

# Dynamically created class
B = type('B', (A,), {})

def __init__(self):
    print('B')
    super(B, self).__init__()

B.__init__ = __init__

# Child class
class C(B):
    def __init__(self):
        print('C')
        super().__init__()

C() # print C, B, A

Solution 2: Using MyClass.__name__ = 'ClassName'

An alternative way to dynamically create a class is to define it inside the function, then reassign the __name__ and __qualname__ attributes:

class A:

    def __init__(self):
        print(A.__name__)


def make_class(name, base):

    class Child(base):
        def __init__(self):
            print(Child.__name__)
            super().__init__()

    Child.__name__ = name
    Child.__qualname__ = name
    return Child


B = make_class('B', A)


class C(B):

    def __init__(self):
        print(C.__name__)
        super().__init__()


C() # Display C B A

Dynamic inheritance from Model

I'd say that the best way to handle your use-case is slightly different. You can pass a function to upload_to. That function is called with the model instance and the filename of the uploaded file (docs).

You can check the class of the instance with isinstance or something similar and change the returned path accordingly.

def upload_path(instance, filename):
    if isinstance(instance, User):
        return 'user_{0}/{1}'.format(instance.user.id, filename)
    return 'photo/{0}'.format(filename)

Doing the same for default is a little bit harder. A default callable is not passed any reference to the model or field, which makes it harder to vary the default per model. You could simply add an image field to each of the concrete models, but I don't know if that would be appropriate.
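
A hedged sketch of that last idea (the model names and default helpers here are hypothetical, and upload_path is the callable from the snippet above):

from django.db import models

def default_user_photo():
    return 'defaults/user.png'

def default_product_photo():
    return 'defaults/product.png'

class UserProfile(models.Model):
    # each concrete model declares its own field, so each one can carry
    # its own default while reusing the shared upload_path callable
    photo = models.ImageField(upload_to=upload_path, default=default_user_photo)

class ProductProfile(models.Model):
    photo = models.ImageField(upload_to=upload_path, default=default_product_photo)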


