Why Does Including This Module Not Override a Dynamically-Generated Method

Why does including this module not override a dynamically-generated method?

Let's do an experiment:

class A; def x; 'hi' end end
module B; def x; super + ' john' end end
A.class_eval { include B }

A.new.x
=> "hi" # oops

Why is that? The answer is simple:

A.ancestors
=> [A, B, Object, Kernel, BasicObject]

A comes before B in the ancestors chain (you can think of B as being inserted behind A). Method lookup walks that chain left to right, so A#x always takes priority over B#x.
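
You can confirm this from the console by asking which class or module actually owns the method that will be called (a quick check, reusing the A and B above):

A.new.method(:x).owner
=> A # lookup stops at A, so B#x is never reached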

However, this can be worked around:

class A
  def x
    'hi'
  end
end

module B
  # Define a method with a different name
  def x_after
    x_before + ' john'
  end

  # And set up aliases on the inclusion :)
  # We can use `alias new_name old_name`
  def self.included(klass)
    klass.class_eval {
      alias :x_before :x
      alias :x :x_after
    }
  end
end

A.class_eval { include B }

A.new.x #=> "hi john"

With ActiveSupport (and therefore Rails) this pattern is available as alias_method_chain(target, feature) (http://apidock.com/rails/Module/alias_method_chain):

module B
  def self.included(base)
    base.alias_method_chain :x, :feature
  end

  def x_with_feature
    x_without_feature + " John"
  end
end

Update: Ruby 2.0 introduced Module#prepend, which does override the methods of A, making this alias hack unnecessary for most use cases (alias_method_chain itself was deprecated in Rails 5 in favour of prepend).
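
For reference, here is a minimal sketch of the prepend variant, reusing A and B from the first example (assumes Ruby >= 2.0):

class A
  def x
    'hi'
  end
end

module B
  def x
    super + ' john'
  end
end

class A
  prepend B # B is inserted *before* A in the lookup chain
end

A.ancestors
=> [B, A, Object, Kernel, BasicObject]

A.new.x
=> "hi john"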

Dynamically extend existing method or override send method in Ruby

Here's a solution. Although it's based on module inclusion rather than on inheriting from a class, I hope you will still find it useful.

module Parent
  def self.included(child)
    child.class_eval do
      def prepare_for_work
        puts "preparing to do some work"
      end

      # back up the original method under a new name
      alias_method :old_work, :work

      # replace it with a new version that injects the 'prepare' step
      def work
        prepare_for_work
        old_work
      end
    end
  end
end

class FirstChild
  def work
    puts "doing some work"
  end

  include Parent # include at the end of the class, so that #work is already defined
end

fc = FirstChild.new
fc.work
# >> preparing to do some work
# >> doing some work

Is overriding a module's method a good convention?

The rule I usually follow is: when the method has to be defined in the class that includes the module (e.g. the module acts as an interface), I always do:

def method_that_needs_to_be_defined
  raise NoMethodError
end

It's good practice and prevents unexpected calls to a method that hasn't been defined yet.

Example:

module Speaker
  def speak
    raise NoMethodError
  end
end

class Bird < Animal
  include Speaker

  def speak
    'chirp'
  end
end
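
For completeness, here's what the guard buys you when an including class forgets to provide the method (a small sketch with a hypothetical Robot class):

class Robot
  include Speaker
end

Robot.new.speak
# raises NoMethodError, pointing straight at the missing implementation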

Dynamically changing the definition of a module and class without changing the module code directly

The following works, but is only semi-great.
Basically, we save the original method in a dummy attribute, myfct_original.
Then we redefine the method so that MyClass.asdf is copied into the module, the original method is called, and the variable is removed from the module again, so it is only available in the module for a short time.

import sys

asdf = 5

class MyClass:
    def myfct(self, test=1):
        return asdf

# keep copies of the module-level variable and the original method on the class
MyClass.asdf = asdf
MyClass.myfct_original = MyClass.myfct

del sys.modules[__name__].asdf
delattr(MyClass, 'myfct')

def newmyfct(self, *args, **kwargs):
    # temporarily restore the module-level variable, call the original, then clean up
    setattr(sys.modules[__name__], 'asdf', MyClass.asdf)
    result = MyClass.myfct_original(self, *args, **kwargs)
    del sys.modules[__name__].asdf
    return result

setattr(MyClass, 'myfct', newmyfct)

myobj = MyClass()
myobj.myfct()  # returns 5

Maybe someone can do better (see end of the question).

Is it possible to do dynamic method overriding from a method of another class? If so, How?

The key is to pass the right arguments to the method.
Let's have a closer look at the error you're getting in the first place:

TypeError: unbound method do_something() must be called with
OtherPerson instance as first argument (got nothing instead)

When you look at OtherPerson.do_something, it's clear that it expects an instance as its first parameter.
So now that p.do_something refers to OtherPerson.do_something, it needs that first argument.
Therefore, a correct call, in the current state, would be:

p.do_something(p)

Of course, this is not really nice, since you have to specify the instance twice.
That's because the method is now unbound: it does not know the instance on which it is called, i.e. it does not know self.

The solution I'm proposing consists in making p.do_something refer to a function that calls OtherPerson.do_something with p as its first argument.


Let's have two classes, Foo and Bar, defined as follows:

class Foo:
    def __init__(self, x):
        self.x = x

    def speak(self):
        print("Foo says:", self.x)

class Bar:
    def __init__(self, x):
        self.x = x

    def speak(self):
        print("Bar says:", self.x)

Suppose you have a foo instance of the Foo class.
Now you want to dynamically override its speak method so that it calls Bar's instead.
You can simply reassign foo.speak to a function that calls Bar.speak.

>>> foo = Foo(2)
>>> foo.speak()
Foo says: 2
>>> foo.speak = lambda: Bar.speak(foo)
>>> foo.speak()
Bar says: 2

You can make it even more generic.
For the sake of example, let's write a function that takes an instance, a method name, and a target class, and overrides the instance's matching method with the target class's:

def override(instance, method_name, target_class):
    class_method = getattr(target_class, method_name)

    def new_method(*args, **kwargs):
        return class_method(instance, *args, **kwargs)

    setattr(instance, method_name, new_method)

You can observe the same expected behaviour:

>>> foo = Foo(2)
>>> override(foo, "speak", Bar)
>>> foo.speak()
Bar says: 2

Dynamically attaching a method to an existing Python object generated with swig?

If you create a wrapper class, this will work with any other class, either built-in or not. This is called "containment and delegation", and it is a common alternative to inheritance:

class SuperDuperWrapper(object):
    def __init__(self, origobj):
        self.myobj = origobj

    def __str__(self):
        return "SUPER DUPER " + str(self.myobj)

    def __getattr__(self, attr):
        return getattr(self.myobj, attr)

The __getattr__ method will delegate all undefined attribute requests on your SuperDuperWrapper object to the contained myobj object. In fact, given Python's dynamic typing, you could use this class to SuperDuper'ly wrap just about anything:

s = "hey ho!"
sds = SuperDuperWrapper(s)
print(sds)

i = 100
sdi = SuperDuperWrapper(i)
print(sdi)

Prints:

SUPER DUPER hey ho!
SUPER DUPER 100

In your case, you would take the returned object from the function you cannot modify, and wrap it in your own SuperDuperWrapper, but you could still otherwise access it just as if it were the base object.

print(sds.split())
['hey', 'ho!']

Passing a block to a dynamically created method

If I understood you correctly, you can solve this by saving the passed block to an instance variable on the class object and then instance_eval-ing it in the instance methods.

bl.call won't do here, because it would execute the block in its original context (that of the class), and you need to execute it in the scope of the current instance.

module MyMod
  def is_my_modiable(&block)
    class_eval do
      @stored_block = block # back up block
      def new_method
        bl = self.class.instance_variable_get(:@stored_block) # get from class and execute
        instance_eval(&bl) if bl
        self.mod = true
        self.save!
      end
    end
  end
end

class MyClass
  extend MyMod

  is_my_modiable do
    puts "in my modiable block"
    self.something_special
  end

  def something_special
    puts "in something special"
  end

  attr_accessor :mod
  def save!; end
end

MyClass.new.new_method
# >> in my modiable block
# >> in something special

