Import All the Functions of a Package Except One When Building a Package


The NAMESPACE file is somewhat flexible here, as described in Writing R Extensions.

The two main import directives are:

import(PACKAGE)

which imports all objects in the namespace into your package. The second option is to do specific imports using:

importFrom(PACKAGE, foo)

which gives you access to foo() without needing the fully qualified reference PACKAGE::foo().

But these aren't the only two options. You can also use the except argument to exclude just a handful of imports:

import(PACKAGE, except=c(foo,bar))

which gives you everything from PACKAGE's namespace but foo() and bar(). This is useful - as in your case - for avoiding conflicts.

For roxygen, great catch on figuring out that you can do:

#' @rawNamespace import(PACKAGE, except = foo)

to pass a raw NAMESPACE directive through roxygen.

How do I import all functions from a package in Python?

Pretty easy

__init__.py:

from simupy.blk import *
from simupy.info import *

Btw, just my two cents but it looks like you want to import your package's functions in __init__.py but perform actions in __main__.py.

Like

__init__.py:

from simupy.blk import *
from simupy.info import *

__main__.py:

from simupy import *

# your code
dir_path = ....

It's the most Pythonic way to do it. After that, you will be able to:

  • Run your script as a proper Python module: python -m simupy
  • Use your module as library: import simupy; print(simupy.bar())
  • Import only a specific package / function: from simupy.info import bar.

For me, it's part of the beauty of Python.

R import all but a couple of functions

Currently my best idea is

all <- getNamespaceExports("grid")
paste("@importFrom grid", paste(all[!(all %in% c("arrow", "unit"))], collapse = " "))
#[1] "@importFrom grid grid.edit pop.viewport ...

That's obviously not a good solution, but unlike for exports you can't use a regex for imports, i.e., there is no importPatternFrom.

Importing all functions from a package: from .* import *

  1. importlib allows you to import any Python module from a string name, and you can automate this by iterating over the files in the package path (see the sketch after this list).

  2. It's more Pythonic to use __all__ to control what a star import exposes.
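
As a rough illustration of the first point, here is a minimal sketch of an __init__.py that imports every sibling module by its string name. The use of pathlib and the underscored helper names are my own assumptions, not part of the original answer:

# Hypothetical __init__.py: dynamically import every module in this package
import importlib
import pathlib

_pkg_dir = pathlib.Path(__file__).parent

for _py_file in _pkg_dir.glob("*.py"):
    if _py_file.stem != "__init__":
        # import_module accepts a module name as a string;
        # the leading dot makes it relative to this package
        importlib.import_module(f".{_py_file.stem}", package=__name__)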

Is there a way to import all functions (using *) from a file without the imports of that file?

You can define the __all__ variable.

In your Helper.py file, add a line

__all__ = ["foo", "bar"]  # All the objects you want to export

Then in SomeClass.py just use from Helper import *; this will only import what is specified in __all__.
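
For a concrete sketch (the function bodies and the os import are hypothetical, not from the original question), the two files could look like this:

Helper.py:

import os                       # an import you do not want to re-export

__all__ = ["foo", "bar"]        # only these names are exported by a star import

def foo():
    return "foo"

def bar():
    return os.linesep.join(["b", "a", "r"])

SomeClass.py:

from Helper import *            # pulls in foo and bar, but not os

print(foo(), bar())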

import everything from a module except a few methods

In case you don't have access to the module, you can also simply remove these methods or variables from the global namespace after the import. Here's how this could be done:

to_exclude = ['foo']

from somemodule import *

for name in to_exclude:
    del globals()[name]

Excluding modules when importing everything in __init__.py

On the one hand, there are many good reasons not to do star imports, but on the other hand, Python is for consenting adults.

__all__ is the recommended approach to determining what shows up in a star import. Your approach is correct, and you can further sanitize the namespace when finished:

import types
__all__ = [name for name, thing in globals().items()
           if not (name.startswith('_') or isinstance(thing, types.ModuleType))]
del types

While less recommended, you can also sanitize elements directly out of the module, so that they don't show up at all. This will be a problem if you need to use them in a function defined in the module, since every function object has a __globals__ reference that is bound to its parent module's __dict__. But if you only import math_helpers to call math_helpers.foo(), and don't require a persistent reference to it elsewhere in the module, you can simply unlink it at the end:

del math_helpers
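
As a small runnable sketch of that idea (using the stdlib statistics module as a stand-in for math_helpers; the SCALE constant is my own invention for illustration):

# mymodule.py (sketch)
import statistics

SCALE = statistics.mean([1, 2, 3])   # the helper is only used at import time

def scaled(x):
    return x * SCALE                 # no reference to the helper inside the function

del statistics                       # safe: nothing looks the name up after import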

Long Version

A module import runs the code of the module in the namespace of the module's __dict__. Any names that are bound at the top level, whether by class definition, function definition, direct assignment, or other means, live in that dictionary. Sometimes, it is desirable to clean up intermediate variables, as I suggested doing with types.

Let's say your module looks like this:

test_module.py

import math
import numpy as np

def x(n):
    return math.sqrt(n)

class A(np.ndarray):
    pass

import types
__all__ = [name for name, thing in globals().items()
           if not (name.startswith('_') or isinstance(thing, types.ModuleType))]

In this case, __all__ will be ['x', 'A']. However, the module itself will contain the following names: 'math', 'np', 'x', 'A', 'types', '__all__'.

If you run del types at the end, it will remove that name from the namespace. Clearly this is safe because types is not referenced anywhere once __all__ has been constructed.

Similarly, if you wanted to remove np by adding del np, that would be OK. The class A is fully constructed by the end of the module code, so it does not require the global name np to reference its parent class.

Not so with math. If you were to do del math at the end of the module code, the function x would not work. If you import your module, you can see that x.__globals__ is the module's __dict__:

import test_module

test_module.__dict__ is test_module.x.__globals__

If you delete math from the module dictionary and call test_module.x, you will get

NameError: name 'math' is not defined
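
Putting the two cases side by side in a standalone script (a sketch of my own, equivalent to running the module body directly rather than importing test_module):

import math
import numpy as np

def x(n):
    return math.sqrt(n)

class A(np.ndarray):
    pass

del np                    # safe: A.__bases__ already holds the ndarray class object
print(A.__bases__)        # (<class 'numpy.ndarray'>,)
print(A((3,)).shape)      # (3,)

del math                  # not safe: x() looks up the name 'math' at call time
try:
    x(4)
except NameError as err:
    print(err)            # name 'math' is not defined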

So under some very special circumstances you may be able to sanitize the namespace of mymath.py, but that is not the recommended approach, as it only applies to certain cases.

In conclusion, stick to using __all__.

A Story That's Sort of Relevant

One time, I had two modules that implemented similar functionality, but for different types of end users. There were a couple of functions that I wanted to copy out of module a into module b. The problem was that I wanted the functions to work as if they had been defined in module b. Unfortunately, they depended on a constant that was defined in a. b defined its own version of the constant. For example:

a.py

value = 1

def x():
    return value

b.py

from a import x

value = 2

I wanted b.x to access b.value instead of a.value. I pulled that off by adding the following to b.py (based on https://stackoverflow.com/a/13503277/2988730):

import functools, types

x = functools.update_wrapper(
    types.FunctionType(x.__code__, globals(), x.__name__,
                       x.__defaults__, x.__closure__),
    x)
x.__kwdefaults__ = x.__wrapped__.__kwdefaults__
x.__module__ = __name__
del functools, types

Why am I telling you all this? Well, you can make a version of your module that does not have any stray names in its namespace. The functions won't see later changes to the module's global variables, though. This is just an exercise in pushing Python beyond its normal usage. I strongly recommend against doing this, but here is a sample module that effectively freezes its __dict__ as far as the functions are concerned. It has the same members as test_module above, but with no modules in the global namespace:

import math
import numpy as np

def x(n):
    return math.sqrt(n)

class A(np.ndarray):
    pass

import functools, types, sys

def wrap(obj):
    """Written this way to be able to handle classes."""
    for name in dir(obj):
        if name.startswith('_'):
            continue
        thing = getattr(obj, name)
        if isinstance(thing, types.FunctionType) and thing.__module__ == __name__:
            # rebind the function to d, the frozen copy of the module globals
            setattr(obj, name,
                    functools.update_wrapper(
                        types.FunctionType(thing.__code__, d, thing.__name__,
                                           thing.__defaults__, thing.__closure__),
                        thing))
            getattr(obj, name).__kwdefaults__ = thing.__kwdefaults__
        elif isinstance(thing, type) and thing.__module__ == __name__:
            wrap(thing)

d = globals().copy()
wrap(sys.modules[__name__])
del d, wrap, sys, math, np, functools, types

So yeah, please don't ever do this! But if you do, stick it in a utility class somewhere.


