How to Check BLAS/LAPACK Linkage in NumPy and SciPy

How to check BLAS/LAPACK linkage in NumPy and SciPy?

The function numpy.show_config() (or numpy.__config__.show()) outputs information about linkage gathered at build time. My output looks like this; I think it means I am using the BLAS/LAPACK that ships with Mac OS.

>>> import numpy as np
>>> np.show_config()

lapack_opt_info:
extra_link_args = ['-Wl,-framework', '-Wl,Accelerate']
extra_compile_args = ['-msse3']
define_macros = [('NO_ATLAS_INFO', 3)]
blas_opt_info:
extra_link_args = ['-Wl,-framework', '-Wl,Accelerate']
extra_compile_args = ['-msse3', '-I/System/Library/Frameworks/vecLib.framework/Headers']
define_macros = [('NO_ATLAS_INFO', 3)]

Find out if/which BLAS library is used by NumPy

numpy.show_config() doesn't always give reliable information. For example, if I apt-get install python-numpy on Ubuntu 14.04, the output of np.show_config() looks like this:

blas_info:
libraries = ['blas']
library_dirs = ['/usr/lib']
language = f77
lapack_info:
libraries = ['lapack']
library_dirs = ['/usr/lib']
language = f77
atlas_threads_info:
NOT AVAILABLE
blas_opt_info:
libraries = ['blas']
library_dirs = ['/usr/lib']
language = f77
define_macros = [('NO_ATLAS_INFO', 1)]
atlas_blas_threads_info:
NOT AVAILABLE
openblas_info:
NOT AVAILABLE
lapack_opt_info:
libraries = ['lapack', 'blas']
library_dirs = ['/usr/lib']
language = f77
define_macros = [('NO_ATLAS_INFO', 1)]
...

It looks as though numpy is using the standard CBLAS library. However, I know for a fact that numpy is using OpenBLAS, which I installed via the libopenblas-dev package.
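A more direct runtime check is possible with the third-party threadpoolctl package (an assumption on my part; it is not mentioned above and must be installed separately with pip install threadpoolctl). It inspects the BLAS libraries actually loaded into the running process, rather than what was recorded at build time:

```python
# Sketch: inspect the BLAS actually loaded at runtime with threadpoolctl.
# threadpoolctl is a third-party package and is an assumption here --
# it is not part of numpy itself.
import numpy as np  # importing numpy loads its BLAS into the process

try:
    from threadpoolctl import threadpool_info
    for mod in threadpool_info():
        # 'internal_api' is e.g. 'openblas' or 'mkl';
        # 'filepath' is the shared library actually loaded
        print(mod["internal_api"], "->", mod["filepath"])
except ImportError:
    print("threadpoolctl is not installed")
```

On the Ubuntu setup above, this should report the OpenBLAS shared object rather than the reference BLAS that show_config() suggests.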


The most definitive way to check on *nix is to use ldd to find out which shared libraries numpy links against at runtime (I don't own a Mac, but I think you can use otool -L in place of ldd).

  • For versions of numpy older than v1.10:

    ~$ ldd /<path_to_site-packages>/numpy/core/_dotblas.so

    If _dotblas.so doesn't exist, this probably means that numpy failed to detect any BLAS libraries when it was originally compiled, in which case it simply doesn't build any of the BLAS-dependent components.

  • For numpy v1.10 and newer:

    _dotblas.so has been removed, but you can check the dependencies of multiarray.so instead:

    ~$ ldd /<path_to_site-packages>/numpy/core/multiarray.so

Looking at the version of numpy I installed via apt-get:

~$ ldd /usr/lib/python2.7/dist-packages/numpy/core/_dotblas.so 
linux-vdso.so.1 => (0x00007fff12db8000)
libblas.so.3 => /usr/lib/libblas.so.3 (0x00007fce7b028000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fce7ac60000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fce7a958000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fce7a738000)
/lib64/ld-linux-x86-64.so.2 (0x00007fce7ca40000)

/usr/lib/libblas.so.3 is actually the start of a chain of symlinks. If I follow them to their ultimate target using readlink -e, I see that they point to my OpenBLAS shared library:

~$ readlink -e /usr/lib/libblas.so.3
/usr/lib/openblas-base/libblas.so.3
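The manual steps above (find the compiled extension module, run ldd on it, follow the symlink chain) can be sketched in Python. The glob patterns are assumptions on my part, since the extension filename carries an ABI tag on Python 3 and was renamed in newer numpy releases:

```python
import glob
import os
import numpy as np

# Locate the compiled core extension(s) to feed to ldd. The patterns
# cover both old (multiarray.so) and newer (_multiarray_umath.*.so)
# layouts -- an assumption, since the filename varies across versions.
pkg_dir = os.path.dirname(np.__file__)
for subdir in ("core", "_core"):
    for path in glob.glob(os.path.join(pkg_dir, subdir, "*multiarray*")):
        print(path)

# os.path.realpath does what `readlink -e` does: it resolves the
# whole chain of symlinks to the ultimate target.
print(os.path.realpath("/usr/lib/libblas.so.3"))
```

You can then run ldd (or otool -L on a Mac) on the printed extension path by hand.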

Unsure whether my version of Python/numpy is using optimized BLAS/LAPACK libraries?

Kind of. OpenBLAS is quite alright. I just took the first link I could find on Google when searching for "OpenBLAS, ATLAS, MKL comparison":

http://markus-beuckelmann.de/blog/boosting-numpy-blas.html

Now, this is not the whole story. The differences may be negligible, slight, or large depending on the algorithms you need. There is really no substitute for running your own code linked against the different implementations.

My favourites, on average across all sorts of linear-algebra problems (SVDs, eigendecompositions, real and pseudo-inverses, factorisations, ...), single-core and multi-core, on the different OSes:

MacOS: Accelerate framework (comes along with the OS)
Linux/Windows:

  1. MKL
  2. at a considerable distance, but still quite alright: ATLAS and OpenBLAS, roughly on par
  3. ACML has always been a disappointment to me, even on AMD processors

TLDR: Your setup is fine. But if you want to squeeze the last drop of performance out of your CPU/RAM/mainboard combination, you need MKL. It does come with quite a price tag, but if you can get hardware half as expensive in return, it may be worth it. And if you write an open-source package, you may use MKL free of charge for development purposes.
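Since there is no universal ranking, the most honest comparison is to time your own workload under each BLAS build. As a rough sketch (one problem size, wall-clock timing, no warm-up or repetition handling, so not a rigorous benchmark):

```python
import time
import numpy as np

# Rough BLAS sanity check: time a dgemm via numpy.dot.
# Run the same script against each BLAS build you want to compare.
n = 1000
rng = np.random.RandomState(0)
a = rng.rand(n, n)
b = rng.rand(n, n)

start = time.time()
c = a.dot(b)
elapsed = time.time() - start
print("%dx%d matmul: %.3f s" % (n, n, elapsed))
```

On an unoptimized reference BLAS this typically runs many times slower than under OpenBLAS or MKL, which makes it a quick way to confirm the linkage is doing what you think.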

Can't get Python3 numpy to see BLAS/LAPACK

If anyone's interested in the answer, I managed to finally get OpenBLAS recognized in numpy, and received a decent speed boost.

To do it you must first uninstall python3-numpy and any numpy installed via pip3. Then manually compile OpenBLAS and numpy as explained in Compiling numpy with OpenBLAS integration.

Installing the default packages via apt-get or pip apparently doesn't link in any BLAS library by default, at least not on the TinkerBoard Linaro OS...

Calling BLAS / LAPACK directly using the SciPy interface and Cython

According to netlib, dger(M, N, ALPHA, X, INCX, Y, INCY, A, LDA) performs the rank-1 update A := alpha*x*y**T + A. So A should be all zeros to get just the outer product of X and Y.
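This is easy to verify through SciPy's BLAS wrappers: scipy.linalg.blas.dger allocates a zero matrix when no A is supplied, so the result is exactly the outer product. A minimal sketch:

```python
import numpy as np
from scipy.linalg.blas import dger

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0])

# With no A supplied, the wrapper starts from A = 0, so dger
# computes alpha * x * y**T -- here, the plain outer product.
a = dger(1.0, x, y)
assert np.allclose(a, np.outer(x, y))
```

If you pass in a nonzero A (via the a= keyword), the result accumulates on top of it, exactly as the netlib formula says.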

Functions from Scipy, Blas, or Lapack that compute only upper triangular matrix

If you click "source" on the scipy.linalg.lu_factor page, https://docs.scipy.org/doc/scipy-0.14.0/reference/generated/scipy.linalg.lu_factor.html, it brings you to https://github.com/scipy/scipy/blob/v0.14.0/scipy/linalg/decomp_lu.py#L17

which shows that the LAPACK function you're after is *getrf.
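As a sketch of calling it through scipy.linalg.lapack (dgetrf is the double-precision variant), the L and U factors come back packed into a single array, and the upper-triangular factor can be peeled off with np.triu:

```python
import numpy as np
from scipy.linalg.lapack import dgetrf

a = np.array([[4.0, 3.0],
              [6.0, 3.0]])

# dgetrf returns the combined LU factors in one matrix, plus pivot
# indices and an info flag (0 means success).
lu, piv, info = dgetrf(a)
assert info == 0

u = np.triu(lu)                    # upper-triangular factor U
l = np.tril(lu, k=-1) + np.eye(2)  # unit lower-triangular factor L
```

scipy.linalg.lu_factor is a thin convenience wrapper around exactly this call, so the two agree on the packed factor matrix.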


