Interfacing Python and Torch7(Lua) via Shared Library

interfacing Python and Torch7(Lua) via shared library

On Linux Lua modules don't link to the Lua library directly but instead expect to find the Lua API functions already loaded. This is usually done by exporting them from the interpreter using the -Wl,-E linker flag. This flag only works for symbols in executables, not shared libraries. For shared libraries there exists something similar: the RTLD_GLOBAL flag for the dlopen function. By default all shared libraries listed on the compiler command line are loaded using RTLD_LOCAL instead, but fortunately Linux reuses already opened library handles. So you can either:

Preload the Lua(JIT) library using RTLD_GLOBAL before it gets loaded automatically (which happens when you load libcluaf.so):

from ctypes import byref, cdll, c_int
import ctypes

lualib = ctypes.CDLL("libluajit-5.1.so", mode=ctypes.RTLD_GLOBAL)
l = cdll.LoadLibrary('absolute_path_to_so/libcluaf.so')
# ...

Or change the flags of the Lua(JIT) library handle after the fact, using the RTLD_NOLOAD flag for dlopen. This flag is not in POSIX, though, so you may need to drop down to C to do it.
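A sketch of this second approach, assuming a Linux/glibc system (RTLD_NOLOAD is a glibc extension, which CPython exposes as os.RTLD_NOLOAD, so C isn't strictly required). libm is used here as a stand-in for the already-loaded Lua(JIT) library:

```python
import ctypes
import ctypes.util
import os

# Stand-in for a library that was already loaded with the default
# RTLD_LOCAL (for Torch this would be libluajit-5.1.so).
name = ctypes.util.find_library("m") or "libm.so.6"
lib = ctypes.CDLL(name)  # default mode: RTLD_LOCAL

# Re-open the same library with RTLD_NOLOAD | RTLD_GLOBAL: since it is
# already mapped into the process, dlopen does not load it again but
# promotes the existing handle, making its symbols visible to any
# libraries loaded afterwards (such as libcluaf.so).
lib_global = ctypes.CDLL(name, mode=os.RTLD_NOLOAD | os.RTLD_GLOBAL)
```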

Best way for Lua script to call a C shared lib?

Plain Lua cannot call arbitrary C libraries out of the box. It does not ship with an FFI, so there is no built-in equivalent of Python's ctypes (LuaJIT, by contrast, does bundle an FFI library).

Historically, Lua is embedded into a host application, which registers the needed C functions in Lua's tables and uses the Lua stack API to pass parameters and return results.

Alien is a Lua binding built on libffi and will do what you want.
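For contrast, this is roughly what "working like ctypes" means on the Python side: declaring a C signature and calling into a shared library directly, here libm's sqrt. Plain Lua has no built-in way to do the same, which is the gap Alien fills.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm.so.6 on most Linux systems).
name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(name)

# Declare the C signature so ctypes converts values correctly:
#     double sqrt(double);
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(9.0))  # -> 3.0
```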

lunatic-python / lupa import issue on ubuntu

I was doing something similar. I could import lupa by preloading libluajit:

import ctypes
lualib = ctypes.CDLL("libluajit.so", mode=ctypes.RTLD_GLOBAL)
import lupa

I got the idea from the following discussion:

interfacing Python and Torch7(Lua) via shared library

By the way, there is a branch of Lupa that supports Torch:

Lupa for torch

How can you link against (non-standard) libraries from Terra?

Nevermind, I think I found what I need.

terralib.linklibrary(filename)

Load the dynamic library in file filename. If header files imported with includec contain declarations whose definitions are not linked into the executable in which Terra is run, then it is necessary to dynamically load the definitions with linklibrary. This situation arises when using external libraries with the terra REPL/driver application.

Source: Terra docs

Run Lua script from Python

You can use a subprocess to run your Lua script and call the function with its arguments.

import subprocess

result = subprocess.check_output(['lua', '-l', 'demo', '-e', 'test("a", "b")'])
print(result)

result = subprocess.check_output(['lua', '-l', 'demo', '-e', 'test2("a")'])
print(result)
  • -l requires the given library (your script) before running
  • -e executes the given code on startup (your function call)

The value of result will be whatever the Lua script writes to STDOUT, so just print your return value there and read it back in your Python script. The demo Lua script I used for the example simply prints the arguments:

function test (a, b)
print(a .. ', ' .. b)
end

function test2(a)
print(a)
end

In this example both files have to be in the same folder and the lua executable must be on your PATH.
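The same round trip can be sketched without a Lua interpreter installed, using sys.executable as a stand-in for the lua binary; the pattern of passing arguments in and parsing STDOUT back out is identical:

```python
import subprocess
import sys

# Stand-in child process: it prints its "return value" to STDOUT,
# exactly as the Lua demo script does with print(a .. ', ' .. b).
code = "import sys; print(sys.argv[1] + ', ' + sys.argv[2])"
result = subprocess.check_output(
    [sys.executable, "-c", code, "a", "b"],
    text=True,  # decode the captured bytes to str
)
print(result.strip())  # -> a, b
```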


Another solution, which spawns only one Lua VM, is to use pexpect and run the VM in interactive mode.

import pexpect

child = pexpect.spawn('lua -i -l demo')
child.readline()

child.sendline('test("a", "b")')
child.readline()
print(child.readline())

child.sendline('test2("c")')
child.readline()
print(child.readline())

child.close()

So you can use sendline(...) to send a command to the interpreter and readline() to read the output. The first child.readline() after each sendline() consumes the echo of the command itself; the readline() after that returns the line the function printed to STDOUT.

torch.CharStorage doesn't read data from a file when size isn't provided

You appear to be running into distro bug #245, introduced by commit 6a35cd9. As stated in torch7 bug #1064, you can work around it by either updating your pkg/torch submodule to commit 89ede3b or newer, or rolling it back to commit 2186e41 or older.


