Change parent shell's environment from a subprocess
This can only be done with the parent shell's involvement and assistance. For a real-world example of a program that does this, look at how ssh-agent is supposed to be used:
eval "$(ssh-agent -s)"
...reads the output from ssh-agent and runs it in the current shell (-s specifies Bourne-compatible output, as opposed to csh syntax).
If you're using Python, be sure to use shlex.quote() (or, on legacy Python 2.x, pipes.quote()) to process your output safely:
import shlex
dirname = '/path/to/directory with spaces'
foo_val = 'value with * wildcards * that need escaping and \t\t tabs!'
print('cd %s; export FOO=%s;' % (shlex.quote(dirname), shlex.quote(foo_val)))
...as careless use can otherwise lead to shell injection attacks.
By contrast, if you're writing this as an external script in bash, be sure to use printf %q
for safe escaping (though note that its output is targeted for other bash shells, not for POSIX sh compliance):
#!/bin/bash
dirname='/path/to/directory with spaces'
foo_val='value with * wildcards * that need escaping and \t\t tabs!'
printf 'cd %q; export FOO=%q;' "$dirname" "$foo_val"
If, as it appears from your question, you want your command to appear to be written as a native shell function, I would suggest wrapping it in one (this practice can also be used with command_not_found_handle
). For instance, installation can involve putting something like the following in one's .bashrc
:
my_command() {
eval "$(command /path/to/my_command.py "$@")"
}
...that way users aren't required to type eval
.
Is it possible to change the Environment of a parent process in Python?
No process can change its parent process's environment (or that of any other existing process).
You can, however, create a new environment by creating a new interactive shell with the modified environment.
You have to spawn a new copy of the shell that uses the upgraded environment and has access to the existing stdin, stdout and stderr, and does its reinitialization dance.
You need to do something like using subprocess.Popen to run /bin/bash -i.
So the original shell runs Python, which runs a new shell. Yes, you have a lot of processes running. No it's not too bad because the original shell and Python aren't really doing anything except waiting for the subshell to finish so they can exit cleanly, also.
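A minimal sketch of that approach (the variable name here is illustrative; in real use you would pass ['/bin/bash', '-i'] to get an interactive subshell):

```python
import os
import subprocess

# Copy the current environment and add the new variable.
new_env = os.environ.copy()
new_env["MY_NEW_VAR"] = "hello"  # illustrative name

# In real use you would spawn an interactive shell that inherits it:
#     subprocess.call(["/bin/bash", "-i"], env=new_env)
# Here a non-interactive command makes the effect visible:
result = subprocess.run(
    ["/bin/bash", "-c", "echo $MY_NEW_VAR"],
    env=new_env, capture_output=True, text=True,
)
print(result.stdout.strip())
```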
How to set parent process' shell env from child process
Environment variables are copied from parent to child; they are not shared, and they are not copied back in the other direction. All export
does is make an environment variable in the child, so its children will see it.
The simplest way is to echo
in the child process (I'm assuming it is a shell script) and capture it in Python using a pipe.
Python:
import subprocess
proc = subprocess.Popen(['bash', 'gash.sh'], stdout=subprocess.PIPE)
output = proc.communicate()[0].decode()
print("output:", output)
Bash (gash.sh):
TEMP_VAR='yellow world'
echo -n "$TEMP_VAR"
Output:
output: yellow world
Set a parent shell's variable from a subshell
The whole point of a subshell is that it doesn't affect the calling session. In bash a subshell is a child process; other shells differ, but even then a variable set in a subshell does not affect the caller. By definition.
Do you need a subshell? If you just need a group then use braces:
a=3
{ a=4;}
echo $a
gives 4
(be careful of the spaces in that one). Alternatively, write the variable value to stdout and capture it in the caller:
a=3
a=$(a=4;echo $a)
echo $a
Avoid using backticks (`...`); they are considered legacy in favour of $( ... ) and can be difficult to read.
Python subprocess/Popen with a modified environment
I think os.environ.copy()
is better if you don't intend to modify the os.environ for the current process:
import subprocess, os
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
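A quick check that the child actually sees the modified value (a sketch; `echo $PATH` stands in for my_command here):

```python
import os
import subprocess

# Copy the environment and prepend to PATH, as in the answer above.
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env.get("PATH", "")

# `echo $PATH` stands in for my_command, just to show the child
# process receives the modified value.
child_path = subprocess.run(
    ["/bin/sh", "-c", "echo $PATH"],
    env=my_env, capture_output=True, text=True,
).stdout.strip()
print(child_path.startswith("/usr/sbin:/sbin:"))
```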
Run Python script that affect parent shell (changes environment variables, runs other scripts, etc.)
how can you affect parent shell that execute python script from inside the script (add some environments, run other script, etc.)?
It is not possible to do that, unless your operating system is broken. Process isolation is one of the very basic concepts of an operating system.
Instead, research what venv
does and how it works, and what the activate
script does, and just add the proper directory to the Python module search path.
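For the module-search-path part, that boils down to something like this (the directory name is hypothetical):

```python
import sys

# Hypothetical directory containing the modules you want importable.
extra_dir = "/opt/myproject/modules"

# Adding it to sys.path affects only this Python process -- which is
# exactly the scope a process is allowed to modify.
if extra_dir not in sys.path:
    sys.path.insert(0, extra_dir)

print(extra_dir in sys.path)
```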
Python: Exporting environment variables in subprocess.Popen(..)
The substitution of environment variables on the command line is done by the shell, not by /bin/echo. So you need to run the command in a shell to get the substitution:
In [22]: subprocess.Popen('/bin/echo $TEST_VARIABLE', shell=True, env=d).wait()
1234
Out[22]: 0
That doesn't mean the environment variable is not set when shell=False
, however. Even without shell=True
, the executable does see the environment variables set by the env
parameter. For example, date
is affected by the TZ
environment variable:
In [23]: subprocess.Popen(["date"], env={'TZ': 'America/New_York'}).wait()
Wed Oct 29 22:05:52 EDT 2014
Out[23]: 0
In [24]: subprocess.Popen(["date"], env={'TZ': 'Asia/Taipei'}).wait()
Thu Oct 30 10:06:05 CST 2014
Out[24]: 0
how to make subprocess called with call/Popen inherit environment variables
Regarding
If I were doing this directly at the command line, I'd "source" a script called mySetUpFreeSurfer.sh that does nothing but set three environment variables, and then "source" another script, FreeSurferEnv.sh.
I think you would be better off using Python to automate the process of writing a shell script, newscript.sh
, and then calling that script with a single subprocess.check_output
call (instead of many calls to Popen
, check_output
, call
, etc.):
newscript.sh:
#!/bin/bash
source ~/scripts/mySetUpFreeSurfer.sh
source /usr/local/freesurfer/FreeSurferEnv.sh
recon-all -i /media/foo/bar -subjid s1001
...
and then calling
subprocess.check_output(['newscript.sh'])
import subprocess
import tempfile
import os
import stat

with tempfile.NamedTemporaryFile(mode='w', delete=False) as f:
    f.write('''\
#!/bin/bash
source ~/scripts/mySetUpFreeSurfer.sh
source /usr/local/freesurfer/FreeSurferEnv.sh
''')
    root = "/media/foo/"
    for sub_dir in os.listdir(root):
        sub = "s" + sub_dir[0:4]
        anat_dir = os.path.join(root, sub_dir, "anatomical")
        for directory in os.listdir(anat_dir):
            time_dir = os.path.join(anat_dir, directory)
            for d in os.listdir(time_dir):
                dicoms_dir = os.path.join(time_dir, d, 'dicoms')
                dicom_list = os.listdir(dicoms_dir)
                dicom = dicom_list[0]
                path = os.path.join(dicoms_dir, dicom)
                cmd1 = "recon-all -i {} -subjid {}\n".format(path, sub)
                f.write(cmd1)
        cmd2 = "recon-all -all -subjid {}\n".format(sub)
        f.write(cmd2)
    filename = f.name

os.chmod(filename, stat.S_IRUSR | stat.S_IXUSR)
subprocess.call([filename])
os.unlink(filename)
By the way,
def source(script, update=1):
    pipe = Popen(". %s; env" % script, stdout=PIPE, shell=True)
    data = pipe.communicate()[0]
    env = dict((line.split("=", 1) for line in data.splitlines()))
    if update:
        os.environ.update(env)
    return env
is broken. For example, if script
contains something like
VAR=`ls -1`
export VAR
then
. script; env
may return output like
VAR=file1
file2
file3
which will result in source(script)
raising a ValueError
:
env = dict((line.split("=", 1) for line in data.splitlines()))
ValueError: dictionary update sequence element #21 has length 1; 2 is required
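The failure is easy to reproduce without any shell at all: a value containing newlines masquerades as extra "lines" with no = in them:

```python
# Simulate `. script; env` output where VAR's value spans three lines.
data = "VAR=file1\nfile2\nfile3\nOTHER=x"

try:
    env = dict(line.split("=", 1) for line in data.splitlines())
except ValueError as exc:
    # "file2" splits into a single element, which dict() rejects.
    print("ValueError:", exc)
```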
There is a way to fix source
: have env
separate environment variables with a zero byte instead of the ambiguous newline:
def source(script, update=True):
    """
    http://pythonwise.blogspot.fr/2010/04/sourcing-shell-script.html (Miki Tebeka)
    http://stackoverflow.com/questions/3503719/#comment28061110_3505826 (ahal)
    """
    import subprocess
    import os
    proc = subprocess.Popen(
        ['bash', '-c', 'set -a && source {} && env -0'.format(script)],
        stdout=subprocess.PIPE, shell=False)
    output, err = proc.communicate()
    output = output.decode('utf8')
    env = dict((line.split("=", 1) for line in output.split('\x00') if line))
    if update:
        os.environ.update(env)
    return env
Fixable or not, however, you are still probably better off constructing a
conglomerate shell script (as shown above) than you would be parsing env
and
passing env
dicts to subprocess
calls.
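As a usage sketch of the env -0 approach (writing a throwaway script so the example is self-contained; in real use this would be an existing file such as FreeSurferEnv.sh):

```python
import os
import subprocess
import tempfile

# Write a throwaway script to source, just for the demonstration.
with tempfile.NamedTemporaryFile('w', suffix='.sh', delete=False) as f:
    f.write('export GREETING="hello world"\n')
    script = f.name

# `env -0` separates entries with NUL bytes, so values containing
# newlines can no longer break the parse.
out = subprocess.run(
    ['bash', '-c', 'set -a && source "{}" && env -0'.format(script)],
    stdout=subprocess.PIPE,
).stdout.decode()
env = dict(line.split('=', 1) for line in out.split('\x00') if line)
os.unlink(script)
print(env['GREETING'])
```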