Python Paramiko - Run command
Python has many ways to perform string formatting. One of the simplest is to concatenate the parts of your string together:
#!/usr/bin/env python
import paramiko

hostname = "192.168.3.4"
port = 22
username = "username"
password = "mypassword"

y = "2012"
m = "02"
d = "27"

def do_it():
    s = paramiko.SSHClient()
    s.load_system_host_keys()
    s.connect(hostname, port, username, password)
    command = "ls /home/user/images/cappi/03000/" + y + "/" + m + "/" + d
    stdin, stdout, stderr = s.exec_command(command)
    for line in stdout.readlines():
        print(line)
    s.close()

if __name__ == "__main__":
    do_it()
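The concatenation above works, but an f-string (Python 3.6+) builds the same command more readably. A minimal sketch, using a `date` object in place of the separate y/m/d strings above:

```python
from datetime import date

# Hypothetical date, mirroring the y = "2012", m = "02", d = "27" strings
when = date(2012, 2, 27)

# %Y/%m/%d zero-pads the month and day, matching "02" and "27"
command = f"ls /home/user/images/cappi/03000/{when:%Y}/{when:%m}/{when:%d}"
print(command)
```

The resulting string is identical to the concatenated version, so it can be passed to exec_command unchanged.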
Running remote command in background using paramiko
If you execute a command in the background, the session ends immediately; it does not wait for the command to complete.
Try this command:
ssh -t user_name@host_name "ls && sleep 15 && ls &"
It is the equivalent of your Python code, and you will see that it produces no output either. It does not even wait the 15 seconds; it closes immediately, the same as your Python code.
$ ssh -t user_name@host_name "ls && sleep 15 && ls &"
Connection to host_name closed.
$
Anyway, I do not see the point of executing a command in the background if you then wait for the command to finish.
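If the goal really is for the remote command to outlive the SSH session, it has to be detached with nohup and have its streams redirected, so the remote shell can release the channel. A sketch of the command string (long_task.sh is a hypothetical script; a connected SSHClient would then pass it to exec_command):

```python
# nohup keeps the process alive after the SSH session ends; the
# redirections are needed so the remote shell can close the channel
# without waiting for the process's output streams.
command = "nohup /home/user/long_task.sh > /dev/null 2>&1 &"
print(command)
# client.exec_command(command)  # 'client' would be a connected SSHClient
```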
Run multiple commands in different SSH servers in parallel using Python Paramiko
Indeed, the problem is that you close the SSH connection. As the remote process is not detached from the terminal, closing the terminal terminates the process. On Linux servers, you can use nohup. I do not know whether there is a Windows equivalent.
Anyway, it seems that you do not need to close the connection. I understood that you are OK with waiting for all the commands to complete.
stdouts = []
clients = []

# Start the commands
commands = zip(ip_list[1:], user_list[1:], password_list[1:])
for i, (ip, user, password) in enumerate(commands, 1):
    print("Open session in: " + ip + "...")
    client = paramiko.SSHClient()
    client.connect(ip, username=user, password=password)
    command = \
        f"cd {path} && " + \
        f"python {python_script} {cluster} -type worker -index {i} -batch 64 " + \
        f"> {path}/logs/'command output'/{ip_list[i]}.log 2>&1"
    stdin, stdout, stderr = client.exec_command(command)
    clients.append(client)
    stdouts.append(stdout)

# Wait for commands to complete
for i in range(len(stdouts)):
    stdouts[i].read()
    clients[i].close()
Note that the simple solution above with stdout.read() works only because you redirect the command's output to a remote file. If you did not, the commands might deadlock.
Without that (or if you want to see the command output locally), you will need code like this:
while any(x is not None for x in stdouts):
    for i in range(len(stdouts)):
        stdout = stdouts[i]
        if stdout is not None:
            channel = stdout.channel
            # To prevent losing output at the end, first test for exit,
            # then for output
            exited = channel.exit_status_ready()
            while channel.recv_ready():
                s = channel.recv(1024).decode('utf8')
                print(f"#{i} stdout: {s}")
            while channel.recv_stderr_ready():
                s = channel.recv_stderr(1024).decode('utf8')
                print(f"#{i} stderr: {s}")
            if exited:
                print(f"#{i} done")
                clients[i].close()
                stdouts[i] = None
    time.sleep(0.1)
If you do not need to separate the stdout and stderr, you can greatly simplify the code by using Channel.set_combine_stderr. See Paramiko ssh die/hang with big output.
Regarding your question about SSHClient.close: if you do not call it, the connection will be closed implicitly when the script finishes and the Python garbage collector cleans up the pending objects. That is bad practice. And even if Python does not do it, the local OS will terminate all connections of the local Python process. That is bad practice too. In any case, it will terminate the remote processes along with it.
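A deterministic way to avoid relying on the garbage collector is contextlib.closing, which guarantees close() runs even if the body raises. A sketch with a hypothetical stand-in class (the same with statement works with a real paramiko.SSHClient, which also supports the context-manager protocol directly in recent versions):

```python
from contextlib import closing

class Session:
    """Hypothetical stand-in for paramiko.SSHClient, to show the pattern."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

# closing() calls s.close() on block exit, exception or not
with closing(Session()) as s:
    pass  # connect / exec_command work would go here

print(s.closed)
```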
How to create a python script to ssh and run commands on multiple linux devices
Turns out I've found the answer (the perfect script for what I want).
Here it is:
import paramiko

p = paramiko.SSHClient()
p.set_missing_host_key_policy(paramiko.AutoAddPolicy())

cred = open("hostnames.csv", "r")
for i in cred.readlines():
    line = i.strip()
    ls = line.split(",")
    print(ls[0])
    p.connect(ls[0], port=22, username=ls[1], password=ls[2])
    stdin, stdout, stderr = p.exec_command('sudo pwd', get_pty=True)
    stdin.write("%s\n" % ls[2])
    stdin.flush()
    opt = stdout.readlines()
    opt = "".join(opt)
    print(opt)
    p.close()
cred.close()
So, it basically reads the CSV file, which is formatted as hostname,username,password, and runs the script sequentially. It can also run sudo commands this way :)
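Splitting each line by hand with split(",") breaks if a password ever contains a comma; the standard csv module handles quoting for you. A sketch with hypothetical sample data standing in for hostnames.csv:

```python
import csv
import io

# Hypothetical file contents in the hostname,username,password format
sample = "host1.example.com,alice,secret1\nhost2.example.com,bob,secret2\n"

for hostname, username, password in csv.reader(io.StringIO(sample)):
    # connect(hostname, username=username, password=password) would go here
    print(hostname, username)
```

With a real file, replace io.StringIO(sample) with open("hostnames.csv", newline="").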
Paramiko: Execute a script on a remote server and actually see it running on remote desktop
Ned,
I would try referring to this post here:
https://serverfault.com/questions/690852/use-powershell-to-start-a-gui-program-on-a-remote-machine
Executing the command remotely using PsExec with the -i arg seems to have worked for them. Documentation and info on PsExec can be found at https://learn.microsoft.com/en-us/sysinternals/downloads/psexec .