Paramiko Ssh Die/Hang with Big Output


If the ls -R prints lots of error output (which is likely if the current user is not root and therefore does not have access to all folders), your code eventually deadlocks.

That's because the output buffer of the error stream eventually fills, so ls stops working, waiting for you to read the stream (to empty the buffer).

Meanwhile, you wait for the regular output stream to finish, which it never does, because ls is waiting for you to read the error stream, which you never do.

You have to read both streams in parallel (see Run multiple commands in different SSH servers in parallel using Python Paramiko).
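A minimal sketch of the parallel-read idea: a small helper keeps draining a stream until EOF, and you run it on a background thread for stderr while the main thread drains stdout. The drain helper and the commented usage (client, the ls -R / command) are illustrative assumptions, not code from the question:

```python
import threading

def drain(stream, chunks, size=4096):
    # Keep reading until the stream reaches EOF, so the remote side's
    # buffer can never fill up and block the command.
    while True:
        data = stream.read(size)
        if not data:
            break
        chunks.append(data)

# Hypothetical usage with a connected paramiko.SSHClient:
# stdin, stdout, stderr = client.exec_command("ls -R /")
# out_chunks, err_chunks = [], []
# t = threading.Thread(target=drain, args=(stderr, err_chunks))
# t.start()
# drain(stdout, out_chunks)  # drain stdout in the main thread
# t.join()
```

Because both streams are consumed concurrently, neither buffer can fill and stall the remote command.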

Or, even easier, use Channel.set_combine_stderr to merge both streams into one.
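As a sketch, combining the streams might look like this. run_combined and its parameters are illustrative names, and chan is assumed to be a paramiko.Channel obtained from Transport.open_session():

```python
def run_combined(chan, command, size=4096):
    # Merge stderr into the stdout stream *before* starting the command,
    # so a single read loop drains everything and neither buffer can fill.
    chan.set_combine_stderr(True)
    chan.exec_command(command)
    output = b""
    while True:
        data = chan.recv(size)
        if not data:  # an empty read means the channel reached EOF
            break
        output += data
    return output
```

With the streams merged, there is only one buffer to empty, so the deadlock described above cannot occur.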

Paramiko channel gets stuck when reading large output

I see no problem related to the stdout channel, but I'm not sure about the way you are handling stderr. Can you confirm that it's not the stderr capturing that's causing the problem?
I'll try out your code and let you know.

Update:
When a command you execute writes lots of messages to STDERR, your code freezes. I'm not sure why, but recv_stderr(600) might be the reason.
So capture the error stream the same way you capture standard output, something like:

import io

contents_err = io.BytesIO()

data_err = chan.recv_stderr(1024)
while data_err:
    contents_err.write(data_err)
    data_err = chan.recv_stderr(1024)

You may even first try changing recv_stderr(600) to recv_stderr(1024) or higher.

Paramiko with continuous stdout

Of course there is a way to do this. Paramiko's exec_command is asynchronous; buffers are filled as data arrives, regardless of what your main thread is doing.

In your example stdout.read(size=None) will try to read the full buffer size at once. Since new data is always arriving, it won't ever exit. To avoid this, you could just try to read from stdout in smaller chunks. Here's an example that reads buffers bytewise and yields lines once a \n is received.

stdin, stdout, stderr = ssh.exec_command("while true; do uptime; done")

def line_buffered(f):
    line_buf = b""
    while not f.channel.exit_status_ready():
        line_buf += f.read(1)
        if line_buf.endswith(b"\n"):
            yield line_buf
            line_buf = b""

for line in line_buffered(stdout):
    print(line.decode(), end="")

You can increase performance by tweaking the code to use select.select() and by reading bigger chunks; see this answer, which also takes into account common hang and remote-command-exit detection scenarios that may lead to empty responses.
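A sketch of the select-based variant. read_available is an illustrative helper name; chan is assumed to be a paramiko.Channel, which is selectable because it exposes fileno():

```python
import select

def read_available(chan, size=4096, timeout=1.0):
    # Block for at most `timeout` seconds until the channel is readable,
    # then pull whatever is already buffered in one larger chunk,
    # instead of spinning on 1-byte reads.
    readable, _, _ = select.select([chan], [], [], timeout)
    if readable:
        return chan.recv(size)
    return b""
```

An empty return value means either a timeout or EOF; callers can distinguish the two by also checking chan.exit_status_ready().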

readline hangs on paramiko.Channel when reading watch command output

This probably hangs because watch does not produce newlines. If one replaces

for line in stdout:
    print(line.strip())

with a busy loop with

stdout.readline(some_fixed_size)

it can be seen that the bytes never contain a newline character. Therefore, this is a very special case and is not related to other hangs reported in other issues and SO questions.
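So for screen-redrawing commands like watch, a workaround is to read fixed-size chunks instead of lines. The sketch below (read_chunks is an illustrative name) works on any file-like object, such as the stdout returned by exec_command:

```python
def read_chunks(f, size=1024):
    # watch redraws the screen using terminal control sequences and never
    # emits a newline, so yield raw chunks instead of waiting for lines.
    while True:
        data = f.read(size)
        if not data:
            break
        yield data
```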

Paramiko: read from standard output of remotely executed command

You have closed the connection before reading lines:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
com = "ls ~/desktop"
client.connect('MyIPAddress', MyPortNumber, username='username', password='password')
output = ""
stdin, stdout, stderr = client.exec_command(com)

print("ssh successful. Closing connection")
stdout = stdout.readlines()
client.close()
print("Connection closed")

print(stdout)
print(com)
for line in stdout:
    output = output + line
if output != "":
    print(output)
else:
    print("There was no output for this command")

