Save SSH Command Output to a Text File in a Shell Script

ssh command output to save in a text file in shell script

To save ssh's output in local file "file.log":

ssh hostname > file.log << EOF
pwd
ps -ef | grep Consumer | cut -f6 -d' '
EOF

Bash command output not saving to text file or variable

Since "permission denied" is typically considered an error, is the output being routed to stderr instead of stdout? If so, you need to use 2> temp.txt or > temp.txt 2>&1.

More information:

On many systems, program output is broken up into multiple streams, most commonly stdout (standard output) and stderr (standard error). When you use >, that only redirects stdout, but 2> can be used to redirect stderr. (This is useful if you want normal output and errors to go to two different files.)

The syntax 2>&1 means "take all output on stderr and redirect it to stdout", so your file would contain output from both streams.
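A quick local demonstration of the three redirections (emit is a throwaway function made up for this example, not part of the question):

```shell
# emit writes one line to each stream
emit() {
  echo "normal output"    # goes to stdout
  echo "an error" >&2     # goes to stderr
}

emit >  out.log           # out.log gets only stdout; "an error" still hits the terminal
emit 2> err.log           # err.log gets only stderr; "normal output" hits the terminal
emit >  both.log 2>&1     # both.log gets both streams
```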

How to send stdout of a ssh command in a file?

If you want to send regular stdout to a file, your command already does that.
Your issue is that you get an error, and errors are printed to stderr. To capture stderr as well, you need to add this to your command:

ssh -o "BatchMode=yes" -o "ConnectTimeout=5" admin@10.10.10.10 > /var/tmp/.result.txt 2>&1

This tells bash to redirect stdout to .result.txt, and then redirect stderr to stdout, so both streams end up in the file. The 2>&1 reuses the file descriptor that stdout points to.
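Because redirections are processed left to right, the order matters; a small sketch (fail is a stand-in function, not from the question):

```shell
fail() { echo "oops" >&2; }    # writes only to stderr

# Correct order: stdout -> file, then stderr -> same place as stdout.
fail > result.txt 2>&1         # result.txt contains "oops"

# Wrong order: stderr is pointed at the terminal (stdout's target at that
# moment) BEFORE stdout is redirected, so the file stays empty.
fail 2>&1 > empty.txt
```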

how-to save result of a command in an if statement to a text file

It's inherently impossible to do exactly this, because a command (or pipeline of commands) produces output as it runs, but doesn't produce an exit status (success/failure) until it's finished running; therefore, you can't decide whether to save the output or not until it's already finished being output.

What you can do is store the output somewhere temporary, and then either save that or not. I'm not sure if this is quite what you're trying to do, but maybe something like this (using a variable as the temporary storage):

while true; do
    output=$(ss --tcp --processes | grep 53501)
    if [ -n "$output" ]; then
        echo "$output" >/tmp/cmd.out
    fi
done

Console output into text file when generating shell script

It looks like you are expecting the file to be created locally, but of course the redirection is part of the command you are sending to the remote server, so that is where the file gets created.

I don't think you need any of these redirections anyway.

# no need to touch, either
for loop in looopity loop; do
    echo "diff something"    # no redirect!
done |
ssh remoteserver > "$DST_DIR"/output.txt

Add a tee before the ssh if you really genuinely need to store the commands you are sending to the remote host.
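That suggestion, sketched (commands.log is an assumed file name; remoteserver and DST_DIR come from the snippet above):

```shell
for loop in looopity loop; do
    echo "diff something"
done |
tee "$DST_DIR"/commands.log |            # keep a local copy of what we send
ssh remoteserver > "$DST_DIR"/output.txt # remote output still lands here
```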

save command out to variable and show command output without waiting for it to complete

I think you want a tool called tee:

The tee utility copies standard input to standard output, making a copy in zero
or more files. The output is unbuffered.

It will output to the screen and also to a file. You use it like this:

cat file1.txt | tee -a file2.txt

The use of cat here is just an example; any command on the left side of the pipe will work.

Copying variables to local text file from multiple ssh awk output

First of all, your ssh command. Using a here-document is a good idea. You can improve it in two ways by:

  • indenting with TABs owing to the <<- syntax. This is purely cosmetic and makes your code more readable.
  • avoiding the escaping of special characters like $ by quoting EOF. This is not only cosmetic but makes your code less error-prone.

This gives:

ssh {$CURRENT_ENV} <<- 'EOF'
VAL=$(df -h | awk '$6 == "/" {print $5; exit}')
echo "$VAL > testfile.txt"
exit
EOF

(we could even put a tab before EOF)
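The second point deserves a quick local demonstration (cat stands in for the remote shell; who is a throwaway variable):

```shell
who=world

# Unquoted delimiter: the LOCAL shell expands $who before the text is sent.
cat << EOF
hello $who
EOF
# prints: hello world

# Quoted delimiter: $who passes through literally, so the other end
# (e.g. the remote shell behind ssh) is the one that expands it.
cat << 'EOF'
hello $who
EOF
# prints: hello $who
```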

Now, your code:

  • You don't tell us what CURRENT_ENV is. I assume it is something like user@server. To use that variable, write "$CURRENT_ENV", not {$CURRENT_ENV}. Unless you know what you are doing, always enclose a variable in double quotes when using it, to avoid undesirable side effects.
  • You put the result of df into variable VAL and write its content to testfile.txt:
    • As a universal convention, use lower case for your variable names (unless they are exported to the environment, which is not the case here); i.e. this should be val, not VAL.
    • echo "$val > testfile.txt" won't write anything into testfile.txt because the redirection is inside the double quotes and thus belongs to the text being echoed. The proper command would be echo "$val" > testfile.txt.
    • Now, think about it: all of this, including the echo, is executed on the remote server, so it would create testfile.txt there, not on your machine. This is not what you want, so let's remove that echo line. Let's also remove val= since val is no longer needed.
    • The exit command is not needed. Once the last command has been read and executed, the ssh session ends anyway.

We are left with this:

ssh "$CURRENT_ENV" <<- 'EOF'
df -h | awk '$6 == "/" {print $5; exit}'
EOF

(remember there is a tab before df, but single spaces wouldn't hurt in this case)

As it is now, your code outputs everything to your terminal. Let's now redirect it to your local file testfile.txt:

ssh "$CURRENT_ENV" <<- 'EOF' > testfile.txt
df -h | awk '$6 == "/" {print $5; exit}'
EOF

OK, this works for one server. You told us there were actually several. You don't show us your code, so I will assume there is a loop somewhere:

for ssh_target in u1@server1 u2@server2 ...; do
ssh "$ssh_target" <<- 'EOF' > testfile.txt
df -h | awk '$6 == "/" {print $5; exit}'
EOF
done

Almost there! The problem with this command is that each loop overwrites the content of testfile.txt. The solution is to redirect the for loop, NOT the ssh command inside it:

for ssh_target in u1@server1 u2@server2 ...; do
ssh "$ssh_target" <<- 'EOF'
df -h | awk '$6 == "/" {print $5; exit}'
EOF
done > testfile.txt

(the redirection must be put after done)

Here it is!
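An alternative sketch, if you prefer keeping the redirection on the ssh command itself: open the file in append mode (>>) inside the loop and truncate it once beforehand so reruns start clean (plain << with a flush-left body is used here to sidestep the tab-indentation requirement of <<-):

```shell
: > testfile.txt     # truncate once before the loop
for ssh_target in u1@server1 u2@server2; do
    ssh "$ssh_target" >> testfile.txt << 'EOF'
df -h | awk '$6 == "/" {print $5; exit}'
EOF
done
```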

running multiple commands through ssh and storing the outputs in different files

Unless you can parse the actual outputs of the two commands and distinguish which is which, you can't. You will need two separate ssh sessions:

ssh -i my_key user@ip command1 > command1.txt
ssh -i my_key user@ip command2 > command2.txt

You could also redirect the outputs to files on the remote machine and then copy them to your local machine:

ssh -i my_key user@ip 'command1 > command1.txt; command2 > command2.txt'
scp -i my_key user@ip:'command*.txt' .

