How To Avoid SIGCHLD Error In Bash Script That Uses GNU Parallel

I think this might be a bug in parallel. There is a point in the code where the author deletes the SIGCHLD handler, presumably as a way of ignoring the signal. The Perl documentation is silent on the effect of deleting a handler, which suggests to me that the result will be platform- or implementation-dependent and unreliable. The proper way to ignore a signal is to set the handler to "IGNORE". I suggest trying version 20150222, an older version which does not have this questionable code.
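As an aside, bash's own trap builtin makes the same distinction, which may help make the difference concrete (a minimal sketch, not GNU Parallel's actual code):

trap '' CHLD   # handler set to empty: SIGCHLD is explicitly ignored (the "IGNORE" equivalent)
trap - CHLD    # handler removed: the default action is restored instead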

How To Install Or Switch To an Older Version of GNU Parallel?

From README (replace 20160922 with the version you want):

wget http://ftpmirror.gnu.org/parallel/parallel-20160922.tar.bz2
bzip2 -dc parallel-20160922.tar.bz2 | tar xvf -
cd parallel-20160922
./configure && make && sudo make install

GNU parallel does not run in parallel on remote servers when using --onall

You are hitting a design decision: what should -j mean when you run --onall? The decision was that -j is the number of hosts to run on simultaneously (in your case, 2). This was done so that it would be easy to run commands serially on a number of hosts in parallel.
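To illustrate (the server names are hypothetical): with four servers and -j2, two servers are worked on at a time, while each server receives its jobs one after another:

parallel -j2 --onall -S server1,server2,server3,server4 echo ::: a b c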

What you can do is wrap your parallel command in another parallel command:

parallel parallel --onall -S ${RH32},${RH64} --argsep // /shared/loc/script.sh // ::: param1 param2

This spawns one inner parallel per argument, and each inner parallel runs the script on every server.

Another solution is to write the ssh command yourself:

parallel ssh {1} /shared/loc/script.sh {2} ::: ${RH32} ${RH64} ::: param1 param2

Limit spawned parallel processes and exit all upon failure of any

cat arguments | parallel --halt now,fail=1 my_prg

Alternatively:

parallel --halt now,fail=1 my_prg ::: $ALL_ARGS
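Here now,fail=1 means: as soon as one job exits with a non-zero status, kill all still-running jobs and exit. A quick local demonstration (the sleep/exit commands are placeholders, not part of the original answer); the first job fails after one second and the other two are killed:

parallel --halt now,fail=1 'sleep {}; exit {}' ::: 1 2 3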

GNU Parallel is designed so it will also kill remote jobs. It does that using process groups and heavy Perl scripting on the remote server: https://www.gnu.org/software/parallel/parallel_design.html#The-remote-system-wrapper

Download files in parallel in a bash script

If you do not mind using xargs, then you can:

xargs -I xxx -P 3 sleep xxx < sleep

where the file sleep contains:

1
2
3
4
5
6
7
8
9

and if you watch the background processes with:

watch -n 1 --exec ps --forest -g your-bash-pid

(the file sleep could just as well be your list of links) you will see that 3 jobs run in parallel, and as soon as one of the three completes, the next job is started. Three jobs are always running until the list is exhausted.

Sample output of watch(1):

12260 pts/3    S+     0:00  \_ xargs -I xxx -P 3 sleep xxx
12263 pts/3    S+     0:00      \_ sleep 1
12265 pts/3    S+     0:00      \_ sleep 2
12267 pts/3    S+     0:00      \_ sleep 3

xargs starts with 3 jobs, and when one of them finishes it adds the next, which becomes:

12260 pts/3    S+     0:00  \_ xargs -I xxx -P 3 sleep xxx
12265 pts/3    S+     0:00      \_ sleep 2
12267 pts/3    S+     0:00      \_ sleep 3
12269 pts/3    S+     0:00      \_ sleep 4 # this one was added
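To bring this back to the actual question of downloading files, the same pattern works with a downloader in place of sleep. A minimal sketch, assuming a hypothetical urls.txt with one URL per line:

xargs -I xxx -P 3 wget -q xxx < urls.txt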

