Multithreading in Bash

Sure, just add & after the command:

read_cfg cfgA &
read_cfg cfgB &
read_cfg cfgC &
wait

All those jobs will then run in the background simultaneously. The optional wait command will then wait for all of them to finish.

Each command will run in a separate process, so it's technically not "multithreading", but I believe it solves your problem.
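If you also need to know whether each job succeeded, one option (a minimal sketch reusing the read_cfg calls from above) is to record each job's PID with $! and wait on each of them individually, since wait <pid> returns that job's exit status:

pids=()
read_cfg cfgA & pids+=($!)
read_cfg cfgB & pids+=($!)
read_cfg cfgC & pids+=($!)

for pid in "${pids[@]}"; do
    # waiting on each PID separately lets us check its exit status
    if ! wait "$pid"; then
        echo "job $pid failed" >&2
    fi
done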

bash while loop threading

You can send tasks to the background with &.
If you intend to wait for all of them to finish, you can use the wait command:

process_to_background &
echo Processing ...
wait
echo Done

You can capture the PID of a task started in the background with $! if you want to wait for one (or a few) specific tasks.

important_process_to_background &
important_pid=$!
for i in {1..10}; do
    less_important_process_to_background "$i" &
done

wait $important_pid
echo Important task finished

wait
echo All tasks finished

One note though: the background processes can mess up the output, as they run asynchronously. You might want to use a named pipe to collect the output from them.
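For example, a rough sketch of collecting output through a named pipe (the fifo name, the collected.log file, and the host list are placeholders; check stands in for whatever command each job runs):

mkfifo results                    # placeholder fifo name
cat results > collected.log &     # the reader gathers all job output in one place
reader_pid=$!
exec 3> results                   # hold a write end open so the reader does not
                                  # hit EOF while some jobs have yet to start writing

pids=()
for host in hostA hostB hostC; do # placeholder job list
    check "$host" > results &     # each job's output goes through the fifo
    pids+=($!)
done
wait "${pids[@]}"                 # wait for the jobs only, not the reader
exec 3>&-                         # closing the held write end gives the reader EOF
wait "$reader_pid"
rm results

Note that very long writes may still interleave; short lines are written atomically.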

Edit:

As asked in the comments, there might be a need to limit the number of forked background processes. In this case you can keep track of how many background processes you've started and communicate with them through a named pipe.

mkfifo tmp # create a named pipe

counter=0
while read -r ip
do
    if [ "$counter" -lt 10 ]; then # we are under the limit
        { check "$ip"; echo 'done' > tmp; } &
        ((counter++))
    else
        read x < tmp # wait for a process to finish
        { check "$ip"; echo 'done' > tmp; } &
    fi
done
cat tmp > /dev/null # let all the background processes finish

rm tmp # remove the fifo
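On a newer bash (4.3 or later), a similar limit can be enforced without the fifo by using wait -n, which blocks until any one background job finishes. A rough sketch, assuming the IPs come from a file called ip_list:

max_jobs=10
while read -r ip; do
    # once the limit is reached, block until any background job exits
    while (( $(jobs -pr | wc -l) >= max_jobs )); do
        wait -n
    done
    check "$ip" &
done < ip_list # placeholder input file
wait # let the remaining jobs finish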

Multithreading in bash scripting

You should be able to do this relatively easily. Don't try to background each command; instead, put the body of your while loop into a subshell and background that. That way, your commands (which clearly depend on each other) run sequentially, but all the lines in the file can be processed in parallel.

while IFS= read -r line
do
    (
        HTTP_RESPONSE=$(curl -L -s -w "HTTPSTATUS:%{http_code}\\n" -H "X-Gitlab-Event: Push Hook" -H "X-Gitlab-Token: $SECRET_KEY" --insecure "$line" 2>&1)
        HTTP_BODY=$(echo "$HTTP_RESPONSE" | sed -e 's/HTTPSTATUS:.*//g')
        HTTP_STATUS=$(echo "$HTTP_RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')

        save_log "$HTTP_STATUS" "$HTTP_BODY"
    ) &
done < "$FILE_NAME"

Multithreading semaphore for bash script (sub-processes)

Following the recommendation by @Mark Setchell, GNU Parallel can replace the loop (in a simulated cron environment; see https://stackoverflow.com/a/2546509/8236733) with

bcpexport() {
    filename=$1
    TO_SERVER_ODBCDSN=$2
    DB=$3
    TABLE=$4
    USER=$5
    PASSWORD=$6
    RECOMMEDED_IMPORT_MODE=$7
    DELIMITER=$8 # DO NOT use a format like "'\t'"; nested quotes seem to cause hard-to-catch errors
    <same code from original loop>
}
export -f bcpexport
parallel -j 10 bcpexport \
    ::: $DATAFILES/$TARGET_GLOB \
    ::: "$TO_SERVER_ODBCDSN" \
    ::: $DB \
    ::: $TABLE \
    ::: $USER \
    ::: $PASSWORD \
    ::: $RECOMMEDED_IMPORT_MODE \
    ::: $DELIMITER

to run at most 10 jobs at a time. Here $DATAFILES/$TARGET_GLOB is a glob string that returns all of the files in the desired dir (e.g. "$storagedir/tsv/*.tsv") that we want to go through, and the remaining fixed args are paired with each of the elements returned by that glob as the other parallel inputs shown. (The $TO_SERVER_ODBCDSN variable is actually "-D -S <some ODBC DSN>", so it needed quotes to be passed as a single arg.) So if the $DATAFILES/$TARGET_GLOB glob returns files A, B, C, ..., we end up running the commands

bcpexport A "$TO_SERVER_ODBCDSN" $DB ...
bcpexport B "$TO_SERVER_ODBCDSN" $DB ...
bcpexport C "$TO_SERVER_ODBCDSN" $DB ...
...

in parallel. An additional nice thing about using parallel is that

GNU parallel makes sure output from the commands is the same output as you would get had you run the commands sequentially.

Multi-threaded BASH programming - generalized method?


#adjust these as required
args_per_proc=1 #1 is fine for long running tasks
procs_in_parallel=4

xargs -n$args_per_proc -P$procs_in_parallel povray < list

Note: the nproc command, now part of coreutils, will automatically determine the number of available processing units, which can then be passed to -P.
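For example, once nproc is available, the level of parallelism can be matched to the machine (a small sketch):

procs_in_parallel=$(nproc) # number of available processing units
xargs -n"$args_per_proc" -P"$procs_in_parallel" povray < list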


