Run All Shell Scripts in Folder

Use this:

for f in *.sh; do
  bash "$f"
done

If you want to stop the whole execution when a script fails:

for f in *.sh; do
  bash "$f" || break # execute successfully or break
  # Or more explicitly: if this execution fails, then stop the `for`:
  # if ! bash "$f"; then break; fi
done
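
A variation on the break approach that also reports which script failed (a sketch; the throwaway directory and demo scripts are only for illustration):

```shell
# Demo in a throwaway directory so no real scripts are touched
cd "$(mktemp -d)"
printf 'touch ran_a\n' > a.sh
printf 'exit 3\n'      > b.sh
printf 'touch ran_c\n' > c.sh

for f in *.sh; do
  if ! bash "$f"; then
    echo "stopped: $f failed" >&2
    break
  fi
done
```

Here b.sh fails, so the failure is reported and c.sh is never run.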

If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:

for i in $(seq 1 10); do
  bash "x$i.sh"
done

To preserve the exit code of a failed script (responding to @VespaQQ), use set -e, which aborts the whole run with the failing script's exit status:

#!/bin/bash
set -e
for f in *.sh; do
  bash "$f"
done
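
If you want to stop at the first failure and propagate that script's exact exit code without set -e, a small sketch (the demo scripts are illustrative):

```shell
# Demo in a throwaway directory
cd "$(mktemp -d)"
printf 'exit 7\n'      > fail.sh
printf 'touch ran_z\n' > z.sh   # sorts after fail.sh, should never run

status=0
for f in *.sh; do
  bash "$f"
  rc=$?
  if [ "$rc" -ne 0 ]; then
    status=$rc   # remember the failing script's exit code
    break
  fi
done
echo "final status: $status"   # → final status: 7
```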

Run N Shell Scripts in Folder

Find the scripts, take the first ten with head, then execute them one by one with xargs (note: this pipeline breaks on file names containing whitespace):

find . -name '*.sh' | head -n 10 | xargs -n1 sh

You can run the scripts in parallel by giving xargs the -P0 option. To control failure handling, wrap the call in sh -c, e.g. xargs -n1 sh -c 'bash "$1" || exit 125' --: with GNU xargs, an exit status of 1–125 from the wrapped command makes xargs itself exit nonzero (123) after processing all input, while an exit status of 255 makes xargs abort immediately.
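
A concrete sketch of the abort-on-failure wrapper (xargs stops when a wrapped command exits with status 255; the demo scripts are illustrative):

```shell
# Demo in a throwaway directory
cd "$(mktemp -d)"
printf 'exit 1\n'      > a.sh
printf 'touch ran_b\n' > b.sh

# The wrapper turns any script failure into exit 255, which makes
# xargs abort immediately instead of continuing with the next file.
find . -name '*.sh' | sort | xargs -n1 sh -c 'bash "$1" || exit 255' -- \
  || echo "xargs aborted with status $?"
```

Because a.sh fails first, b.sh is never executed.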

If you're unfamiliar with xargs, just use a simple while read loop:

find . -name '*.sh' | head -n 10 |
while IFS= read -r script; do
  bash "$script" || break
done

And to run them in parallel you have to get out of the pipeline's subshell (note that `|| break` is dropped here: on a backgrounded command it would run in a subshell and could not stop the loop):

while IFS= read -r script; do
  bash "$script" &
done < <(
  find . -name '*.sh' | head -n 10
)
wait # for all the children

or wait for the children in the subshell itself:

find . -name '*.sh' | head -n 10 |
{
  while IFS= read -r script; do
    bash "$script" &
  done
  wait
}

Shell script to list all files in a directory

Try this Shellcheck-clean pure Bash code for the "further plan" mentioned in a comment:

#! /bin/bash -p

# List all subdirectories of the directory given in the first positional
# parameter. Include subdirectories whose names begin with dot. Exclude
# symlinks to directories.

shopt -s dotglob
shopt -s nullglob
for d in "$1"/*/; do
  dir=${d%/}                # Remove trailing slash
  [[ -L $dir ]] && continue # Skip symlinks
  printf '%s\n' "$dir"
done
  • shopt -s dotglob causes shell glob patterns to match names that begin with a dot (.). (find does this by default.)
  • shopt -s nullglob causes shell glob patterns to expand to nothing when nothing matches, so looping over glob patterns is safe.
  • The trailing slash on the glob pattern ("$1"/*/) causes only directories (including symlinks to directories) to be matched. It's removed (dir=${d%/}) partly for cleanliness but mostly to enable the test for a symlink ([[ -L $dir ]]) to work.
  • See the accepted, and excellent, answer to Why is printf better than echo? for an explanation of why I used printf instead of echo to print the subdirectory paths.
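
For comparison, a rough find equivalent of the loop above (a sketch: -type d does not follow symlinks, so symlinked directories are skipped, and dotted names are matched by default):

```shell
# Hypothetical layout for the demo
top=$(mktemp -d)
mkdir "$top/a" "$top/.hidden"
ln -s "$top/a" "$top/link-to-a"

# Only real directories one level down, dot names included, symlinks excluded
find "$top" -mindepth 1 -maxdepth 1 -type d
```

This prints "$top/a" and "$top/.hidden" but not the symlink.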

Execute multiple shell scripts of same name under several sub-directories

This finds every deploy.sh script in the immediate subdirectories of the current directory and executes each one.

Without doing cd down into the subdirectories:

for dep in */deploy.sh; do "$dep"; done

Doing cd down into each subdirectory:

for dep in */deploy.sh; do (cd "$(dirname "$dep")" && ./deploy.sh); done
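
If the deploy.sh scripts can live deeper than one level, find with -execdir runs each one from its own directory (a sketch; the app1/app2 layout is hypothetical):

```shell
# Demo in a throwaway directory
cd "$(mktemp -d)"
mkdir -p app1 app2/nested
printf 'touch deployed\n' > app1/deploy.sh
printf 'touch deployed\n' > app2/nested/deploy.sh

# -execdir changes into the directory containing each match before running
find . -name deploy.sh -execdir bash deploy.sh \;
```

Each script therefore sees its own directory as the working directory.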

Execute command on all files in a directory

The following bash code runs cmd once for every file in /dir, passing the file name as "$file":

for file in /dir/*
do
  cmd [option] "$file" >> results.out
done

Example

el@defiant ~/foo $ touch foo.txt bar.txt baz.txt
el@defiant ~/foo $ for i in *.txt; do echo "hello $i"; done
hello bar.txt
hello baz.txt
hello foo.txt

How can I use all scripts from a custom folder like native shell commands in macOS

You need to

  1. Name each script file what you'd like to call it when you run it (without the extension), e.g. copymany and movemany
  2. Place the copymany and movemany files in a folder, perhaps ~/bin
  3. Add the folder to your $PATH environment variable, e.g. export PATH=$PATH:$HOME/bin, in your .bashrc or .zshrc
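
The three steps can be sketched end to end; here a temporary directory stands in for ~/bin and copymany is a toy script:

```shell
bindir=$(mktemp -d)               # stand-in for ~/bin
printf '#!/bin/bash\necho "copied: $*"\n' > "$bindir/copymany"
chmod +x "$bindir/copymany"       # script named like the command, no extension
export PATH="$PATH:$bindir"       # normally done once in .bashrc or .zshrc

copymany a.txt b.txt              # → copied: a.txt b.txt
```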

How to run a script for all files in a Linux directory?


#!/bin/bash

# Use a glob instead of parsing `ls` output, which breaks on unusual names
for FILE in *.*
do
  script_run "[--input ${FILE}] [WriterStd --output tmp_output_${FILE}.txt]"
  cat "tmp_output_${FILE}.txt" >> "output_${FILE}_interp.txt"
done

By the way, what's with this strange annotation [--input ${FILE}]? Does your script explicitly require a format like that?

Run bash script for each file in folder

The easiest is probably to keep the script as it is and to use a bash loop to process all files in the input directory. Let's assume:

  • the input directory is /my/video/files,
  • you want to store all outputs in directory /some/where,
  • the script you show is in /else/where/myscript.sh,
  • you want to process all files in the input directory.

Just open a terminal where bash is the interactive shell and type:

shopt -s nullglob
chmod +x /else/where/myscript.sh
mkdir -p /some/where
cd /my/video/files
for f in *; do
  /else/where/myscript.sh "$f" "/some/where/$f"
done
shopt -u nullglob

Explanations:

  • shopt -s nullglob enables the nullglob option. Without this, if there are no files at all in the input directory, there would still be one iteration of the loop with f=*. shopt -u nullglob disables it when we are done.
  • chmod +x /else/where/myscript.sh makes your script executable, just in case it was not already.
  • mkdir -p /some/where creates the output directory, just in case it did not exist yet.
  • cd /my/video/files changes the current directory to the input directory in which you have your video files.
  • for f in *; do loops over all files in the current directory (this is what the * stands for). In each iteration variable f is assigned the current file name.
  • /else/where/myscript.sh "$f" "/some/where/$f" executes your script with two parameters: the name of the input file and the name of the output file, both quoted with double quotes to prevent word splitting.

Note: if not all files in the directory are video files, you can be more specific:

for f in *.mkv *.mp4 *.avi; do
...

Of course, for easier reuse, you can also create a new shell script file with all this.


