Shell Script to Count Files, Then Remove Oldest Files

Try this:

ls -t | sed -e '1,10d' | xargs -d '\n' rm

This should handle all characters (except newlines) in a file name.

What's going on here?

  • ls -t lists all files in the current directory in decreasing order of modification time, i.e., the most recently modified files come first, one file name per line.
  • sed -e '1,10d' deletes the first 10 lines, i.e., the 10 newest files. I use this instead of tail because I can never remember whether I need tail -n +10 or tail -n +11.
  • xargs -d '\n' rm collects each input line (without the terminating newline) and passes the lines as arguments to rm.

As with anything of this sort, please experiment in a safe place.
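A safe place might look like a throwaway directory. This sketch (file names are made up for the demo; GNU ls, touch, and xargs assumed) exercises the pipeline end to end:

```shell
#!/bin/bash
# Throwaway-directory experiment for the ls/sed/xargs pipeline above.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
# Give each file a distinct, increasing mtime so ls -t ordering is deterministic.
for i in $(seq 1 15); do
    touch -d "@$((1700000000 + i))" "file$i"
done
# ls -t prints file15 .. file1; sed drops the 10 newest; rm gets file5 .. file1.
ls -t | sed -e '1,10d' | xargs -d '\n' rm
ls | wc -l    # prints 10
```

Swapping rm for echo rm turns this into a dry run you can inspect first.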

Bash script that removes the oldest files recursively from a directory when disk usage is over a threshold

Your script will delete every file in $CACHEDIR if /dev/sda1 is more than 95% full.
Do something like this:

#!/bin/bash

DIRECTORY="/path/to/your/directory"
CAPACITY=95
while [[ $(df "$DIRECTORY" | awk 'NR==2 && gsub("%","") {print $5}') -ge $CAPACITY ]]; do
    rm -rf "$(find "$DIRECTORY" -mindepth 1 -printf '%T+ %p\n' | sort | awk 'NR==1 {print $2}')"
done

You can run this script from crontab, or wrap it in a while loop and daemonize it with systemd so it keeps running in the background and deletes files whenever your partition reaches 95%.
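For the systemd route, a minimal sketch of a oneshot service driven by a timer might look like this (the unit names, script path, and schedule are hypothetical, not part of the original answer):

```
# /etc/systemd/system/cache-cleaner.service  (hypothetical)
[Unit]
Description=Remove oldest files when the partition is nearly full

[Service]
Type=oneshot
ExecStart=/usr/local/bin/cleaner.sh

# /etc/systemd/system/cache-cleaner.timer  (hypothetical)
[Unit]
Description=Run cache-cleaner periodically

[Timer]
OnCalendar=hourly

[Install]
WantedBy=timers.target
```

Enable it with systemctl enable --now cache-cleaner.timer; a timer-driven oneshot avoids keeping a busy loop running all the time.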

Explaining:

df $DIRECTORY reports the filesystem that contains the directory and prints its usage information.

awk 'NR==2 && gsub("%","") {print $5}' selects the second line (the relevant one; the first is the heading), strips the percent sign, and prints column 5 (the "Use%" column).

rm -rf removes the result of the following command.

find $DIRECTORY -mindepth 1 -printf '%T+ %p\n' | sort | awk 'NR==1 {print $2}' lists everything under $DIRECTORY with its modification time first, sorts so the oldest entry comes first, and prints only the second field (the filename) of the first line (the oldest one). Note that awk's {print $2} drops everything after the first space, so this breaks on filenames containing spaces.
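Since awk's {print $2} truncates at the first space, here is a sketch that tolerates spaces in names (it still breaks on newlines; GNU find assumed, and the demo directory and file names are made up):

```shell
#!/bin/bash
# Remove the single oldest entry, tolerating spaces in file names.
DIRECTORY=$(mktemp -d)
touch -d "@1700000000" "$DIRECTORY/old file"
touch -d "@1700000100" "$DIRECTORY/new file"
# %T+ itself contains no spaces, so everything after the first space is the path.
oldest=$(find "$DIRECTORY" -mindepth 1 -printf '%T+ %p\n' | sort | head -n 1 | cut -d' ' -f2-)
rm -rf -- "$oldest"
ls "$DIRECTORY"    # prints: new file
```

cut -d' ' -f2- keeps the whole remainder of the line, which is why embedded spaces survive.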

Count and remove old files using Unix find

You could just use bash within find:

find "$DIR_TO_CLEAN" -mtime +$DAYS_TO_SAVE -exec bash -c 'printf "Total: %d\n" $#; rm "$@"' _ {} +

Of course this can call bash -c … more than once if the number of files found is larger than MAX_ARGS, and it also can overestimate the count if rm fails. But solving those problems gets messy:

find "$DIR_TO_CLEAN" -mtime +$DAYS_TO_SAVE -exec bash -c 'count=0; for f; do rm "$f" && (( count++ )); done; printf "Total: %d\n" "$count"' _ {} +

This alternative avoids the MAX_ARGS limit by dropping find altogether. If you need it to be recursive, you'll have to use recursive globbing, which is only available in newer shells (globstar is a bash 4 feature):

shopt -s globstar
# Assume DAYS_TO_SAVE reformatted to the timestamp form touch -t expects. (Exercise for the reader.)
touch -m -t "$DAYS_TO_SAVE" referencefile
count=0
for file in "$DIR_TO_CLEAN"/**/*; do
    if [[ referencefile -nt "$file" ]]; then
        rm "$file" && (( count++ ))
    fi
done
printf 'Total: %d\n' "$count"

Here's an approach using find with -printf (strictly POSIX-compliant find doesn't have -printf, but you can invoke printf as a standalone utility in that case, as the second command does).

find "$DIR_TO_CLEAN" -type f -mtime "+$DAYS_TO_SAVE" -exec rm {} \; -printf '.' | wc -c
find "$DIR_TO_CLEAN" -type f -mtime "+$DAYS_TO_SAVE" -exec rm {} \; -exec printf '.' \; | wc -c
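A quick sanity check of the counting trick in a throwaway directory (the file names and the retention window here are made up for the demo; GNU find and touch assumed):

```shell
#!/bin/bash
# -printf '.' emits one dot per processed file; wc -c counts the dots.
DIR_TO_CLEAN=$(mktemp -d)
DAYS_TO_SAVE=7
for i in 1 2 3; do
    touch -d "@1600000000" "$DIR_TO_CLEAN/old$i"   # mtime far in the past
done
touch "$DIR_TO_CLEAN/fresh"                        # modified just now, kept
find "$DIR_TO_CLEAN" -type f -mtime "+$DAYS_TO_SAVE" -exec rm {} \; -printf '.' | wc -c    # prints 3
```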

Deleting oldest files from a subdirectory in a directory

If you are trying to delete the file that was least recently modified (not created), then you can use this:

rm -- "$(ls -t | tail -n 1)"
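A throwaway-directory check that ls -t | tail -n 1 really names the oldest-modified file (file names made up for the demo; GNU touch assumed):

```shell
#!/bin/bash
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch -d "@1700000000" oldest.log
touch -d "@1700000100" middle.log
touch -d "@1700000200" newest.log
# ls -t sorts newest first, so tail -n 1 is the oldest entry.
rm -- "$(ls -t | tail -n 1)"
ls    # oldest.log is gone; middle.log and newest.log remain
```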

BASH Script to Remove old files, and create a text file containing the count and size of total files deleted.

I would use the below script, say deleter.sh, for the purpose:

#!/bin/bash
myfunc()
{
    local totalsize=0
    echo "Removing files listed below"
    echo "${@}"
    sizes=( $(stat --format=%s "${@}") )  # storing sizes in an array
    for i in "${sizes[@]}"
    do
        (( totalsize += i ))  # calculating total size
    done
    echo "Total space to be freed : $totalsize bytes"
    rm "${@}"
    if [ $? -eq 0 ]  # $? is the return value of rm
    then
        echo "All files deleted"
    else
        echo "Some files couldn't be deleted"
    fi
}
export -f myfunc
find "$1" -type f -not -name "*deleter.sh" -mtime +60 \
    -exec bash -c 'myfunc "$@"' _ {} +
# -not -name "*deleter.sh" to prevent self deletion
# Note: -mtime +60 matches files older than 60 days.

Do

chmod +x ./deleter.sh

And run it as

./deleter.sh '/path/to/your/directory'

References

  1. See the find manpage for more info.
  2. stat --format=%s gives the size in bytes, which we store in an array. See the stat manpage.

Feedback appreciated.

Using shell script to sort files then delete old files

You can skip the first 20 filenames (i.e., keep the 20 newest, assuming the input is sorted newest-first) with

awk 'NR > 20' 

Full command:

find ...... | awk 'NR > 20' | xargs -r rm

For example

seq 30 | xargs -i echo 'file{}' | awk 'NR > 20' | xargs  -r rm
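A throwaway-directory version of the same idea, with a concrete find in place of the elided one (the file names are made up for the demo):

```shell
#!/bin/bash
# Create 30 files, keep the first 20 in sorted order, remove the remaining 10.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch file{01..30}
find . -type f -name 'file*' | sort | awk 'NR > 20' | xargs -r rm
ls | wc -l    # prints 20
```

Zero-padded names make the lexicographic sort match the numeric order; with real files you would sort by modification time instead.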

Bash script - delete old files

You can try this solution:

# Purpose: This step is used to Purge 7 days old files
export PROJECT_LOG="${PROJECT_HOME}/log";
export APP_MAINT_LOG="APP.log"
export LOG_RETAIN_DUR=7
echo "Maintenance Job Started" > "${APP_MAINT_LOG}"
echo "=========================================================================" >> "${APP_MAINT_LOG}"
echo "${LOG_RETAIN_DUR} Day(s) Old Log Files..." >> "${APP_MAINT_LOG}"
echo "=========================================================================" >> "${APP_MAINT_LOG}"
find "${PROJECT_LOG}" -mtime +"${LOG_RETAIN_DUR}" -type f -exec ls -1 {} \; >> "${APP_MAINT_LOG}"
#find "${PROJECT_LOG}" -mtime +"${LOG_RETAIN_DUR}" -type f -exec rm -rf {} \;
echo "=========================================================================" >> "${APP_MAINT_LOG}"
echo "Maintenance Job Completed" >> "${APP_MAINT_LOG}"
cat "${APP_MAINT_LOG}"

Note: I have commented out the remove line, so that you can check the output before running it for real!
