Deleting Old Files Using Crontab

Delete folders older than 30 minutes with Cron

I found that the best method is to split the cron command into two parts and use find's -delete action

Code

30 * * * * sudo find /my/folder/* -type f -mmin +30 -delete && sudo find /my/folder/* -type d -empty -mmin +30 -delete 

Explanations

30 * * * *: run at minute 30 of every hour (to run every 30 minutes instead, use */30 * * * *)

sudo find /my/folder/* -type f -mmin +30 -delete : delete all files, including files in subfolders, that are older than 30 minutes

&& : run the second command only if the first one succeeded

sudo find /my/folder/* -type d -empty -mmin +30 -delete : delete all empty folders that are older than 30 minutes

This works on Ubuntu 16.04.
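
If you want a dry run before enabling the job, one option (a sketch using the same paths) is to replace -delete with -print and run both finds by hand:

sudo find /my/folder/* -type f -mmin +30 -print
sudo find /my/folder/* -type d -empty -mmin +30 -print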

Cron job to auto-delete folders older than 7 days in Linux

For example, a crontab entry that deletes files older than 7 days under /path/to/backup/ every day at 4:02 AM looks like this.

02 4 * * * find /path/to/backup/* -mtime +7 -exec rm {} \;

Before running rm, make sure the targets are the intended files. You can check them by passing -ls to find instead.

find /path/to/backup/* -mtime +7 -ls

-mtime tests the last-modification timestamp, so depending on how the backups are written, the files find matches may not be the ones you expect.
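
If the backup directory can also contain subdirectories, a slightly stricter variant (my assumption, not part of the original entry) restricts the match to regular files so that rm is never handed a directory:

02 4 * * * find /path/to/backup/ -type f -mtime +7 -exec rm {} \;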

Deleting old files using crontab

Just create another cron job:

0 3 * * * find $HOME/db_backups -name "db_name*.sql" -mtime +30 -exec rm {} \; >> $HOME/db_backups/purge.log 2>&1

It will find all backups older than 30 days and delete them.
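
On GNU find, the same job can usually be written with the built-in -delete action instead of spawning one rm per file; a possible variant of the entry above:

0 3 * * * find $HOME/db_backups -name "db_name*.sql" -mtime +30 -delete >> $HOME/db_backups/purge.log 2>&1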

Deleting old files using cron job in Linux

Just look for *.gz files and delete them.

find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -delete

Before deleting, just list the files to make sure you are deleting the correct ones.

find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -print
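
To schedule the cleanup, a daily crontab entry along these lines should work (the 03:30 run time is just an example, not from the original answer):

30 3 * * * find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -delete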

How to delete folders that are older than a day? (Cron Job)

This should do it:

find /path/to/dir -mindepth 1 -maxdepth 1 -type d -ctime +1 -exec rm -rf {} +

But be careful, and test it first outside of cron, without the -exec part, so you don't delete something else by accident.
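
To make that dry run concrete, here is the same command with -print in place of the -exec part:

find /path/to/dir -mindepth 1 -maxdepth 1 -type d -ctime +1 -print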

Remove log files using cron job

Use a wildcard, and just put it in your crontab; use crontab -e to edit your crontab jobs.

See example:

* * * * *  find  /path/to/*.log -mtime +7 -exec rm -f {} \; 
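
Note that * * * * * runs the job every minute; for log cleanup a daily schedule is usually enough. A possible daily variant (the 02:00 run time is my assumption):

0 2 * * * find /path/to/*.log -mtime +7 -exec rm -f {} \;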

To add to this answer, it is also worth reading up on how to work with crontab in Linux.

CentOS: delete X-days old files using cron

I use for loops and echo statements for debugging purposes and also to make sure I don't delete anything by mistake.

So try this:

for filename in `find /path/to/files -type f -mtime +10`; do echo "rm $filename"; done

That will list out what it plans to do. Check it, make sure it looks good, then run it again with the echo statement removed:

for filename in `find /path/to/files -type f -mtime +10`; do rm $filename; done

If that doesn't work, you can redirect the output of the echo version to a file and run it as a script:

for filename in `find /path/to/files -type f -mtime +10`; do echo "rm $filename"; done > myremove.sh

Then run it:

sh -x ./myremove.sh 

Check the output for errors.

Once it is bug-free, you can add it to cron as a single command or as a script.
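
Also note that a for loop over find output splits filenames containing spaces; a whitespace-safe single command for the crontab (same hypothetical path and age, with an example daily schedule of 05:00) could be:

0 5 * * * find /path/to/files -type f -mtime +10 -exec rm -f {} \;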


