How to delete all files older than 3 days when getting "Argument list too long"?
To delete all files and directories older than 3 days within the current directory:
find . -mtime +3 | xargs rm -Rf
Or alternatively, more in line with the OP's original command:
find . -mtime +3 -exec rm -Rf -- {} \;
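Note that the plain pipe into xargs breaks on filenames containing spaces or newlines. Two more robust variants, sketched here assuming GNU find and xargs:

```shell
# NUL-delimited names survive spaces and newlines in paths
find . -mtime +3 -print0 | xargs -0 rm -rf --

# Or let find batch the arguments itself (no xargs needed); "+" packs
# as many paths into each rm invocation as the kernel limit allows
find . -mtime +3 -exec rm -rf -- {} +
```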
Argument list too long error for rm, cp, mv commands
The reason this occurs is because bash actually expands the asterisk to every matching file, producing a very long command line.
Try this:
find . -name "*.pdf" -print0 | xargs -0 rm
Warning: this is a recursive search and will find (and delete) files in subdirectories as well. Tack -f onto the rm command only if you are sure you don't want confirmation.
You can do the following to make the command non-recursive:
find . -maxdepth 1 -name "*.pdf" -print0 | xargs -0 rm
Another option is to use find's -delete flag:
find . -name "*.pdf" -delete
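Two caveats with -delete are worth knowing (per GNU find): it implies -depth, and predicate order matters — if -delete appears before the name test, find deletes everything it visits. A hedged sketch:

```shell
# Correct: tests first, -delete last
find . -maxdepth 1 -name '*.pdf' -delete

# DANGEROUS: -delete would act before -name is ever tested,
# deleting every entry find can reach; shown commented out
# find . -delete -name '*.pdf'
```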
CMD delete files older than X days in specific folders
Nobody answered, but I found the solution to the problem myself. I'm posting the snippet in case it is useful to somebody.
for /d %%a in (C:\AwesomeSoftware\*) do (
for /d %%x in (%%a\Data\*) do (forfiles /p "%%x\Temp" /s /m *.* /D -7 /C "cmd /c del @path")
)
Deleting files after 7 days not working
From man find:
-atime n
File was last accessed n*24 hours ago. When find figures out how many
24-hour periods ago the file was last accessed, any fractional part is
ignored, so to match -atime +1, a file has to have been accessed at least
two days ago.
This means that your command actually deletes files that were accessed 8 or more days ago.
Since the time now is
$ date
Tue Nov 7 10:29:29 PST 2017
find will require files to be older than:
$ date -d 'now - 8 days'
Mon Oct 30 11:29:05 PDT 2017
In other words, leaving some files from Oct 30 is expected and documented behavior.
To account for find rounding down, simply use -mtime +6 instead.
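The rounding can be checked directly (assuming GNU touch and find; a file a bit over 7 days old still counts as exactly 7 full 24-hour periods):

```shell
# 169 hours is about 7.04 days; the fractional part is dropped, so the
# file's age is "7 periods": -mtime +7 misses it, -mtime +6 matches it
touch -d '169 hours ago' f.tmp
find . -name f.tmp -mtime +7   # no output
find . -name f.tmp -mtime +6   # prints ./f.tmp
```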
How to use libiconv correctly in C so that it does not report "Arg list too long"?
According to the man page, iconv fails with E2BIG when there's insufficient room at *outbuf. The "Arg list too long" message is simply strerror(E2BIG): iconv reuses that errno value to signal a full output buffer, so the message has nothing to do with command-line arguments. Also note that the fifth argument must be a count of bytes, not characters.
#include <iconv.h>
#include <stdlib.h>
#include <string.h>
#include <wchar.h>

wchar_t *utf8_to_wstr(const char *src) {
    iconv_t cd = iconv_open("wchar_t", "UTF-8");
    if (cd == (iconv_t)-1)
        goto Error1;
    size_t src_len = strlen(src);                 // in bytes, excludes NUL
    size_t dst_len = sizeof(wchar_t) * src_len;   // in bytes, excludes NUL
    size_t dst_size = dst_len + sizeof(wchar_t);  // in bytes, including NUL
    wchar_t *buf = malloc(dst_size);
    if (!buf)
        goto Error2;
    // iconv() advances the pointers it is given, so use separate
    // variables: &(char *)src is invalid C (a cast is not an lvalue)
    char *inp = (char *)src;
    char *outp = (char *)buf;
    if (iconv(cd, &inp, &src_len, &outp, &dst_len) == (size_t)-1)
        goto Error3;
    *(wchar_t *)outp = L'\0';   // outp now points past the converted data
    iconv_close(cd);
    return buf;
Error3:
    free(buf);
Error2:
    iconv_close(cd);
Error1:
    return NULL;
}
Batch file to delete files older than N days
Enjoy:
forfiles -p "C:\what\ever" -s -m *.* -d <number of days> -c "cmd /c del @path"
See the forfiles documentation for more details.
For more goodies, refer to An A-Z Index of the Windows XP command line.
If you don't have forfiles installed on your machine, copy it from any Windows Server 2003 to your Windows XP machine at %WinDir%\system32\. This is possible since the EXE is fully compatible between Windows Server 2003 and Windows XP.
Later versions of Windows and Windows Server have it installed by default.
For Windows 7 and newer (including Windows 10):
The syntax has changed a little. Therefore the updated command is:
forfiles /p "C:\what\ever" /s /m *.* /D -<number of days> /C "cmd /c del @path"
Remove files older than x days in a loop without using find unix
You can create a temporary file with a timestamp of X days ago and then, inside your for loop, compare each file's timestamp against that file to decide whether to delete it or not.
n=60 # number of days
ref="/tmp/$$.tmp" # temporary filename
touch -t $(date -d "-$n days" '+%Y%m%d%H%M.%S') "$ref"
for f in /path/tomy/directory/*.rpt; do
[[ -f $f && $ref -nt $f ]] && echo rm "$f"
done
rm "$ref"
Once you're satisfied with the output, remove the echo before rm.
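A quick sanity check of the reference-file trick (bash's -nt test compares modification times; assumes GNU date and touch):

```shell
ref=$(mktemp)                                  # reference file, aged 60 days
touch -t "$(date -d '-60 days' '+%Y%m%d%H%M.%S')" "$ref"

old=$(mktemp); touch -d '90 days ago' "$old"   # older than the reference
new=$(mktemp)                                  # newer than the reference

[[ $ref -nt $old ]] && echo "old: would delete"
[[ $new -nt $ref ]] && echo "new: would keep"

rm -f "$ref" "$old" "$new"
```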
Delete all directories and their files older than 60 days regardless of whether the directory is empty or not
Your issue occurs because find is deleting its search base directories before it has finished iterating over them, so your rm -rf is invoked on already-deleted entries. This is easily fixed by adding the -depth option, which makes find process a directory's contents before the directory itself.
Also, you should really end the rm options with a double dash (--), to prevent filenames produced by find from being interpreted as option arguments by rm.
find /mnt/games/codes/ -depth -mtime '+60' -type d -exec rm -rf -- {} \;
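A small reproduction (hypothetical paths; assumes GNU touch and find) showing the fixed command removing an aged directory and its contents while leaving newer ones alone:

```shell
d=$(mktemp -d)                    # stand-in for /mnt/games/codes/
mkdir "$d/old" "$d/new"
touch "$d/old/save.dat"           # a file inside the old directory
touch -d '90 days ago' "$d/old"   # age the directory itself

# -depth visits contents before the directory, so nothing is
# deleted out from under the traversal
find "$d" -depth -mtime '+60' -type d -exec rm -rf -- {} \;

ls "$d"   # only "new" remains
```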