Using find with -exec {}, is there a way to count the total?
This works:
$ find . -name "*.php" -exec chmod 755 {} \; -exec /bin/echo {} \; | wc -l
You have to include the second `-exec /bin/echo` for this to work: `chmod` itself produces no output, so without the echo, `wc` would have no lines to count.
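A single `-print` after the `-exec` achieves the same thing without spawning `/bin/echo` for every file. A minimal sketch (the demo directory and file names are illustrative):

```shell
# Set up a throwaway directory with two .php files and one .txt file.
mkdir -p /tmp/findcount-demo && cd /tmp/findcount-demo
touch a.php b.php c.txt

# -print emits each name for which the preceding chmod succeeded,
# so wc -l counts the files that were actually changed.
find . -name "*.php" -exec chmod 755 {} \; -print | wc -l
```

`-print` is evaluated only when the `-exec` before it returns success, so the count reflects successful `chmod` calls, not merely matches.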
count how many files generated with find command
You can pass the output of find to the "word count" program `wc`:
find . -type f -name "script.log" -exec grep "finished without error" {} \;| wc -l
`wc -l` writes the number of lines to stdout.
Use find, wc, and sed to count lines
You can also get the nice per-file formatting from `wc` with:
wc `find -name '*.m'`
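The backtick substitution splits on whitespace, so it breaks on filenames containing spaces. A sketch of a space-safe alternative using `-exec … {} +` (the demo directory and files are illustrative):

```shell
# Throwaway directory with a space in one filename.
mkdir -p /tmp/wc-demo && cd /tmp/wc-demo
printf 'one line\n' > 'my file.m'
printf 'two\nlines\n' > other.m

# find hands the filenames to wc directly, so spaces are preserved;
# wc still prints its per-file lines plus a final "total" line.
find . -name '*.m' -exec wc -l {} +
```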
What is the best way to count find results?
Try this instead (requires `find`'s `-printf` support):
find <expr> -type f -printf '.' | wc -c
It is more reliable than counting lines (a filename can itself contain newlines) and faster, since it writes a single character per file. Note that this uses `find`'s built-in `-printf`, not an external command.
Let's benchmark a bit:
$ ls -1
a
e
l
ll.sh
r
t
y
z
My snippet's benchmark:
$ time find -type f -printf '.' | wc -c
8
real 0m0.004s
user 0m0.000s
sys 0m0.007s
With full lines:
$ time find -type f | wc -l
8
real 0m0.006s
user 0m0.003s
sys 0m0.000s
So my solution is faster =) (the important figure is the `real` line).
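The reliability claim can be demonstrated too: a filename containing a newline inflates the line count but not the dot count. A small sketch, assuming GNU find (the demo directory is illustrative):

```shell
# One ordinary file and one whose name embeds a newline.
mkdir -p /tmp/newline-demo && cd /tmp/newline-demo
touch ordinary.txt
touch 'weird
name.txt'

find . -type f | wc -l               # 3: the embedded newline adds a line
find . -type f -printf '.' | wc -c   # 2: exactly one dot per file
```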
total size of group of files selected with 'find'
The `du` command reports disk usage. Example usage for your specific case:
find rapidly_shrinking_drive/ -name "offender1" -mtime -1 -print0 | du --files0-from=- -hc | tail -n1
(Previously I wrote `du -hs`, but on my machine that appears to disregard `find`'s input and instead summarises the size of the cwd.)
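Here is a reproducible sketch of the same pipeline, assuming GNU `du` for `--files0-from` (the demo directory and file names are illustrative):

```shell
# Throwaway directory: only 'offender1' should be measured.
mkdir -p /tmp/du-demo && cd /tmp/du-demo
printf '12345' > offender1
printf 'irrelevant' > other

# du reads the NUL-delimited list from stdin; -c appends a grand total,
# and tail keeps just that "total" line.
find . -name 'offender1' -print0 | du --files0-from=- -hc | tail -n1
```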
Count and remove old files using Unix find
You could just use bash within find:
find "$DIR_TO_CLEAN" -mtime +$DAYS_TO_SAVE -exec bash -c 'printf "Total: %d\n" $#; rm "$@"' _ {} +
Of course this can call `bash -c …` more than once if the number of files found exceeds MAX_ARGS, and it can also overestimate the count if `rm` fails. But solving those problems gets messy:
find "$DIR_TO_CLEAN" -mtime +$DAYS_TO_SAVE -exec bash -c 'count=0; for f; do rm "$f" && (( count++ )); done; printf "Total: %d\n" "$count"' _ {} +
This next solution avoids the MAX_ARGS limit by avoiding find altogether. If you need it to be recursive, you'll have to use recursive globbing, which is only available in newer shells (`globstar` is a bash 4 feature):
shopt -s globstar
# Assume DAYS_TO_SAVE has been reformatted into the timestamp that touch -t expects. (Exercise for the reader.)
touch -m -t "$DAYS_TO_SAVE" referencefile
count=0
for file in "$DIR_TO_CLEAN/"**/*; do
if [[ referencefile -nt "$file" ]]; then
rm "$file" && (( count++ ))
fi
done
printf 'Total: %d\n' "$count"
Here's an approach using `find` with `-printf` (strictly POSIX find doesn't have `-printf`, but in that case you can `-exec` the standalone `printf` utility instead, as in the second variant):
find "$DIR_TO_CLEAN" -type f -mtime "+$DAYS_TO_SAVE" -exec rm {} \; -printf '.' | wc -c
find "$DIR_TO_CLEAN" -type f -mtime "+$DAYS_TO_SAVE" -exec rm {} \; -exec printf '.' \; | wc -c
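The `-printf` variant can be checked on disposable files: the dot is printed only when the preceding `-exec rm` succeeded, so the dot count is the number of files actually deleted. A sketch with the `-mtime` test dropped for brevity, assuming GNU find (demo files are illustrative):

```shell
mkdir -p /tmp/clean-demo && cd /tmp/clean-demo
touch old1 old2

# One dot is printed per successful rm; wc -c turns that into a count.
find . -type f -exec rm {} \; -printf '.' | wc -c
```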
How can I count all the lines of code in a directory recursively?
Try:
find . -name '*.php' | xargs wc -l
or (when file names include special characters such as spaces)
find . -name '*.php' | sed 's/.*/"&"/' | xargs wc -l
The SLOCCount tool may help as well.
It will give an accurate source lines of code count for whatever
hierarchy you point it at, as well as some additional stats.
Sorted output:
find . -name '*.php' | xargs wc -l | sort -nr
calculate total used disk space by files older than 180 days using find
@PeterT is right. Almost all of these answers invoke a command (`du`) once per file, which is resource-intensive, slow, and unnecessary. The simplest and fastest way is this:
find . -type f -mtime +180 -printf '%s\n' | awk '{total += $1} END {print total/1024}'
(`%s` is the file size in bytes, so the total is reported in KiB.)
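A quick sanity check with files of known size (GNU find for `-printf '%s'`, which prints the apparent size in bytes; the `-mtime` filter is dropped so the freshly created demo files qualify):

```shell
mkdir -p /tmp/size-demo && cd /tmp/size-demo
# Two files with apparent sizes of 1024 and 2048 bytes.
head -c 1024 /dev/zero > a.bin
head -c 2048 /dev/zero > b.bin

# Sum the byte sizes and convert to KiB: (1024 + 2048) / 1024 = 3
find . -type f -printf '%s\n' | awk '{total += $1} END {print total/1024}'
```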
Linux: Using find and grep to find a keyword in files and count occurrences
You can get more efficiency if you avoid `-exec`, which forks a new process for every matched file. `xargs` is a better choice here. So I would do something like this:
find myFolder -type f -print0 | xargs -0 grep KEYWORD
find myFolder -type f -print0 | xargs -0 grep KEYWORD | wc -l
The last one should be OK, at least with GNU find. The `-print0` and `-0` ensure that filenames with spaces in them are handled correctly. Note that `grep -r` implies recursive grepping, but as you're only supplying plain filenames in each invocation, it is redundant here.
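One caveat: piping plain `grep` into `wc -l` counts matching *lines*, not occurrences, so a line containing the keyword twice counts once. `grep -o` emits each match on its own line, which makes `wc -l` count true occurrences. A sketch (the demo file is illustrative):

```shell
mkdir -p /tmp/grep-demo && cd /tmp/grep-demo
printf 'KEYWORD here and KEYWORD again\nno match\nKEYWORD\n' > log.txt

# Matching lines: 2 (the double-hit line counts once).
find . -type f -print0 | xargs -0 grep KEYWORD | wc -l

# Occurrences: 3 (grep -o prints one line per match).
find . -type f -print0 | xargs -0 grep -o KEYWORD | wc -l
```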