How to pipe output from grep to cp?
grep -l -r "TWL" --exclude='*.csv*' . | xargs cp -t ~/data/lidar/tmp-ajp2/
Explanation:
- grep -l: output matching file names only
- xargs: convert the file list on standard input into command-line arguments
- cp -t: specify the target directory (and avoid needing placeholders)
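The pipeline can be exercised in a throwaway sandbox. A minimal sketch: "TWL" is the question's pattern, while src/, dest/ and the file names below are made up for illustration, and GNU grep/cp are assumed (grep -r defaulting to the given directory, cp -t).

```shell
# Sandbox demo of grep -l | xargs cp -t (GNU tools assumed).
# "TWL" is the question's pattern; src/, dest/ and the file names are made up.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p src dest
printf 'TWL data\n' > src/match.txt    # contains the pattern
printf 'other\n'    > src/nomatch.txt  # does not match
printf 'TWL\n'      > src/skip.csv     # matches, but excluded by --exclude
grep -l -r "TWL" --exclude='*.csv*' src | xargs cp -t dest/
ls dest/   # prints: match.txt
```

Only the file that matches and is not excluded ends up in dest/.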
Is it possible to pipe the results of FIND to a COPY command CP?
Good question!
- Why can't you just use a | pipe? Isn't that what it's for?
You can pipe, of course; xargs exists for these cases:
find . -iname "*.SomeExt" | xargs cp -t Destination_Directory/
(-t lets the target directory come first, since xargs appends the file names at the end.)
- Why does everyone recommend -exec?
The -exec action is good because it provides more control of exactly what you are executing. Whenever you pipe there may be problems with corner cases: file names containing spaces or newlines, etc.
- How do I know when to use -exec over a pipe?
It is really up to you and there can be many cases. I would use -exec whenever the action to perform is simple. I am not a very good friend of xargs; I tend to prefer an approach in which the find output is provided to a while loop, such as:
while IFS= read -r result
do
# do things with "$result"
done < <(find ...)
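A concrete instance of the while-read pattern above, assuming bash (the `< <(...)` process substitution is a bashism): copy every .txt file found under a made-up src/ directory into dest/, including one with a space in its name.

```shell
#!/usr/bin/env bash
# while-read over find output; safe for names with spaces.
# src/, dest/ and the file names are illustrative.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p src/sub dest
echo hi > 'src/plain.txt'
echo hi > 'src/sub/with space.txt'
while IFS= read -r result
do
    cp "$result" dest/        # act on each found path individually
done < <(find src -type f -name '*.txt')
```

Both files, spaced name included, land in dest/.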
pass output as an argument for cp in bash
It would be:
cp `ls -SF | grep -v / | head -5` Directory
assuming that the pipeline is correct. The backticks substitute the output of the enclosed commands into the command line.
You can also run your own tests:
cp `echo a b c` Directory
will copy all of a, b, and c into Directory.
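Worth knowing before relying on this: the substituted output is word-split by the shell, so a file name containing a space breaks the backtick approach. A small sketch (temp directory and file names made up; the target is hidden so a plain `ls` does not list it):

```shell
# Caveat demo: command substitution is word-split, so spaced names break it.
tmp=$(mktemp -d)
cd "$tmp"
mkdir .dest                  # hidden, so plain `ls` does not list it
echo hi > 'two words.txt'
# cp receives "two" and "words.txt" as two bogus source arguments:
cp `ls` .dest/ 2>/dev/null || echo "copy failed"
```

This is why the -exec, xargs -0, and while-read approaches elsewhere on this page are preferred for arbitrary file names.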
how to send output of 'ls -l' to cp command using pipes?
Output to file "test.txt":
ls -l > test.txt
Summary of redirection operators:
> Save output to a file.
>> Append output to a file.
< Read input from a file.
2> Redirect error messages.
| Send the output from one program as input to another program.
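Each operator in the summary can be tried out in a temp directory; the file names here are arbitrary.

```shell
# One line per operator from the summary above.
tmp=$(mktemp -d)
cd "$tmp"
echo "first"  >  out.txt             # > creates/truncates the file
echo "second" >> out.txt             # >> appends to it
wc -l < out.txt                      # < feeds the file to stdin; prints: 2
ls no-such-file 2> err.txt || true   # 2> captures the error message
cat err.txt                          # the ls error text ended up here
```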
How to copy files found with grep
Try this:
find . -type f -exec grep -q '^beginString' {} \; -exec cp -t /home/user/DestinationFolder {} +
or
grep -lir '^beginString' . | xargs cp -t /home/user/DestinationFolder
But if you want to keep directory structure, you could:
grep -lir '^beginString' . | tar -T - -c | tar -xpC /home/user/DestinationFolder
or if, like myself, you prefer to be sure about the kind of file you store (only regular files, no symlinks), you could:
find . -type f -exec grep -l '^beginString' {} + | tar -T - -c |
tar -xpC /home/user/DestinationFolder
and if your file names could contain spaces and/or special characters, use null-terminated strings: grep -Z emits them, and xargs --null or tar --null -T consumes them:
grep -Zlir '^beginString' . | xargs --null cp -t /home/user/DestinationFolder
or
find . -type f -exec grep -lZ '^beginString' {} + | tar --null -T - -c |
tar -xpC /home/user/DestinationFolder
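As a sanity check, the null-delimited grep-to-cp variant above can be run in a sandbox; the directories and file names below are placeholders, and -Z, --null, and -t are GNU extensions.

```shell
# Null-delimited copy: safe for a file name containing a space (GNU tools).
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p src dest
printf 'beginString here\n' > 'src/has space.txt'   # matches the pattern
printf 'nothing\n'          > src/plain.txt         # does not match
grep -Zlir '^beginString' src | xargs --null cp -t dest/
```

Only the matching file is copied, spaced name intact.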
How to copy files found with grep on OSX
Less efficient than cp -t, but this works:
grep -lr "foo" --include=*.txt * 2>/dev/null |
xargs -I{} cp "{}" /path/to/targetdir
Explanation:
For filenames | xargs cp -t destination, xargs changes the incoming filenames into this format:
cp -t destination filename1 ... filenameN
i.e., it only runs cp once (actually, once for every few thousand filenames -- xargs breaks the command line up if it would be too long for the shell).
For filenames | xargs -I{} cp "{}" destination, on the other hand, xargs changes the incoming filenames into this format:
cp "filename1" destination
...
cp "filenameN" destination
i.e., it runs cp once for each incoming filename, which is much slower. For a large number (e.g., >10k) of very small (e.g., <10k) files, I'd guess it could even be thousands of times slower. But it does work :)
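The batching-versus-per-item difference is easy to see with echo standing in for cp:

```shell
# Plain xargs batches all arguments into one command invocation,
# while -I{} runs the command once per input line.
printf 'a\nb\nc\n' | xargs echo          # one invocation:   a b c
printf 'a\nb\nc\n' | xargs -I{} echo {}  # three invocations, one line each
```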
PS: Another popular technique is to use find's -exec action instead of xargs, e.g., https://stackoverflow.com/a/5241677/1563960
perform an operation for *each* item listed by grep
If I understand your specification, you want:
grep --null -l '<pattern>' directory/*.extension1 | \
xargs -n 1 -0 -I{} bash -c 'rm "$1" "${1%.*}.extension2"' -- {}
This is essentially the same as what @triplee's comment describes, except that it's newline-safe.
What's going on here?
grep with --null will return output delimited with nulls instead of newlines. Since file names can have newlines in them, delimiting with newlines makes it impossible to parse the output of grep safely, but null is not a valid character in a file name and thus makes a nice delimiter.
xargs will take a stream of whitespace-delimited items and execute a given command, passing as many of those items as possible (each one as a separate parameter) to the given command (or to echo if no command is given). Thus if you said:
printf 'one\ntwo three \nfour\n' | xargs echo
xargs would execute echo one two three four (by default xargs splits its input on any unquoted whitespace, not just newlines). This is not safe for file names because, again, file names might contain embedded spaces or newlines.
The -0 switch to xargs changes it from splitting on whitespace to splitting on the null character. This makes it match the output we got from grep --null and makes it safe for processing a list of file names.
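One way to see the effect: count how many arguments xargs actually hands to a command. In the null-delimited stream below, the first item deliberately contains a newline, and with -0 it still arrives as a single argument (the sh -c trick just prints the argument count).

```shell
# Two null-terminated items, one with an embedded newline.
# With -0, xargs delivers exactly two arguments to the command.
printf 'with\nnewline\0plain\0' | xargs -0 sh -c 'echo "$#"' sh   # prints: 2
```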
Normally xargs simply appends the input to the end of a command. The -I switch to xargs changes this to substituting the input for the specified replacement string. To get the idea try this experiment:
printf 'one\ntwo three \nfour\n' | xargs -I{} echo foo {} bar
And note the difference from the earlier printf | xargs command.
In the case of my solution the command I execute is bash, to which I pass -c. The -c switch causes bash to execute the commands in the following argument (and then terminate) instead of starting an interactive shell. The next block, 'rm "$1" "${1%.*}.extension2"', is the first argument to -c and is the script which will be executed by bash. Any arguments following the script argument to -c are assigned as the arguments to the script. Thus, if I were to say:
bash -c 'echo $0' "Hello, world"
Then Hello, world would be assigned to $0 (the first argument to the script) and inside the script I could echo it back.
Since $0 is normally reserved for the script name, I pass a dummy value (in this case --) as the first argument and, then, in place of the second argument I write {}, which is the replacement string I specified for xargs. This will be replaced by xargs with each file name parsed from grep's output before bash is executed.
The mini shell script might look complicated but it's rather trivial. First, the entire script is single-quoted to prevent the calling shell from interpreting it. Inside the script I invoke rm and pass it two file names to remove: the $1 argument, which was the file name passed when the replacement string was substituted above, and ${1%.*}.extension2. This latter is a parameter substitution on the $1 variable. The important part is %.*, where % says "match from the end of the variable and remove the shortest string matching the pattern", and the pattern .* is a single period followed by anything.
This effectively strips the extension, if any, from the file name. You can observe the effect yourself:
foo='my file.txt'
bar='this.is.a.file.txt'
baz='no extension'
printf '%s\n' "${foo%.*}" "${bar%.*}" "${baz%.*}"
Since the extension has been stripped, I concatenate the desired alternate extension .extension2 to the stripped file name to obtain the alternate file name.
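Putting the whole solution together in a sandbox: below, .ext1 and .ext2 stand in for the question's extension1/extension2, and 'pattern' for its search pattern; only the matching file and its sibling are removed. (GNU xargs may warn that -n 1 and -I are mutually exclusive; the -I behavior wins, matching the intent.)

```shell
# End-to-end sketch of the grep --null | xargs -0 | bash -c solution.
# .ext1/.ext2 and 'pattern' are placeholders for the question's names.
tmp=$(mktemp -d)
cd "$tmp"
mkdir d
printf 'pattern\n' > d/a.ext1   # matches
touch d/a.ext2 d/b.ext1 d/b.ext2
grep --null -l 'pattern' d/*.ext1 | \
  xargs -n 1 -0 -I{} bash -c 'rm "$1" "${1%.*}.ext2"' -- {}
```

Afterwards, a.ext1 and a.ext2 are gone while the non-matching b pair survives.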