How to Copy The Top 10 Most Recent Files from One Directory to Another

How to copy the top 10 most recent files from one directory to another?

ls -lt *.htm | head -10 | awk '{print "cp " $9 " ../Test/"$9}' | sh
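
Note that this parses the output of ls, so it breaks on file names containing spaces. A variant that tolerates spaces (a sketch, assuming file names without embedded newlines and the same ../Test/ destination):

ls -t *.htm | head -n 10 | while IFS= read -r f; do cp -- "$f" ../Test/; done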

Batch file to copy 2 most recent files into another folder

Your description is a bit upside down.

It's a bit clumsy to first copy files to the destination and then delete all except the two new ones.

  • You could determine the two newest files and remember their names (in an array),
  • delete all files in the destination,
  • and finally copy the remembered files.

Impertinently stealing npocmaka's code as a base:

@Echo off
setlocal enableDelayedExpansion
set "source=c:\source_folder"
set "target=c:\target"

PushD "%source%"
set "counter=0"
rem remember a copy command for each of the two newest *.bak files
for /f "tokens=* delims=" %%A in ('dir "*.bak" /b /o:-d /t:w') do (
    set /a counter+=1
    Set Copy[!counter!]=copy "%%~fA" "%target%\"
    if !counter! equ 2 goto :break
)
:break
rem clear the destination, then run the remembered copy commands
Del /Q "%target%\*"
For /L %%C in (1,1,%counter%) Do !Copy[%%C]!
PopD
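
For comparison, the same three steps in bash (a sketch only; the paths are placeholders and it assumes file names without embedded newlines):

#!/usr/bin/env bash
# remember the two newest .bak files, clear the target, then copy them
source_dir=/path/to/source    # hypothetical path
target_dir=/path/to/target    # hypothetical path

mapfile -t newest < <(ls -t "$source_dir"/*.bak | head -n 2)   # remember the names
rm -f "$target_dir"/*                                          # delete all files in dest
cp -- "${newest[@]}" "$target_dir"/                            # copy the remembered files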

Copy the last 5 updated files in HDFS (Hadoop) to a target folder in bash

The command below, combining awk and xargs, solves the problem: awk keeps the date, time, and path columns from the recursive listing, sort -nr puts the newest entries first, head -5 keeps the five most recent, cut strips the date and time fields to recover the path, and xargs passes each path to hadoop fs -cp.

hadoop fs -ls -R /dev/hadoop/hdata/test | awk '{print $6, $7, $8}' | sort -nr | head -5 | cut -d" " -f3- | xargs -I{} hadoop fs -cp '{}' /dev/hadoop/hdata/test1

Complete command validation:

#input files available
[devuser@DATANODEUK03 HADOOP]$ hadoop fs -ls /dev/hadoop/hdata/test
Found 8 items
-rw-r----- 3 devuser uk1-dna-haas_dev 0 2020-08-06 04:51 /dev/hadoop/hdata/test/test1.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 04:56 /dev/hadoop/hdata/test/test10.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:00 /dev/hadoop/hdata/test/test15.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:01 /dev/hadoop/hdata/test/test16.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:04 /dev/hadoop/hdata/test/test17.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:04 /dev/hadoop/hdata/test/test18.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 0 2020-08-06 04:51 /dev/hadoop/hdata/test/test2.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 0 2020-08-06 04:53 /dev/hadoop/hdata/test/test3.txt

#command to get latest 5 files in a folder recursively & copy into another folder
hadoop fs -ls -R /dev/hadoop/hdata/test | awk '{print $6, $7, $8}' | sort -nr | head -5 | cut -d" " -f3- | xargs -I{} hadoop fs -cp '{}' /dev/hadoop/hdata/test1

#copy validation in HDFS
[devuser@DATANODEUK03 HADOOP]$ hadoop fs -ls /dev/hadoop/hdata/test1
Found 5 items
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:05 /dev/hadoop/hdata/test1/test10.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:05 /dev/hadoop/hdata/test1/test15.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:04 /dev/hadoop/hdata/test1/test16.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:04 /dev/hadoop/hdata/test1/test17.txt
-rw-r----- 3 devuser uk1-dna-haas_dev 21 2020-08-06 05:04 /dev/hadoop/hdata/test1/test18.txt
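
Since HDFS prints ISO-formatted timestamps, a plain reverse lexical sort orders the lines just as well as sort -nr; a minimal variant (a sketch, not from the original answer, and like the original it assumes paths without spaces, since awk's $8 would truncate them):

hadoop fs -ls -R /dev/hadoop/hdata/test | awk '{print $6, $7, $8}' | sort -r | head -5 | cut -d" " -f3- | xargs -I{} hadoop fs -cp '{}' /dev/hadoop/hdata/test1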

How to copy the latest 2 directories from one directory to another?

How about this:

echo "$(/bin/ls -d1tr */ | tail -2)" | xargs -I % sh -c 'cp -R "%" tmp'

Sort the output of ls by timestamp, one entry per line, listing only directories within the current directory, in reverse order (oldest first). Take the last 2 output lines, i.e. the 2 newest. Pass the result to xargs, substituting each name into the shell command that copies the directory into another directory called tmp.

Update: added some quotes to fix problems with file names containing spaces.
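
If GNU find and coreutils (8.25 or later, for tail -z and cut -z) are available, a null-delimited variant avoids parsing ls output entirely and survives any file name; a sketch, keeping the tmp destination from above:

find . -mindepth 1 -maxdepth 1 -type d -printf '%T@ %p\0' | sort -z -n | tail -z -n 2 | cut -z -d' ' -f2- | xargs -0 -I{} cp -R {} tmp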

Copy a file into a specific subfolder

Try this (simpler) approach:

@echo off

FOR /f "usebackq tokens=*" %%f IN (`DIR /s /b "subfolder 2"`) DO (
    ECHO Copying test.txt from . to "%%f"
    COPY test.txt "%%f"
)

More help:

  • FOR /?, which explains the use of FOR.
  • tokens=* puts the complete path returned by DIR into one variable (which is needed because of the space in the folder name).

Output:

D:\TEMP>HereIambatchfile.bat
Copying test.txt from . to "D:\TEMP\mainfolder 1\subfolder 2"
1 file(s) copied.
Copying test.txt from . to "D:\TEMP\mainfolder 2\subfolder 2"
1 file(s) copied.

D:\TEMP>
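
The same task in a Unix shell, for comparison (a sketch; like the batch version it assumes test.txt sits in the current directory):

# copy test.txt into every directory named "subfolder 2" below the current directory
find . -type d -name 'subfolder 2' -exec cp test.txt {} \;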

