Get Files with Wget and SFTP

To copy keys, I find that ssh-copy-id user@machine does everything you need, provided you already have a key on your machine.
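
For example, a minimal sketch (user@machine and the key path are placeholders):

# generate a key pair first, if you don't already have one
ssh-keygen -t ed25519

# append your public key to ~/.ssh/authorized_keys on the remote machine
ssh-copy-id -i ~/.ssh/id_ed25519.pub user@machine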

I also use scp quite a bit: scp user@machine:filespec whereto. It uses the same encryption and authentication mechanisms as sftp.
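
For instance (host and paths are placeholders):

# copy a single remote file into the current directory
scp user@machine:/var/log/syslog .

# copy a directory tree recursively
scp -r user@machine:/home/user/configs /tmp/configs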

Hope this helps.

What is the command to get the .listing file from an SFTP server using cURL?

This is what I found at http://curl.haxx.se/docs/faq.html#How_do_I_get_an_FTP_directory_li

If you end the FTP URL you request with a slash, libcurl will provide you with a directory listing of that given directory. You can also set CURLOPT_CUSTOMREQUEST to alter what exact listing command libcurl would use to list the files.

The follow-up question that tends to follow the previous one is how a program is supposed to parse the directory listing: how does it know what's a file, what's a directory, and what's a symlink? The harsh reality is that FTP provides no such fine and easy-to-parse output. The output format FTP servers use in response to LIST commands is entirely at the server's own liking, and the NLST output doesn't reveal any types and in many cases doesn't even include all the directory entries. Also, both LIST and NLST tend to hide unix-style hidden files (those that start with a dot) by default, so you need to do "LIST -a" or similar to see them.
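
On the command line, the same ideas look roughly like this (ftp://ftp.example.com/pub/ is a placeholder; curl's -X option swaps in a custom listing command, the CLI counterpart of CURLOPT_CUSTOMREQUEST):

# default LIST output for the directory (note the trailing slash)
curl ftp://ftp.example.com/pub/

# names only, via NLST
curl --list-only ftp://ftp.example.com/pub/

# send "LIST -a" instead of LIST to include unix-style hidden files
curl -X "LIST -a" ftp://ftp.example.com/pub/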

curl to SFTP and list files in directory

Yes: end the URL with a trailing slash to indicate to curl that it is in fact a directory. Like this:

curl -k sftp://url.test.com/test_folder/ --user "username:password"
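
Once you know a file's name from the listing, the same URL without the trailing slash downloads it (the file name and credentials are placeholders). Note that -k skips the host-key check against known_hosts, so only use it against hosts you trust:

curl -k sftp://url.test.com/test_folder/somefile.csv --user "username:password" -o somefile.csv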

Using wget to recursively fetch a directory with arbitrary files in it

You have to pass the -np/--no-parent option to wget (in addition to -r/--recursive, of course); otherwise it will follow the link in the directory index back to the parent directory. So the command would look like this:

wget --recursive --no-parent http://example.com/configs/.vim/

To avoid downloading the auto-generated index.html files, use the -R/--reject option:

wget -r -np -R "index.html*" http://example.com/configs/.vim/
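
If you also want to keep wget from recreating the hostname and parent directories locally, the -nH/--no-host-directories and --cut-dirs options can be added (same placeholder URL; --cut-dirs=2 strips the configs/.vim path components):

wget -r -np -R "index.html*" -nH --cut-dirs=2 http://example.com/configs/.vim/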

Download multiple files in different SFTP directories to local

You could use a process pool to open multiple sftp connections and download in parallel. For example,

from paramiko import SSHClient
from multiprocessing import Pool

def download_init(host):
    # runs once in each pool process: open one SSH/SFTP connection per process
    global client, sftp
    client = SSHClient()
    client.load_system_host_keys()
    client.connect(host)
    sftp = client.open_sftp()

def download_close(dummy):
    # cleanup "task": closes the connection owned by whichever process runs it
    client.close()

def download_worker(params):
    local_path, remote_path = params
    sftp.get(remote_path, local_path)

list_of_local_and_remote_files = [
    ["/client/files/folder1/img11", "/IMAGES/folder1/img11"],
]

def downloader(files):
    pool_size = 8
    pool = Pool(pool_size, initializer=download_init,
                initargs=["sftpserver.example.com"])
    result = pool.map(download_worker, files, chunksize=10)
    # one close "task" per process so each connection gets shut down
    pool.map(download_close, range(pool_size))

if __name__ == "__main__":
    downloader(list_of_local_and_remote_files)

It's unfortunate that Pool doesn't have a finalizer to undo what was set in the initializer. It's not usually necessary: the exiting process is cleanup enough. In the example I just wrote a separate worker function that cleans things up. With one work item per pool process, each process gets exactly one close call.

Is it possible to download a file from one server (SFTP) and upload it to my server using PHP?

Go here and download what you need: http://phpseclib.sourceforge.net/

UPDATE

  • FOR SFTP

Then in your script:

<?php
include('Net/SFTP.php');

// fetch the CSV from the download site to local disk first
$url = 'http://www.downloadsite.com';
$fileToDownload = "yourCSV.csv";
$cmd = "wget -q \"$url\" -O $fileToDownload";
exec($cmd);

// then upload it to the target server over SFTP
$sftp = new Net_SFTP('www.uploadsite.com');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}

echo $sftp->pwd() . "\r\n";
$sftp->put('remote.file.csv', 'yourCSV.csv', NET_SFTP_LOCAL_FILE);
print_r($sftp->nlist());
?>

If you need to connect to a second server for download:

$sftp2 = new Net_SFTP('www.serverFromWhichToDownload.com');
if (!$sftp2->login('username', 'password')) {
    exit('Login Failed');
}

echo $sftp2->pwd() . "\r\n";
// Net_SFTP::get() takes the remote file first, then the local destination
$sftp2->get('remoteFileName.csv', 'localFileName.csv');
print_r($sftp2->nlist());

Read the docs for further help and examples: http://phpseclib.sourceforge.net/documentation/net.html#net_sftp_get

To log what your connection is doing if it fails, etc., use this:

include('Net/SSH2.php');
define('NET_SSH2_LOGGING', true);
$ssh = new Net_SSH2('www.domain.tld');
$ssh->login('username','password');
echo $ssh->getLog();

  • FOR FTP upload:

// connection parameters are assumed to be set elsewhere:
// $ftp_server, $ftp_user_name, $ftp_user_pass
$file = 'somefile.txt';
$remote_file = 'readme.txt';

$conn_id = ftp_connect($ftp_server);
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
    echo "successfully uploaded $file\n";
} else {
    echo "There was a problem while uploading $file\n";
}

ftp_close($conn_id);

