How to Transfer Files from Local Machine to Virtual Folders Within an Azure Container (Using Blobxfer)


Looking at the error, it is clear that the script is trying to create a container (pictures, in your case), but the container already exists.

I quickly looked at the code on GitHub, and it seems there's an argument called createcontainer whose default value is True. Pass this argument in your script and set it to False; the script should then stop trying to create the container.

Simple way to copy files onto Windows-based Azure container instance

As far as I know, the only way to copy files to a Windows-based Azure Container Instance is by running a command inside the container; the AzCopy command works for this. You cannot do anything on the ACI host agent itself. Additionally, ACI is better suited to quick tests and short-lived runs of images.
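As a concrete illustration, the AzCopy route might look like the sketch below. It only builds and prints the command you would run inside the container (for example via az container exec); the storage account name, container name, SAS token, and destination path are all placeholders, and AzCopy v10 must be available in the container image.

```shell
# Placeholders: replace with your own storage account, container, and SAS token.
SRC="https://mystorageacct.blob.core.windows.net/mycontainer?<SAS-token>"
DEST="C:/data"   # target path inside the Windows container

# The command you would run inside the container (shown, not executed here):
echo azcopy copy "$SRC" "$DEST" --recursive
```

The SAS token avoids having to place the storage account key inside the container.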

If you want to copy files and have more control over your containers, I recommend AKS. You can run Windows-based containers in AKS with Windows nodes, and the Azure Files volume is also available for Windows containers. See the information here.
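For reference, a minimal sketch of what mounting an Azure Files share into a Windows pod on AKS might look like, using the in-tree azureFile volume type. The pod name, image, share name, and the azure-secret secret (which holds the storage account name and key) are all placeholders:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: winapp
spec:
  nodeSelector:
    kubernetes.io/os: windows   # schedule onto a Windows node
  containers:
  - name: winapp
    image: mcr.microsoft.com/windows/servercore:ltsc2019
    volumeMounts:
    - name: data
      mountPath: "C:\\data"
  volumes:
  - name: data
    azureFile:
      secretName: azure-secret   # holds the storage account name and key
      shareName: myshare
      readOnly: false
```

Files dropped into the share then appear under C:\data inside the container without any copy step.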

The fastest method to move tens of thousands of small files to Azure Storage container

Please try upgrading your blobxfer to 0.9.9.6. There were a few bugs with zero-byte files that were recently fixed.

Regarding your question about blobxfer, you should open issues directly on the GitHub project page rather than on Stack Overflow; the maintainers will have an easier time seeing, answering, and fixing issues specific to that tool. If you are still encountering problems with blobxfer after upgrading to 0.9.9.6, post an issue there.

In general, as shellter has pointed out, for thousands of small files you should archive them first, then upload the archive, to achieve greater throughput.
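The archive-first approach can be sketched as follows. The directory and file names are placeholders, and the upload command in the comment assumes the blobxfer 0.9.x command-line form mentioned above:

```shell
# Create a handful of small demo files (stand-ins for your tens of thousands).
mkdir -p /tmp/demo_small_files
for i in 1 2 3; do echo "data $i" > "/tmp/demo_small_files/file$i.txt"; done

# Bundle them into a single compressed archive: one large blob transfers far
# faster than many tiny ones.
tar -czf /tmp/small_files.tar.gz -C /tmp demo_small_files

# Then upload the single archive instead, e.g.:
#   blobxfer <storageaccount> <container> /tmp/small_files.tar.gz --upload
ls -l /tmp/small_files.tar.gz
```

You trade an unpack step on the other end for a dramatic reduction in per-file request overhead.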

Move files from Azure storage to a local directory

The issue was resolved using the following approach:

  1. Delete any mapped drive already using the target drive letter with the net use <drive_letter> /delete command. This ensures the drive is detached in case it was left mapped from a previous run.
  2. Map the drive again using the net use command.
  3. Copy all the files using robocopy.
  4. Delete all the files from the share using the del command.
  5. Disconnect the drive with the net use <drive_letter> /delete command.
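Put together, the steps above might look like the following batch sketch. The drive letter, share path, local destination, and storage-account credentials are all placeholders, and the net use syntax shown assumes the drive is an Azure Files share:

```bat
:: 1. Detach the drive letter in case it is still mapped from a previous run
net use Z: /delete

:: 2. Map the Azure Files share to Z: (placeholder account, share, and key)
net use Z: \\mystorageacct.file.core.windows.net\myshare /user:AZURE\mystorageacct <storage_account_key>

:: 3. Copy everything to the local directory
robocopy Z:\ C:\local\incoming /E

:: 4. Remove the copied files from the share
del /Q Z:\*.*

:: 5. Disconnect the drive
net use Z: /delete
```

The first net use /delete is expected to fail harmlessly when no mapping exists; it is there purely to guarantee a clean starting state.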

Upload multiple files in Azure Blob Storage from Linux

Thank you for your interest. There are two options to upload files to Azure Blob storage from Linux:

  1. Setup and use XPlatCLI by following the steps below:

    • Install the CLI using the installer for your platform from http://azure.microsoft.com/en-us/documentation/articles/xplat-cli/
    • Open a Terminal window and connect to your Azure subscription by either downloading and using a publish settings file or by logging in to Azure using an organizational account (find instructions here)
    • Create an environment variable AZURE_STORAGE_CONNECTION_STRING and set its value (you will need your account name and account key): "DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
    • Upload a file into Azure blob storage by using the following command: azure storage blob upload [file] [container] [blob]
  2. Use one of the third-party web-based Azure Storage explorers, such as CloudPortam: http://www.cloudportam.com/.
    You can find the full list of Azure Storage explorers here: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/03/11/windows-azure-storage-explorers-2014.aspx.
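The CLI flow from option 1 can be sketched as below. The account name, account key, file name, and container name are placeholders, and the upload command is only printed here (running it for real requires the xplat CLI to be installed and the connection string to contain valid credentials):

```shell
# Set the connection string the CLI reads its credentials from
# (placeholder account name and key).
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"

# With the variable set, this is the upload command you would run
# (shown, not executed: file ./photo.jpg into container "pictures"):
echo azure storage blob upload ./photo.jpg pictures photo.jpg
```

Because the connection string is read from the environment, you do not need to pass the account name and key on every command.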


