Load local file(s) to Azure Blob Storage using Azure Data Factory

Caio Moreno
3 min read · Feb 1, 2019


Data definitely does not live only in the cloud. As we data geeks know, most of the time we have to load files from local machines into the cloud. Let's not get bogged down in questions like where the data should live in the first place.

Just assume that, for whatever reason, you have to load files from a local computer to Azure.

Azure Data Factory Pipeline

In this article, I will demo how to use Azure Data Factory to load file(s) from my local Windows computer to Azure Blob Storage.

There is no magic; just follow these steps:

  1. Create a folder on your local Windows computer or Windows Server.
  2. Move the files you want to upload into this folder; in my case I created a folder called C:\InputFilesToADF.
  3. Create an Azure Data Factory pipeline and configure the Copy Data activity (a sketch of the resulting pipeline definition follows this list).
  4. Make sure you install the Microsoft Azure Data Factory Integration Runtime.
  5. Run the pipeline and see your files loaded into Azure Blob Storage or Azure Data Lake Storage.
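
To make step 3 more concrete, here is a minimal sketch of what the pipeline definition behind the Copy Data activity can look like if you open the code view in the ADF portal. The pipeline, activity, and dataset names (CopyLocalFilesToBlob, LocalFolderDataset, BlobContainerDataset) are placeholders I use for illustration; the Copy Data tool generates its own names and the corresponding datasets for you.

```json
{
  "name": "CopyLocalFilesToBlob",
  "properties": {
    "activities": [
      {
        "name": "CopyFilesToBlob",
        "type": "Copy",
        "inputs": [
          { "referenceName": "LocalFolderDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "BlobContainerDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "FileSystemSource", "recursive": true },
          "sink": { "type": "BlobSink" }
        }
      }
    ]
  }
}
```

LocalFolderDataset would be a File System dataset pointing at the local folder, and BlobContainerDataset an Azure Blob Storage dataset pointing at the target container.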

This is just a demo; in this case I used the username and password option, but I recommend using the Azure Key Vault option instead.
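
If you do use the Key Vault option, the only part of the source linked service that changes is the password property, which points at a secret instead of holding the value itself. A rough sketch, where AzureKeyVaultLinkedService and local-file-share-password are placeholder names of my own:

```json
"password": {
  "type": "AzureKeyVaultSecret",
  "store": {
    "referenceName": "AzureKeyVaultLinkedService",
    "type": "LinkedServiceReference"
  },
  "secretName": "local-file-share-password"
}
```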

Install the Microsoft Azure Data Factory Integration Runtime; this software creates a secure connection between your local computer and Azure.
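
Once the integration runtime is installed and registered against your data factory, the linked service that points at the local folder uses it through the connectVia property. A minimal sketch, assuming a runtime named SelfHostedIR and the folder from step 2 (the user and password values are placeholders; in the demo I typed them in, but this is also where the Key Vault reference above would go):

```json
{
  "name": "LocalFileSystemLinkedService",
  "properties": {
    "type": "FileServer",
    "typeProperties": {
      "host": "C:\\InputFilesToADF",
      "userId": "<domain>\\<user>",
      "password": { "type": "SecureString", "value": "<password>" }
    },
    "connectVia": {
      "referenceName": "SelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```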

Test performed

I created six empty files and one 32 MB file.

Executed the test pipeline.

The results

As we can see, it took less than two minutes to load my files.

Files loaded inside Azure Blob Storage.

My internet connection

Special thanks to Raza Ali and Manish Viswambharan.
