Copy Data from Azure Data Lake Gen 1 to Gen 2 using Azure Data Factory

Today, we will discuss how to use Azure Data Factory to copy data from Azure Data Lake Gen1 into Data Lake Gen2.

Prerequisites:

  1. Azure Data Lake Gen1

  2. Data Factory

  3. Azure Storage account with Data Lake Gen2 (hierarchical namespace) enabled

Refer to the steps below to copy your data:

  1. Open the Azure portal, go to your Data Factory, and click Author & Monitor.

  2. This opens the Data Integration application in a new tab. Click Copy Data.

  3. Fill in the details on the Properties page and click Next.

  4. Select the source, then create a new connection for Data Lake Gen1.

  5. Fill out the required parameters. You can use either a service principal or a managed identity for Azure resources to authenticate to Azure Data Lake Storage Gen1 (see the linked-service sketch after this list).

  6. Test the connection, then click Create.

  7. Select your folder in the dataset.

  8. Select the destination and create a new connection for ADLS Gen2.

  9. Fill out the required parameters. As with the source, you can use either a service principal or a managed identity for Azure resources to authenticate to Azure Data Lake Storage Gen2.

  10. Test the connection, then click Create.

  11. Select the destination folder in the dataset, then click Next.

  12. Review and verify the summary, then click Next (the dataset and pipeline sketch after this list shows the equivalent definitions in code).

  13. The pipeline will be executed right away.

  14. Monitor the status.

  15. Navigate to the Monitor tab to see the progress.

  16. You can also drill into the activity runs to view more details (see the monitoring sketch after this list).
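
If you prefer to script these steps instead of clicking through the Copy Data wizard, the same connections (steps 4-6 and 8-10) can be created with the azure-mgmt-datafactory Python SDK. This is a minimal sketch, assuming an existing data factory and a service principal that has access to both accounts; every subscription ID, resource name, application ID, and secret below is a placeholder.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource,
    AzureDataLakeStoreLinkedService,  # ADLS Gen1
    AzureBlobFSLinkedService,         # ADLS Gen2
    SecureString,
)

subscription_id = "<subscription-id>"   # placeholder
resource_group = "<resource-group>"     # placeholder
factory_name = "<data-factory-name>"    # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Source connection: ADLS Gen1, authenticated with a service principal (step 5).
gen1_ls = LinkedServiceResource(
    properties=AzureDataLakeStoreLinkedService(
        data_lake_store_uri="https://<gen1-account>.azuredatalakestore.net/webhdfs/v1",
        service_principal_id="<app-id>",
        service_principal_key=SecureString(value="<app-secret>"),
        tenant="<tenant-id>",
    )
)
client.linked_services.create_or_update(
    resource_group, factory_name, "ADLSGen1Source", gen1_ls
)

# Sink connection: ADLS Gen2 (step 9), again via service principal.
gen2_ls = LinkedServiceResource(
    properties=AzureBlobFSLinkedService(
        url="https://<gen2-account>.dfs.core.windows.net",
        service_principal_id="<app-id>",
        service_principal_key=SecureString(value="<app-secret>"),
        tenant="<tenant-id>",
    )
)
client.linked_services.create_or_update(
    resource_group, factory_name, "ADLSGen2Sink", gen2_ls
)
```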
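
Steps 7-12 of the wizard amount to two datasets plus one copy activity. Here is a hedged sketch of the equivalent definitions, reusing `client`, `resource_group`, `factory_name`, and the linked-service names from the previous block; the dataset names and folder paths are hypothetical.

```python
from azure.mgmt.datafactory.models import (
    DatasetResource, BinaryDataset, DatasetReference, LinkedServiceReference,
    AzureDataLakeStoreLocation, AzureBlobFSLocation,
    PipelineResource, CopyActivity, BinarySource, BinarySink,
    AzureDataLakeStoreReadSettings, AzureBlobFSWriteSettings,
)

# Source dataset: a folder in the Gen1 account (step 7).
source_ds = DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="ADLSGen1Source"
        ),
        location=AzureDataLakeStoreLocation(folder_path="myfolder"),
    )
)
client.datasets.create_or_update(resource_group, factory_name, "Gen1Folder", source_ds)

# Sink dataset: the target file system and folder in the Gen2 account (step 11).
sink_ds = DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="ADLSGen2Sink"
        ),
        location=AzureBlobFSLocation(file_system="myfilesystem", folder_path="myfolder"),
    )
)
client.datasets.create_or_update(resource_group, factory_name, "Gen2Folder", sink_ds)

# One pipeline with a single binary copy activity (step 12).
copy_activity = CopyActivity(
    name="CopyGen1ToGen2",
    inputs=[DatasetReference(type="DatasetReference", reference_name="Gen1Folder")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="Gen2Folder")],
    source=BinarySource(store_settings=AzureDataLakeStoreReadSettings(recursive=True)),
    sink=BinarySink(store_settings=AzureBlobFSWriteSettings()),
)
pipeline = PipelineResource(activities=[copy_activity])
client.pipelines.create_or_update(resource_group, factory_name, "CopyGen1ToGen2", pipeline)
```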
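
Steps 13-16 (running the pipeline and watching it in the Monitor tab) can likewise be done in code. A short sketch, again reusing the names defined above:

```python
from datetime import datetime, timedelta, timezone
from azure.mgmt.datafactory.models import RunFilterParameters

# Trigger the pipeline (step 13) and check its status (steps 14-15).
run = client.pipelines.create_run(resource_group, factory_name, "CopyGen1ToGen2")
status = client.pipeline_runs.get(resource_group, factory_name, run.run_id)
print(f"Pipeline run {run.run_id}: {status.status}")

# Inspect the individual activity runs for more detail (step 16).
now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=1),
    last_updated_before=now + timedelta(hours=1),
)
activity_runs = client.activity_runs.query_by_pipeline_run(
    resource_group, factory_name, run.run_id, filters
)
for activity in activity_runs.value:
    print(activity.activity_name, activity.status)
```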



As always, let us know if you have questions or comments using the comments section below!
