Azure Data Factory

Azure Data Factory makes it much easier to create data-driven workflows in the cloud. Using Data Factory, users can build data pipelines that move and transform data, and then run those pipelines on a specified schedule (hourly, daily, weekly, etc.). Pipelines can run on a recurring schedule or be executed once on demand.

Data Factory also helps transform raw data into meaningful, usable insights. Analysts, data scientists, and business decision-makers can all benefit from ADF. In response to the rising demand for ADF applications, companies are now adopting ADF at scale.

This has greatly increased the demand for skilled and certified professionals on the platform.

Azure Data Factory (V2)

Introduction & Cloud Services readiness

1. What is ADF? What are the new features in ADF V2?

2. Create an Azure subscription and add the cloud services below to the account:

i. Resource Group

ii. Blob Storage

iii. Data Lake Storage Gen2

iv. Azure SQL Database

v. Azure Key Vault

vi. Data Factory

vii. On-premises SQL Server (or any on-premises data source)


Control Flow

3. Explore Home screen options & data movement from Blob Storage to SQL with the Copy Data tool
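Behind the scenes, the Copy Data tool generates a pipeline definition in JSON. A minimal sketch of that structure, built here as a Python dict (the pipeline, activity, and dataset names are illustrative placeholders, not names from the course):

```python
import json

# Rough sketch of the pipeline JSON the Copy Data tool produces.
# "BlobInputDataset" and "SqlOutputDataset" are placeholder names.
pipeline = {
    "name": "CopyBlobToSqlPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",
                "inputs": [{"referenceName": "BlobInputDataset",
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlOutputDataset",
                             "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

The same shape appears whether the pipeline is built with the wizard or authored by hand, which is why exploring the generated JSON is a good way to learn the authoring model.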

4. High-level concepts in ADF and different ways to work with ADF

5. Types of connectors (on-premises and cloud data stores) and what a dataset is

6. Create a new pipeline from the Author menu to transfer data from SQL to Data Lake

7. How to debug and publish the pipeline to the ADF service

8. Types of integration runtime; install and configure the self-hosted integration runtime

9. Create a pipeline to copy data from an on-premises SQL table to Blob Storage

10. How to automate pipeline runs using triggers, and the different trigger types
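A schedule trigger is itself a small JSON document that references the pipeline it fires. A hedged sketch of one as a Python dict (the trigger and pipeline names, start time, and cadence are illustrative assumptions):

```python
# Illustrative schedule-trigger definition; names and times are placeholders.
trigger = {
    "name": "DailyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",   # e.g. Minute, Hour, Day, Week, Month
                "interval": 1,        # every 1 <frequency>
                "startTime": "2024-01-01T06:00:00Z",
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyBlobToSqlPipeline",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}

print(trigger["properties"]["typeProperties"]["recurrence"])
```

Tumbling-window and event-based triggers follow the same pattern with a different `type` and `typeProperties`.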

11. Monitor the triggers and understand the trigger output

12. How to parameterize pipelines, linked services, and datasets

13. Types of variables and how to use variables inside a pipeline

14. Explore Activities

1. Copy Data  2. Delete  3. Execute Pipeline  4. Lookup  5. Stored Procedure  6. Web  7. Webhook  8. Wait  9. Filter  10. If Condition  11. Switch  12. ForEach  13. Until
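Control-flow activities such as If Condition typically consume the output of an earlier activity through an expression. A hedged sketch of that wiring (the activity names `LookupRowCount`, `CopyNewRows`, and `WaitAndExit` are placeholders):

```python
# Illustrative If Condition activity: branches on a Lookup activity's output.
if_activity = {
    "name": "IfRowsFound",
    "type": "IfCondition",
    "typeProperties": {
        "expression": {
            # ADF expression: true when the lookup returned at least one row.
            "value": "@greater(activity('LookupRowCount').output.count, 0)",
            "type": "Expression",
        },
        "ifTrueActivities": [{"name": "CopyNewRows", "type": "Copy"}],
        "ifFalseActivities": [{"name": "WaitAndExit", "type": "Wait"}],
    },
}

print(if_activity["typeProperties"]["expression"]["value"])
```

ForEach and Until use the same expression mechanism, iterating over an array (e.g. `@activity('LookupTables').output.value`) or looping until a condition holds.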

15. Explore incremental data refresh options in ADF
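The most common incremental-refresh option is the high-watermark pattern: store the last loaded timestamp, then copy only rows modified after it. In ADF this is usually wired up with a Lookup activity feeding the Copy activity's source query; the core logic can be sketched in plain Python (table and column names are illustrative):

```python
from datetime import datetime

def incremental_query(table: str, watermark_column: str,
                      last_watermark: datetime) -> str:
    """Build the source query for a high-watermark incremental load.

    In ADF the stored watermark would be read with a Lookup activity and
    this query passed to the Copy activity source; here the same logic
    is shown in plain Python for clarity.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_column} > '{last_watermark.isoformat()}'"
    )

# Placeholder table/column: load everything modified since the last run.
q = incremental_query("dbo.Orders", "LastModified", datetime(2024, 1, 1))
print(q)
```

After a successful copy, the watermark is updated (often with a Stored Procedure activity) so the next run starts where this one ended.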


Data flows

16. Data flows

1. Join  2. Conditional Split  3. Exists  4. Union  5. Lookup  6. Derived Column  7. Aggregate  8. Pivot  9. Unpivot  10. Filter  11. Sort  12. Surrogate Key
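The transformations listed above behave much like familiar dataframe or SQL operations. A rough stdlib-only sketch of three of them (Conditional Split, Derived Column, Aggregate) on rows represented as dicts; the sample data is invented for illustration:

```python
from collections import defaultdict

orders = [
    {"id": 1, "region": "EU", "amount": 120},
    {"id": 2, "region": "US", "amount": 80},
    {"id": 3, "region": "EU", "amount": 50},
]

# Conditional Split: route rows down different streams by a condition.
high = [r for r in orders if r["amount"] >= 100]
low = [r for r in orders if r["amount"] < 100]

# Derived Column: add a computed column to every row.
with_tax = [{**r, "amount_inc_tax": round(r["amount"] * 1.2, 2)}
            for r in orders]

# Aggregate: group by a key and sum a measure.
totals = defaultdict(int)
for r in orders:
    totals[r["region"]] += r["amount"]

print(high, dict(totals))
```

In a mapping data flow the same steps are drawn on a canvas and executed at scale on Spark rather than in local Python, but the semantics are analogous.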

17. Referencing data flows in the control flow with the Data Flow activity


Data Wrangling

18. What is data wrangling and how to work with Power Query (part of Power BI)

19. How to consume Power Query in the control flow using a pipeline activity

Lift & Shift SSIS packages

20. What is lift & shift; configure the Azure-SSIS integration runtime and create SSISDB in Azure SQL

21. Create a project folder in Azure SSIS and deploy the packages to Azure

22. Execute a package manually and schedule a package for automatic runs

23. Verify the new pipeline created automatically with the Execute SSIS Package activity and trigger



Git Integration

24. Git integration

25. Create branches for ADF pipeline deployments (dev, test, pre-prod, and prod)

26. Move code from branch to branch

27. Raise pull requests and continue ADF development

28. Automate deployments from branch to branch using shell scripts

