Azure Data Factory Training - Table of Contents
1. Introduction to Azure Data Factory
What is Azure Data Factory (ADF)?
Importance of ADF in data integration
Key features and capabilities
Comparison with other ETL tools
2. ADF Architecture and Components
Overview of ADF architecture
Data pipeline components:
    Pipelines
    Activities
    Datasets
    Linked Services
    Triggers
    Integration Runtimes
3. Getting Started with ADF
Setting up an Azure account
Creating your first Data Factory
Navigating the ADF Studio UI (launched from the Azure Portal)
4. Data Movement in ADF
Copy Activity: Moving data from sources to sinks
Data Flow vs Copy Activity
Supported data sources and destinations
Handling incremental data loads
5. Data Transformation Using Data Flows
Introduction to Data Flows
Mapping Data Flows vs Wrangling Data Flows (Power Query)
Data transformation operations (joins, aggregations, filters, etc.)
Debugging and monitoring data flows
6. Orchestration with Pipelines and Activities
Control flow activities (ForEach, If Condition, Switch)
Execution activities (Stored Procedure, Lookup, Web)
Handling dependencies between activities
Error handling and retries
7. Triggers and Scheduling in ADF
Types of triggers: Schedule, Tumbling Window, and Event-based (plus manual/on-demand runs)
Scheduling best practices
Managing trigger failures
8. Monitoring and Debugging in ADF
Using ADF Monitoring tools
Debugging pipelines and activities
Logging and alerts
9. Security and Access Control
Role-Based Access Control (RBAC)
Managed Identity and Key Vault Integration
Secure data transfer with private endpoints
10. Integration with Other Azure Services
Azure Data Lake and Blob Storage
Azure SQL Database and Synapse Analytics
Azure Functions and Logic Apps
Power BI Integration
11. CI/CD and DevOps in ADF
Using Git integration in ADF
Deployment with Azure DevOps
Best practices for version control and releases
12. Real-World Use Cases and Hands-on Projects
Building a data pipeline from Azure Blob Storage to Azure SQL Database
Data transformation using Mapping Data Flows
Automating ETL workflows with event triggers
End-to-end project: Ingest, transform, and visualize data