S*****m
About Candidate
Highly skilled Azure Data Engineer with 5 years of hands-on experience building large-scale ETL/ELT pipelines, cloud data platforms,
batch-processing systems and Lakehouse architectures. Expert in SQL, Python, PySpark, Databricks, Azure Data Factory, data modelling,
data warehousing, Delta Lake and CI/CD with Azure DevOps. Strong record of delivering cost-optimized, high-performing solutions aligned
with UAE/GCC standards.
Work & Experience
• Delivered end-to-end ELT pipelines using Azure Data Factory, Databricks (PySpark) and ADLS Gen2 for enterprise-scale ingestion and transformation.
• Built optimized batch pipelines with incremental loads, CDC, SCD Type-2, watermarking and deduplication for stable data refresh cycles.
• Implemented a full Delta Lake Lakehouse (Bronze, Silver, Gold) with ACID transactions, schema evolution and time-travel capabilities.
• Designed star-schema dimensional models (facts, SCD dimensions) and analytics-ready data marts in Azure SQL.
• Improved PySpark performance using broadcast joins, caching, partition pruning, Z-Ordering and cluster tuning (35–40% faster jobs).
• Automated CI/CD using GitHub and Azure DevOps for ADF JSON, Databricks notebooks, SQL scripts and configuration deployments.
• Built a data quality framework (null checks, schema rules, duplicate detection, PK/FK validation) integrated into ADF and Databricks pipelines.
• Integrated ADF with APIs, Azure SQL, ADLS and multi-format ingestion (CSV, Parquet, JSON, Avro).
• Developed monitoring dashboards using Azure Monitor and Log Analytics, ensuring 99.5% pipeline availability and SLA compliance.
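To illustrate the SCD Type-2 pattern mentioned above: the pipelines themselves would use PySpark and Delta Lake MERGE, but the core versioning logic can be sketched in plain Python. Field names (`effective_from`, `effective_to`, `is_current`) are illustrative assumptions, not the actual schema:

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key, tracked, today=None):
    """SCD Type-2 sketch: expire changed dimension rows, append new versions.

    dim_rows: existing dimension records (dicts with effective_from,
              effective_to, is_current); incoming: source records;
    tracked:  attributes whose change triggers a new version.
    """
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for row in incoming:
        cur = current.get(row[key])
        if cur and all(cur[a] == row[a] for a in tracked):
            continue  # no tracked attribute changed: keep current version
        if cur:  # attribute changed: close the open version
            cur["effective_to"], cur["is_current"] = today, False
        out.append({**row, "effective_from": today,
                    "effective_to": None, "is_current": True})
    return out
```

In a real Delta Lake implementation this maps onto a single `MERGE INTO` with a `whenMatchedUpdate` to expire rows and an insert of the new version; the sketch only shows the history-preserving behavior.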
• Designed and automated end-to-end SQL and Python ETL workflows, reducing manual effort and improving data accuracy.
• Built interactive Power BI dashboards for Finance, Sales and Operations, reducing manual Excel reporting by 40%.
• Optimized SQL queries through indexing, join refactoring and execution plan tuning, improving performance by 50%.
• Implemented strong data quality controls: audits, cleansing rules, validation checks and reconciliation logic.
• Used Excel (Pivot Tables, Power Query, XLOOKUP) for data validation, reconciliation and quick business reporting.
• Collaborated with business teams to translate requirements into automated SQL/Python-based reporting solutions.
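The validation-check bullets above can be sketched as a small batch validator. This is an illustrative assumption about shape, not the actual framework: it runs null, duplicate and foreign-key checks over a list of records and collects violations:

```python
def run_quality_checks(rows, pk, required, fk=None, fk_values=None):
    """Collect data-quality violations for a batch of records.

    rows:      list of dicts to validate
    pk:        primary-key field (flag duplicates)
    required:  fields that must be non-null / non-empty
    fk/fk_values: optional foreign-key field and its valid reference set
    """
    issues = {"nulls": [], "dupes": [], "fk_orphans": []}
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                issues["nulls"].append((i, col))  # missing required value
        k = row.get(pk)
        if k in seen:
            issues["dupes"].append((i, k))  # repeated primary key
        seen.add(k)
        if fk and fk_values is not None and row.get(fk) not in fk_values:
            issues["fk_orphans"].append((i, row.get(fk)))  # no parent row
    return issues
```

In an ADF/Databricks pipeline the same checks would typically run as PySpark aggregations with results written to an audit table; a non-empty `issues` dict would fail or quarantine the batch.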
• Created automated Excel reports using Pivot Tables, Power Query and advanced formulas for daily/weekly business tracking.
• Cleaned, transformed and validated raw data in Excel to ensure accuracy before reporting.
• Built interactive dashboards and performance summaries for Finance, Sales and Operations teams.
• Performed data reconciliation using VLOOKUP/XLOOKUP, Pivot Tables and conditional checks.
• Analyzed trends, variances and business KPIs using Excel charts, slicers and dynamic tables.


