- Proven experience in Databricks (Delta Lake, Spark, PySpark, SQL).

- Strong expertise in data integration from multiple sources (SQL Server, MongoDB, InfluxDB).

- Hands-on experience with ETL/ELT pipeline development and orchestration (e.g., Airflow, ADF, or equivalent).

- Proficiency in data modeling, data warehousing concepts, and performance tuning.

- Familiarity with real-time data streaming (Kafka, Azure Event Hubs, or similar).

- Strong programming skills in Python and SQL.

- Experience with cloud platforms (Azure).

- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Familiarity with Change Data Capture (CDC).