Job Summary:
We are seeking a Senior Data Engineer with over 9 years of industry experience to architect, build, and manage advanced data pipelines and platforms in a cloud environment.
The ideal candidate will demonstrate hands-on expertise in ETL workflows, possess deep proficiency with Snowflake, and have a proven track record of implementing robust data governance and security practices.
Key Responsibilities:
- Architect, develop, and optimize scalable data pipelines for ingestion, processing, and storage of large data volumes from diverse sources.
- Design, implement, and maintain end-to-end ELT/ETL workflows for seamless data integration using orchestration tools.
- Manage data warehouse and data lake solutions, emphasizing Snowflake architecture, performance tuning, and cost optimization.
- Implement, monitor, and enforce best practices for data governance, including data cataloguing, lineage, stewardship, access control, audit, and compliance.
- Design and enforce rigorous data security policies and procedures for sensitive and regulated datasets, ensuring alignment with internal policies and external regulations.
- Ensure high data quality, integrity, and reliability by deploying validation, monitoring, and alerting frameworks.
- Collaborate with data science, analytics, DevOps, and business teams to fulfil analytics, BI, and operational data requirements.
- Drive automation, workflow optimization, and continuous process improvement in data engineering operations.
- Mentor junior engineers, review code, and champion engineering excellence and compliance within the team.
- Keep up to date on the latest trends in cloud data platforms, governance, and security; recommend and implement new technologies as appropriate.
Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 9+ years of full-lifecycle data engineering experience within enterprise environments.
- Deep expertise in ETL pipeline development, orchestration (Airflow, Prefect, Dagster), and troubleshooting.
- Advanced proficiency in Snowflake (data modelling, security, optimization, role-based access, etc.).
- Strong foundation in data governance, cataloguing, lineage, access management, and regulatory compliance best practices.
- Demonstrated experience designing and implementing data security architectures (encryption, tokenization, masking, monitoring).
- Proficiency with SQL and at least one programming language (Python preferred).
- Exposure to distributed systems, cloud-native data platforms (AWS/GCP/Azure), and scalable storage technologies.
- Excellent analytical, problem-solving, and stakeholder communication skills.
- Experience mentoring or leading teams is an advantage.
Preferred Qualifications:
- Certifications in Snowflake, cloud platforms, or data security are a plus.
- Familiarity with additional data warehouse/lake technologies (Databricks, BigQuery, Redshift, Lake Formation, etc.).
- Experience working in regulated domains (finance, healthcare, insurance) is desirable.