Job Description
<p><b>Job Description:</b></p>
<p>We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. In this role, you will lead the design and implementation of scalable, high-performance data pipelines using Snowflake and dbt, define architectural best practices, and drive data transformation at scale.</p>
<p>You'll work closely with clients to translate business needs into robust data solutions and play a key role in mentoring junior engineers, enforcing standards, and delivering production-grade data platforms.</p>
<p><b>Key Responsibilities:</b></p>
<ul>
<li>Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.</li>
<li>Design layered data models (e.g., staging, intermediate, and mart layers / medallion architecture) aligned with dbt best practices.</li>
<li>Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.</li>
<li>Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring.</li>
<li>Apply advanced dbt capabilities, including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.</li>
<li>Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.</li>
<li>Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.</li>
<li>Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.</li>
<li>Mentor junior engineers, lead architectural and code reviews, and help establish reusable frameworks and standards.</li>
<li>Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.</li>
</ul>
<p><b>Technical Skills:</b></p>
<p>Cloud Data Warehouse &amp; Transformation Stack:</p>
<ul>
<li>Expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management.</li>
<li>Experience in dbt development: modular model design, macros, tests, documentation, and version control using Git.</li>
</ul>
<p>Data Modelling and Architecture:</p>
<ul>
<li>Dimensional modelling (star/snowflake schemas) and slowly changing dimensions.</li>
<li>Knowledge of modern data warehousing principles.</li>
<li>Experience implementing Medallion Architecture (Bronze/Silver/Gold layers).</li>
<li>Experience working with Parquet, JSON, CSV, and other data formats.</li>
</ul>
<p>Programming Languages:</p>
<ul>
<li>Python: data transformation, notebook development, and automation.</li>
<li>SQL: strong grasp of SQL for querying and performance tuning.</li>
<li>Jinja (nice to have): exposure to Jinja for advanced dbt development.</li>
</ul>
<p>Data Engineering &amp; Analytical Skills:</p>
<ul>
<li>ETL/ELT pipeline design and optimization.</li>
<li>Exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have).</li>
<li>Exposure to data quality and validation frameworks.</li>
</ul>
<p>Security &amp; Governance:</p>
<ul>
<li>Experience implementing data quality checks using dbt tests.</li>
<li>Data encryption, secure key management, and security best practices for Snowflake and dbt.</li>
</ul>
<p><b>Soft Skills &amp; Leadership:</b></p>
<ul>
<li>Ability to thrive in client-facing roles with competing and changing priorities and fast-paced delivery cycles.</li>
</ul>
<p>(ref:hirist.tech)</p>