Job Description
<p><b>Position :</b> Senior Data Engineer, Snowflake & DBT, Hyderabad.<br/><br/>
<b>Department :</b> Information Technology | <b>Role :</b> Full-time | <b>Experience :</b> 5 to 10 Years | <b>Number of Positions :</b> 2 | <b>Location :</b> Hyderabad.<br/><br/>
<b>Skillset :</b><br/><br/>
Snowflake, DBT, Azure Data Factory, Python Programming, SQL, Medallion Architecture, Azure Data Lake, Jinja, Project Ownership, Team Handling, Performance Optimization, DBT Cloud, ETL/ELT, Security and Governance, HIPAA, RBAC, Excellent English communication skills.<br/><br/>
<b>About Us :</b><br/><br/>
We provide companies with innovative technology solutions for everyday business problems.<br/><br/>
Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes.<br/><br/>
Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively.<br/><br/>
With deep Big Four consulting experience in business transformation and efficient processes, we are a game-changer in any operations strategy.<br/><br/>
<b>Job Description :</b><br/><br/>
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team.<br/><br/>
In this role, you will lead the design and implementation of scalable, high-performance data pipelines using Snowflake and dbt, define architectural best practices, and drive data transformation at scale.<br/><br/>
You'll work closely with clients to translate business needs into robust data solutions and play a key role in mentoring junior engineers, enforcing standards, and delivering production-grade data platforms.<br/><br/>
<b>Key Responsibilities :</b><br/><br/>
- Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.<br/><br/>
- Design layered data models (e.g., staging, intermediate, and mart layers / medallion architecture) aligned with dbt best practices.<br/><br/>
- Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.<br/><br/>
- Optimize Snowflake for performance and cost : warehouse sizing, clustering, materializations, query profiling, and credit monitoring.<br/><br/>
- Apply advanced dbt capabilities including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.<br/><br/>
- Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.<br/><br/>
- Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.<br/><br/>
- Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.<br/><br/>
- Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.<br/><br/>
- Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.<br/><br/>
<b>Required Qualifications :</b><br/><br/>
5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.<br/><br/>
<b>Technical Skills :</b><br/><br/>
<b>Cloud Data Warehouse & Transformation Stack :</b><br/><br/>
- Expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management.<br/><br/>
- Experience in dbt development : modular model design, macros, tests, documentation, and version control using Git.<br/><br/>
<b>Orchestration and Integration :</b><br/><br/>
- Proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory.<br/><br/>
- Comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.<br/><br/>
<b>Data Modelling and Architecture :</b><br/><br/>
- Dimensional modelling (Star/Snowflake schemas) and slowly changing dimensions.<br/><br/>
- Knowledge of modern data warehousing principles.<br/><br/>
- Experience implementing Medallion Architecture (Bronze/Silver/Gold layers).<br/><br/>
- Experience working with Parquet, JSON, CSV, and other data formats.<br/><br/>
<b>Programming Languages :</b><br/><br/>
- Python : for data transformation, notebook development, and automation.<br/><br/>
- SQL : strong grasp of SQL for querying and performance tuning.<br/><br/>
- Jinja (nice to have) : exposure to Jinja for advanced dbt development.<br/><br/>
<b>Data Engineering & Analytical Skills :</b><br/><br/>
- ETL/ELT pipeline design and optimization.<br/><br/>
- Exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have).<br/><br/>
- Exposure to data quality and validation frameworks.<br/><br/>
<b>Security & Governance :</b><br/><br/>
- Experience implementing data quality checks using dbt tests.<br/><br/>
- Data encryption, secure key management, and security best practices for Snowflake and dbt.<br/><br/>
<b>Soft Skills & Leadership :</b><br/><br/>
- Ability to thrive in client-facing roles with competing/changing priorities and fast-paced delivery cycles.<br/><br/>
- Stakeholder Communication : collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.<br/><br/>
- Project Ownership : end-to-end delivery including design, implementation, and monitoring.<br/><br/>
- Mentorship : guide junior engineers, establish best practices, and build new skills in the team.<br/><br/>
- Agile Practices : work in sprints and participate in scrum ceremonies and story estimation.<br/><br/>
<b>Education :</b><br/><br/>
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.<br/><br/>
- Certifications such as Snowflake SnowPro Advanced and dbt Certified Developer are a plus.<br/><br/>
<b>Additional Information :</b><br/><br/>
<b>Why Join Us ?</b><br/><br/>
- Work on modern data architecture at scale using cutting-edge Azure technologies.<br/><br/>
- Collaborate with a dynamic and talented team of data professionals.<br/><br/>
- Grow your leadership skills through hands-on project ownership and mentorship.<br/><br/>
- Competitive compensation, flexible work options, and a continuous-learning culture.<br/><br/>
This is a work-from-office role in Hyderabad (5 days/week).<br/><br/>
There are 2 rounds in the interview process.<br/><br/>
<b>Required Qualification :</b><br/><br/>
- Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.) in IT/CS/E&CE, or MCA.<br/><br/>
- This role is with a fast-growing analytics, business intelligence, and IT automation company.</p>