Job Description

Job Title: Lead Data Engineer – Snowflake & dbt
Location: Bangalore (Hybrid)
Company: Lumen Data
About Lumen Data
Lumen Data is a leading consulting firm specializing in enterprise data management, analytics, and cloud modernization.
We partner with clients across industries to deliver trusted data solutions that drive innovation and business growth.
Role Overview
We are seeking an experienced Lead Data Engineer with strong expertise in Snowflake and dbt to design and implement scalable, high-performing data solutions.
The ideal candidate will have proven experience in building modern data pipelines, mentoring engineering teams, and driving best practices for cloud-based data platforms.
Key Responsibilities
Lead the design, development, and optimization of data pipelines and data warehouse solutions on Snowflake.
Work with Snowflake features including:
table types, storage integrations, internal and external stages, streams, tasks, views, materialized views, Time Travel, Fail-safe, micro-partitions, virtual warehouses, RBAC, the COPY command, file formats (CSV, JSON, XML), Snowpipe, and stored procedures (SQL, JavaScript, or Python). A short SQL sketch follows this list.
Develop and maintain dbt models for data transformation, testing, and documentation.
dbt: create, run, and build models; schedule jobs; manage dependencies; use macros; apply Jinja templating (optional).
Collaborate with cross-functional teams — data architects, analysts, and business stakeholders — to deliver robust data solutions.
Ensure high standards of data quality, governance, and security across pipelines and platforms.
Use Airflow (or similar orchestration tools) to schedule and monitor workflows.
Integrate data from multiple sources using tools such as Fivetran, Qlik Replicate, or IDMC (experience in at least one).
Provide technical leadership, mentoring, and guidance to junior engineers.
Optimize cost, performance, and scalability of cloud-based data environments.
Contribute to architectural decisions, code reviews, and best practices.
Implement CI/CD workflows using Bitbucket or GitHub (experience in at least one).
Work with data modeling techniques such as dimensional modeling (facts, dimensions, sub-dimensions) and Data Vault (hubs, links, satellites).
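To give a concrete flavor of the Snowflake responsibilities above, the sketch below strings together a file format, an external stage, a COPY load, and stream-plus-task change processing. It is a minimal illustration only, not production code; all object names (raw_db, analytics_db, my_s3_int, transform_wh, orders, and the S3 bucket) are hypothetical.

    -- File format and external stage for CSV landing files
    CREATE OR REPLACE FILE FORMAT raw_db.public.csv_fmt
      TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

    CREATE OR REPLACE STAGE raw_db.public.orders_stage
      URL = 's3://example-bucket/orders/'   -- hypothetical bucket
      STORAGE_INTEGRATION = my_s3_int
      FILE_FORMAT = raw_db.public.csv_fmt;

    -- Bulk load staged files into a raw table
    COPY INTO raw_db.public.orders FROM @raw_db.public.orders_stage;

    -- Capture inserts/updates with a stream; process them on a schedule with a task
    CREATE OR REPLACE STREAM raw_db.public.orders_stream ON TABLE raw_db.public.orders;

    CREATE OR REPLACE TASK raw_db.public.merge_orders
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_DB.PUBLIC.ORDERS_STREAM')
    AS
      INSERT INTO analytics_db.public.orders_latest
        SELECT * FROM raw_db.public.orders_stream;

    ALTER TASK raw_db.public.merge_orders RESUME;

For continuous rather than batch loading, the same stage can feed a Snowpipe (CREATE PIPE ... AS COPY INTO ...), while Time Travel and Fail-safe cover recovery of the loaded data.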
Required Skills & Experience
8–12 years of overall experience in Data Engineering, including at least 3–4 years in a lead role.
Strong hands-on expertise in Snowflake (data modeling, performance tuning, query optimization, security, and cost management).
Proficiency in dbt (core concepts, macros, testing, documentation, and deployment); see the model sketch after this list.
Solid programming skills in Python (for data processing, automation, and integration).
Experience with workflow orchestration tools such as Apache Airflow.
Exposure to ELT/ETL tools.
Strong understanding of modern data warehouse architectures, data governance, and cloud-native environments.
Excellent problem-solving, communication, and leadership skills.
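As a sketch of the dbt proficiency expected (models, ref() dependencies, Jinja templating, incremental materialization), here is a minimal model. The model, source names (stg_orders, stg_payments), and columns are hypothetical, assumed purely for illustration.

    -- models/marts/fct_orders.sql
    {{ config(materialized='incremental', unique_key='order_id') }}

    with orders as (
        -- ref() declares the dependency, so dbt builds stg_orders first
        select * from {{ ref('stg_orders') }}
    ),

    payments as (
        select * from {{ ref('stg_payments') }}
    )

    select
        orders.order_id,
        orders.customer_id,
        orders.ordered_at,
        sum(payments.amount) as order_total
    from orders
    left join payments on payments.order_id = orders.order_id
    {% if is_incremental() %}
      -- on incremental runs, only pick up orders newer than those already loaded
      where orders.ordered_at > (select max(ordered_at) from {{ this }})
    {% endif %}
    group by 1, 2, 3

Running dbt build compiles the Jinja, creates or merges the table in Snowflake, and executes any tests declared for the model; dbt docs generate produces the documentation site.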
Good to Have
Hands-on experience with Databricks (PySpark, Delta Lake, MLflow).
Exposure to other cloud platforms — AWS, Azure, or GCP.
Experience in building CI/CD pipelines for data workflows.
Why Join Lumen Data?
Opportunity to work with cutting-edge cloud and data technologies.
Collaborative, innovative, and growth-oriented work culture.
Hybrid work model with flexibility and continuous learning opportunities.
Be part of a team shaping enterprise data strategies for global clients.