Data Engineer
Our Enterprise Data & Analytics (EDA) team is looking for a Data Engineer to join our growing data engineering organization.
We are a globally distributed, remote-first team.
You’ll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle.
You will craft and develop curated data products, applying standard architectural and data modeling practices.
You will primarily develop data warehouse solutions using technologies such as dbt, Airflow, and Terraform.
Collaborate with team members and business partners to gather business requirements, define successful analytics outcomes, and design data models
Use engineering best practices such as version control, CI/CD, code review, and pair programming
Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting
Design and build ELT-based data models using SQL and dbt
Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
Identify, design, and implement internal process improvements, such as automating manual processes and optimizing data delivery
Work with data and analytics experts to strive for greater functionality in our data systems
Basic Qualifications
3+ years of data/analytics engineering experience building and maintaining data pipelines and ETL processes in big data environments
Basic knowledge of both modern and classic data modeling approaches (Kimball, Inmon, etc.)
Experience with at least one programming language such as Python, Go, Java, or Scala (we primarily use Python)
SQL knowledge and experience working with cloud columnar databases (we use Snowflake), as well as working familiarity with a variety of databases
Familiarity with processes supporting data transformation, data structures, metadata, dependency and workload management
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Strong communication skills and adaptability to changing requirements and tech stacks
Preferred Qualifications
Demonstrated experience in one or more business domains
1+ completed projects with dbt
Proficiency in SQL and/or Python
Experience using Airflow as a data orchestration tool
Our tech stack:
ELT (Snowflake, dbt, Airflow, Kafka, Hightouch)
BI (Tableau, Looker)
Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions)
Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra.
Please refer to the location posted on the requisition for where this role is based.
Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week.
This role must attend our local office for part of the week.
The specific in-office schedule is to be determined by the hiring manager.