Job Description
<p><b>Job Summary :</b></p>
<p>We are seeking a highly skilled Full Stack Data Engineer to design, develop, and manage scalable data pipelines, storage, and transformation solutions.</p>
<p>The role requires deep expertise in cloud-based data platforms, data warehousing, data lakehouse design, workflow automation, and data integration to support business intelligence and advanced analytics.</p>
<p>The ideal candidate will have a strong background in data engineering, cloud technologies, and full-stack software development, with a focus on performance optimization, security (especially data segregation), and automation.</p>
<p><b>Key Responsibilities :</b></p>
<p>- Design, build, and maintain scalable and secure data pipelines for analytics and reporting.</p>
<p>- Develop and optimize data warehouse and data lakehouse architectures.</p>
<p>- Implement ETL/ELT processes, data modeling, and API integrations.</p>
<p>- Automate workflows and orchestration using Airflow, Dagster, or Prefect.</p>
<p>- Ensure data quality, validation, governance, and compliance (GDPR, CCPA).</p>
<p>- Collaborate with cross-functional teams to support data-driven decision-making.</p>
<p>- Manage infrastructure with cloud services (AWS/Azure/GCP) and IaC tools such as Terraform and CloudFormation.</p>
<p>- Contribute to CI/CD pipelines, DevOps practices, and containerization (Docker).</p>
<p><b>Skills Required :</b></p>
<p><b>- Data Engineering & Warehousing :</b> Snowflake (must have), DBT (must have), SnapLogic, ETL/ELT, APIs, data lakehouse design.</p>
<p><b>- Programming & Scripting :</b> Advanced SQL, Python, DBT, Bash/Shell.</p>
<p><b>- Cloud & Infrastructure :</b> AWS, Azure, or GCP; Terraform; CloudFormation; security (IAM, VPN).</p>
<p><b>- Data Processing & Orchestration :</b> Kafka, Kinesis, Apache Airflow, Dagster, Prefect.</p>
<p><b>- DevOps & CI/CD :</b> Git, GitHub Actions, Jenkins, Docker.</p>
<p><b>- Data Governance & Quality :</b> Data validation, metadata management, compliance with regulations such as GDPR and CCPA.</p>
<p><b>Profile :</b></p>
<p>- 5+ years of experience in data engineering and cloud platforms.</p>
<p>- Proven track record in end-to-end data pipeline development and automation.</p>
<p>- Strong problem-solving, analytical, and communication skills.</p>
<p>- Ability to work independently in a remote-first environment.</p>
<p>(ref:hirist.tech)</p>