Job Description
Who You Are:

As a seasoned Data Engineer, you bring extensive expertise in optimizing data workflows across database platforms such as Oracle, BigQuery, and SQL Server. You have a deep understanding of ELT/ETL processes and data integration, and a strong command of Python for data manipulation and automation. You bring advanced, hands-on experience with Google BigQuery, dbt, Python, and Airflow. You will be responsible for designing and maintaining scalable ETL pipelines, optimizing complex data systems, and ensuring smooth data flow across platforms. As a Senior Data Engineer, you will also work collaboratively within a team and help build the data infrastructure that drives business insights.

What You'll Do:

- Design, develop, and optimize complex ETL pipelines that integrate large data sets from various sources.
- Build and maintain high-performance data models using Google BigQuery and dbt for data transformation.
- Develop Python scripts for data ingestion, transformation, and automation.
- Implement and manage data workflows using Apache Airflow for scheduling and orchestration.
- Collaborate with data scientists, analysts, and other stakeholders to ensure data availability, reliability, and performance.
- Troubleshoot and optimize data systems, identifying and resolving issues proactively.
- Work on cloud-based platforms, particularly AWS, to leverage scalability and storage options for data pipelines.
- Ensure data integrity, consistency, and security across systems.
- Take ownership of end-to-end data engineering tasks while mentoring junior team members.
- Continuously improve processes and technologies for more efficient data processing and delivery.
- Act as a key contributor to developing and supporting complex data architectures.

What You'll Need:

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Data Engineering or a related field, with a strong background in building and optimizing data pipelines.
- Strong proficiency in Google BigQuery, including query design and optimization, or an equivalent data warehouse (e.g., Snowflake).
- Knowledge of dbt for data transformation and model management.
- Proficiency in Python for data engineering tasks, including scripting, data manipulation, and automation.
- Solid experience with Apache Airflow for workflow orchestration and task automation.
- Extensive experience in building and maintaining ETL pipelines.
- Familiarity with cloud platforms, particularly AWS (Amazon Web Services), including services such as S3, Lambda, Redshift, and Glue.
- Excellent problem-solving and troubleshooting abilities.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Self-motivated, detail-oriented, and able to work with minimal supervision.
- Ability to manage multiple priorities and deadlines in a fast-paced environment.
- Experience with other cloud platforms (e.g., Azure) is a plus.
- Knowledge of data warehousing best practices and architecture.

What's In It For You?
At Zinnia, you will collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done.

Apply by completing the online application on the careers section of our website.

We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.

(ref:hirist.tech)