We are seeking a Data Engineer with strong hands-on experience in Snowflake, Python, Airflow, and SQL.
The ideal candidate will be proficient in data solution design and development, adept at requirements gathering, and possess strong analytical and problem-solving skills.
This role involves building and maintaining robust data pipelines, ensuring efficient data flow and accessibility for business intelligence and analytics.
Key Responsibilities
- Data Pipeline Development: Design, develop, and maintain scalable and efficient data pipelines using Snowflake, Python, and Airflow.
- SQL Development: Write complex and optimized SQL queries for data extraction, transformation, and loading (ETL) within Snowflake.
- Snowflake Expertise: Leverage in-depth knowledge of Snowflake features for data warehousing, performance tuning, and schema design.
- Python Scripting: Develop robust Python scripts for data manipulation, automation, and integration tasks.
- Airflow Orchestration: Implement and manage workflows using Apache Airflow for scheduling and orchestrating data pipelines.
- Requirements Gathering: Proactively gather requirements from stakeholders, translating business needs into technical data solutions.
- Design & Development: Contribute to the design and development of data architectures and data models.
- Troubleshooting & Optimization: Utilize strong analytical and problem-solving skills to troubleshoot data-related issues and optimize pipeline performance.
Required Skills and Experience
- Strong hands-on experience in Snowflake.
- Proficiency in Python for data engineering tasks.
- Experience with Apache Airflow for workflow orchestration.
- Expertise in SQL for data querying and manipulation.
- Proficiency in data solution design and development.
- Demonstrated ability in requirements gathering.
- Strong analytical and problem-solving skills.