We are looking for a skilled Snowflake Developer with 6+ years of experience and hands-on expertise in Python, SQL, and Snowpark to join our data engineering team.
You will be responsible for designing and building scalable data pipelines, developing Snowpark-based data applications, and enabling advanced analytics solutions on the Snowflake Data Cloud platform.
Key Responsibilities
- Develop and maintain robust, scalable, and high-performance data pipelines using Snowflake SQL, Python, and Snowpark.
- Use the Snowpark Python API to build data engineering and data science workflows within the Snowflake environment (see the illustrative sketch after this list).
- Perform advanced data transformation, modeling, and optimization to support business reporting and analytics.
- Tune queries and warehouse usage for cost and performance optimization.
- Leverage Azure data services for data ingestion, orchestration, and observability.
- Implement best practices for data governance, security, and data quality within Snowflake.
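
To give a concrete sense of the Snowpark work described above, here is a minimal, illustrative sketch of a Snowpark (Python) pipeline step. The connection parameters, table names (RAW_ORDERS, ANALYTICS.DAILY_REVENUE), and columns are hypothetical placeholders, not references to our actual environment.

```python
# Illustrative only: a minimal Snowpark pipeline sketch.
# Table names, columns, and connection parameters are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# In practice these values would come from a secrets manager or config store.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Read a source table, apply a transformation, and persist the result.
orders = session.table("RAW_ORDERS")  # hypothetical source table
daily_revenue = (
    orders
    .filter(col("ORDER_STATUS") == "COMPLETED")
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("ORDER_AMOUNT")).alias("DAILY_REVENUE"))
)

# Materialize the aggregate for downstream reporting.
daily_revenue.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_REVENUE")

session.close()
```

In practice, a step like this would typically run inside an orchestrated job with credentials supplied securely rather than hard-coded.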
Required Skills
- 3+ years of hands-on experience with Snowflake development and administration.
- Strong command of SQL for complex queries, data modeling, and transformations (an illustrative example follows this list).
- Proficiency in Python, particularly for data engineering and Snowpark development.
- Working experience with Snowpark for building data pipelines or analytics applications.
- Understanding of data warehouse architecture, ELT/ETL processes, and cloud data platforms.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
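
As an illustration of the SQL-centric transformation work this role involves, the sketch below runs a set-based deduplication with a window function through Snowpark's session.sql(). All object names (RAW.CUSTOMERS, ANALYTICS.DIM_CUSTOMER) are hypothetical.

```python
# Illustrative only: a set-based ELT transformation with a window function,
# executed via Snowpark's session.sql(). Object names are hypothetical.
from snowflake.snowpark import Session

def build_latest_customer_snapshot(session: Session) -> None:
    """Deduplicate a raw customer feed, keeping the most recent record per key."""
    session.sql(
        """
        CREATE OR REPLACE TABLE ANALYTICS.DIM_CUSTOMER AS
        SELECT customer_id, customer_name, email, updated_at
        FROM RAW.CUSTOMERS
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY updated_at DESC
        ) = 1
        """
    ).collect()
```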
Preferred Qualifications
- Experience with cloud platforms like AWS, Azure, or GCP.
- Knowledge of orchestration tools such as Apache Airflow.
- Familiarity with data security and role-based access control (RBAC) in Snowflake (see the sketch after this list).
- Snowflake certifications are a plus.
- Working knowledge of dbt for SQL-based transformations.
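
For candidates less familiar with Snowflake RBAC, the sketch below shows the general shape of granting read-only access to a reporting role from Python. Every role, warehouse, database, and schema name is a hypothetical placeholder.

```python
# Illustrative only: applying role-based access control (RBAC) grants from Python.
# Role, warehouse, database, and schema names are hypothetical placeholders.
from snowflake.snowpark import Session

def grant_reporting_access(session: Session) -> None:
    """Grant read-only access on an analytics schema to a reporting role."""
    statements = [
        "CREATE ROLE IF NOT EXISTS REPORTING_READER",
        "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE REPORTING_READER",
        "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE REPORTING_READER",
        "GRANT USAGE ON SCHEMA ANALYTICS_DB.REPORTING TO ROLE REPORTING_READER",
        "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.REPORTING TO ROLE REPORTING_READER",
    ]
    for stmt in statements:
        session.sql(stmt).collect()
```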
Soft Skills
- Strong analytical and problem-solving capabilities.
- Ability to work independently and in a collaborative team environment.
- Excellent communication and documentation skills.