Job Description
            
<p><b>Description:</b><br/><br/>About the Role:<br/><br/>We are seeking an experienced Snowflake Developer with strong expertise in SQL, Python, and data engineering best practices.<br/><br/>The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines, transformations, and integrations on the Snowflake Data Cloud platform.<br/><br/>You will collaborate closely with data engineers, data analysts, BI developers, and business stakeholders to ensure efficient data flow, high performance, and reliability across the organization's data ecosystem.<br/><br/><b>Key Responsibilities:</b><br/><br/>- Design, develop, and maintain data pipelines, ETL/ELT processes, and data models in Snowflake.<br/><br/>- Build and optimize complex SQL queries, stored procedures, and views for analytical and operational use cases.<br/><br/>- Implement and manage data ingestion from diverse sources (structured, semi-structured, and unstructured data).<br/><br/>- Use Python to automate data workflows, transformations, and orchestration tasks.<br/><br/>- Leverage Snowflake features such as Snowpipe, Streams, Tasks, and Time Travel for efficient data management.<br/><br/>- Collaborate with data architects and engineers to design scalable data warehouse solutions and implement best practices for performance optimization.<br/><br/>- Develop and maintain data quality checks, ensuring consistency, integrity, and reliability across datasets.<br/><br/>- Integrate Snowflake with cloud platforms (AWS, Azure, or GCP) and data orchestration tools (Airflow, dbt, or similar).<br/><br/>- Monitor and troubleshoot Snowflake performance issues, ensuring query optimization and cost management.<br/><br/>- Ensure data security, access control, and compliance with company and industry standards.<br/><br/>- Partner with BI and analytics teams to provide high-quality, curated data for dashboards and reports.<br/><br/><b>Required 
Qualifications:</b><br/><br/>- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.<br/><br/>- 5 to 8 years of professional experience in data engineering, ETL development, or database programming.<br/><br/>- At least 3 years of hands-on experience with Snowflake.<br/><br/>- Expertise in SQL (advanced query writing, optimization, stored procedures).<br/><br/>- Proficiency in Python for data manipulation, automation, and integration.<br/><br/>- Experience with ETL/ELT tools (e.g., dbt, Talend, Informatica, Matillion, or AWS Glue).<br/><br/>- Familiarity with cloud platforms (AWS, Azure, or GCP) and related services (S3, Lambda, Data Factory, BigQuery).<br/><br/>- Strong understanding of data warehousing concepts, data modeling (star/snowflake schema), and metadata management.<br/><br/>- Hands-on experience with performance tuning, query optimization, and resource monitoring in Snowflake.<br/><br/>- Knowledge of version control systems (Git) and CI/CD pipelines for data deployments.</p> (ref:hirist.tech)