
Snowflake Data Engineer - PySpark/ETL | Job Opening in Pune | Jobtravia Pvt. Ltd.




Job description

Job Title: Snowflake Data Engineer
Location: Pune, Hyderabad | Hybrid
Experience: 5+ Years

About the Role:

We are looking for a Snowflake Data Engineer with strong expertise in data architecture, pipeline development, and cloud-native solutions. You'll play a key role in designing, optimizing, and maintaining scalable data platforms that transform raw data into actionable insights. If you enjoy solving complex data challenges and want to work with cutting-edge cloud technologies, this role is for you.

Key Responsibilities:

Data Engineering & Pipeline Development:
- Design, build, and maintain scalable, efficient, and secure data pipelines.
- Implement ETL/ELT processes leveraging Snowflake and Python/PySpark.

Data Modeling & Architecture:
- Develop and optimize data models, schemas, and warehouses to support analytics and BI.
- Ensure data integrity, performance tuning, and compliance with governance standards.

Snowflake Expertise:
- Leverage advanced Snowflake features: Snowpipe, Streams, Tasks, Time Travel, Data Sharing.
- Optimize query performance and manage warehouses, roles, and security policies.

Cloud & Integration:
- Work across AWS, Azure, or GCP environments for data storage, processing, and orchestration.
- Integrate Snowflake with third-party tools (Airflow, dbt, Kafka, BI tools).

Collaboration & Delivery:
- Partner with data scientists, BI teams, and business stakeholders to deliver data-driven insights.
- Contribute to client presentations, solution proposals, and business development activities.

What You Bring:
- 5+ years of experience in Snowflake-based data engineering with strong Python/PySpark expertise.
- Solid experience with ETL/ELT pipeline design, data modeling, and cloud-native data solutions.
- Hands-on knowledge of AWS, Azure, or GCP cloud services.
- Strong understanding of SQL, performance tuning, and data security best practices.
- Excellent communication, leadership, and stakeholder management skills.
- Analytical mindset with adaptability to fast-paced and evolving environments.

Preferred Qualifications:
- Experience with orchestration tools (Airflow, Prefect, dbt).
- Knowledge of CI/CD practices and DevOps for data.
- Exposure to real-time streaming data (Kafka, Kinesis, Pub/Sub).
- Certifications in Snowflake, AWS, Azure, or GCP.

(ref:hirist.tech)
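To illustrate the Snowflake features named above (Streams and Tasks for incremental ELT), here is a minimal sketch of one common pattern: a stream captures changes on a landing table, and a scheduled task merges them into a curated table. All object names (raw_orders, orders_clean, transform_wh) are hypothetical, not part of this role's actual environment.

```sql
-- Hypothetical incremental ELT pattern using a Stream plus a scheduled Task.

-- Capture row-level changes (inserts/updates/deletes) on the landing table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- A Task that runs every 5 minutes, but only when the stream has new data,
-- merging changes into the curated table.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  MERGE INTO orders_clean AS tgt
  USING raw_orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN
    UPDATE SET tgt.amount = src.amount, tgt.updated_at = src.updated_at
  WHEN NOT MATCHED THEN
    INSERT (order_id, amount, updated_at)
    VALUES (src.order_id, src.amount, src.updated_at);

-- Tasks are created in a suspended state; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```

Consuming the stream inside the MERGE advances its offset, so each change is processed once; the WHEN clause avoids spinning up the warehouse when there is nothing to do.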


Required Skill Profession

Computer Occupations





