
Python/PySpark Developer – Hadoop/Spark

DG Liger Consulting – India



Job description

Location: Gurgaon (work from office)

We are looking for a Python/PySpark Developer with 3-4 years of experience, primarily focused on Python programming and automation using GitHub Actions. The ideal candidate will be responsible for developing scalable Python-based data workflows, writing PySpark scripts for large-scale data processing, and implementing continuous integration and deployment pipelines through GitHub Actions.

Responsibilities:

- Develop, maintain, and optimize Python-based applications and PySpark jobs for data processing.
- Automate build, test, and deployment processes using GitHub Actions.
- Write reusable, efficient, and testable Python code following best coding practices.
- Collaborate with data engineers and DevOps teams to integrate and automate workflows.
- Implement unit testing, code reviews, and CI/CD pipelines for continuous delivery.
- Troubleshoot and resolve issues in existing automation and data workflows.
- Ensure version control, code quality, and environment consistency across projects.

Requirements:

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-4 years of experience in Python development (with a focus on backend or data processing).
- Hands-on experience with PySpark for distributed data processing and transformation.
- Proficiency in setting up and managing GitHub Actions workflows (build, test, and deploy pipelines).
- Strong understanding of CI/CD principles and version control systems (Git).
- Good knowledge of object-oriented programming, modular code design, and debugging techniques.
- Experience working with virtual environments and package management (pip, poetry, conda).
- Familiarity with logging, monitoring, and exception handling in Python applications.
- Skills: Spark, Hadoop, Snowflake DB, PySpark, SQL/PL-SQL

(ref:hirist.tech)
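The GitHub Actions responsibilities above (automating build, test, and deployment) might translate into a workflow along these lines. This is a hedged sketch: the file path, Python version, and commands are assumptions, not details from the posting.

```yaml
# .github/workflows/ci.yml — illustrative build/test pipeline
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run unit tests
        run: pytest
```

A deployment job would typically be added as a separate `jobs:` entry gated on the default branch.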
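To illustrate the kind of reusable, testable Python code with logging and exception handling that the requirements describe, here is a minimal sketch. The function name and record shape are illustrative assumptions, not taken from the posting; a pure function like this can be unit-tested directly and reused inside a PySpark job (for example, applied per-partition with `mapPartitions`).

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def normalize_records(records):
    """Strip whitespace and lower-case the 'name' field of each record.

    Pure and side-effect free, so it is easy to unit-test and to reuse
    inside a distributed PySpark job.
    """
    cleaned = []
    for rec in records:
        try:
            cleaned.append({**rec, "name": rec["name"].strip().lower()})
        except (KeyError, AttributeError) as exc:
            # Log and skip malformed records instead of failing the whole job.
            logger.warning("Skipping malformed record %r: %s", rec, exc)
    return cleaned


if __name__ == "__main__":
    data = [{"name": "  Alice "}, {"name": "BOB"}, {"id": 3}]
    print(normalize_records(data))
```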


Required Skill Profession

Computer Occupations


