• Expertini Resume Scoring: Our Semantic Matching Algorithm evaluates your CV/Résumé before you apply for this job role.
India Jobs Expertini

Python/Pyspark Developer - Hadoop/Spark Job Opening in India – Now Hiring: DG Liger Consulting


Job description

Location: Gurgaon (Work from office)

We are looking for a Python PySpark Developer with 3-4 years of experience, primarily focused on Python programming and automation using GitHub Actions. The ideal candidate will be responsible for developing scalable Python-based data workflows, writing PySpark scripts for large-scale data processing, and implementing continuous integration and deployment pipelines through GitHub Actions.

Responsibilities:

- Develop, maintain, and optimize Python-based applications and PySpark jobs for data processing.

- Automate build, test, and deployment processes using GitHub Actions.

- Write reusable, efficient, and testable Python code following best coding practices.

- Collaborate with data engineers and DevOps teams to integrate and automate workflows.

- Implement unit testing, code reviews, and CI/CD pipelines for continuous delivery.

- Troubleshoot and resolve issues in existing automation and data workflows.

- Ensure version control, code quality, and environment consistency across projects.

Requirements:

- Bachelor's degree in Computer Science, Information Technology, or a related field.

- 3-4 years of experience in Python development (with a focus on backend or data processing).

- Hands-on experience in PySpark for distributed data processing and transformation.

- Proficiency in setting up and managing GitHub Actions workflows (build, test, deploy pipelines).

- Strong understanding of CI/CD principles and version control systems (Git).

- Good knowledge of object-oriented programming, modular code design, and debugging techniques.

- Experience working with virtual environments and package management (pip, poetry, conda).

- Familiarity with logging, monitoring, and exception handling in Python applications.

- Spark, Hadoop, Snowflake DB, PySpark, SQL/PLSQL

(ref:hirist.tech)
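The posting emphasizes reusable, testable Python code that feeds both PySpark jobs and GitHub Actions pipelines. As a minimal sketch of what that looks like in practice (the function and field names below are hypothetical, not taken from the posting): keeping transformation logic in plain Python functions means a CI workflow can unit-test it quickly without a Spark cluster, and the same function can later be mapped over a PySpark RDD or wrapped in a UDF.

```python
# Illustrative sketch only: a reusable, unit-testable record transformation
# of the kind the role describes. Names are hypothetical.

def normalize_record(record: dict) -> dict:
    """Trim and lowercase keys, and coerce the 'amount' field to float."""
    cleaned = {key.strip().lower(): value for key, value in record.items()}
    # Missing or empty 'amount' values default to 0.0.
    cleaned["amount"] = float(cleaned.get("amount") or 0)
    return cleaned


if __name__ == "__main__":
    raw = {" Amount ": "12.50", "City ": "Gurgaon"}
    print(normalize_record(raw))  # keys lowercased, amount coerced to 12.5
```

In a PySpark job this could be applied with `rdd.map(normalize_record)` or registered as a UDF, while a GitHub Actions step runs the same function's unit tests on every push.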

Required Skill Profession

Computer Occupations


  • Job Details

Related Jobs

Confidential hiring Spark/Python (PySpark Developer) Job in India
Confidential
India
Hashone Careers hiring Big Data Developer - Spark/Hadoop Job in Pune, Maharashtra, India
Hashone Careers
Pune, Maharashtra, India
Hashone Careers hiring Big Data Developer - Spark/Hadoop Job in Hyderabad, Telangana, India
Hashone Careers
Hyderabad, Telangana, India
Confidential hiring Big Data Developer - Spark/Hadoop Job in Bengaluru, Karnataka, India
Confidential
Bengaluru, Karnataka, India
Hashone Careers hiring Big Data Developer - Spark/Hadoop Job in Bengaluru, Karnataka, India
Hashone Careers
Bengaluru, Karnataka, India
Lancesoft India Pvt Ltd hiring Data Engineer - Spark/Hadoop Job in India
Lancesoft India Pvt Ltd
India
NTT hiring Python/Hadoop Developer Job in Hyderabad, Telangana, India
NTT
Hyderabad, Telangana, India
Tata Consultancy Services hiring Data Engineer - Hadoop/PySpark Job in India
Tata Consultancy Services
India
Info Origin Inc hiring Senior Big Data Developer - Spark/Hadoop Job in Bengaluru, Karnataka, India
Info Origin Inc
Bengaluru, Karnataka, India
Impacteers hiring Big Data Engineer - Hadoop/Spark Job in Hyderabad, Telangana, India
Impacteers
Hyderabad, Telangana, India
Impacteers hiring Big Data Engineer - Hadoop/Spark Job in Bengaluru, Karnataka, India
Impacteers
Bengaluru, Karnataka, India
Confidential hiring Big Data Engineer - Spark/Hadoop Job in Coimbatore, Tamil Nadu, India
Confidential
Coimbatore, Tamil Nadu, India
Anicalls (Pty) Ltd hiring Sr. Spark/ Hadoop/Scala Engineer... Job in Bengaluru, Karnataka, India
Anicalls (Pty) Ltd
Bengaluru, Karnataka, India
Anicalls (Pty) Ltd hiring Sr. Hadoop Spark / Scala Engineer Job in Noida, Uttar Pradesh, India
Anicalls (Pty) Ltd
Noida, Uttar Pradesh, India
Consulting Pandits hiring Big Data Architect - Spark/Hadoop Job in Pune, Maharashtra, India
Consulting Pandits
Pune, Maharashtra, India
Techmora hiring Senior Data Engineer - Spark/Hadoop Job in Bengaluru, Karnataka, India
Techmora
Bengaluru, Karnataka, India
Impacteers hiring Big Data Engineer - Hadoop/Spark Job in Chennai, Tamil Nadu, India
Impacteers
Chennai, Tamil Nadu, India
Novo Tree Minds Consulting hiring Big Data Engineer - Spark/Hadoop Job in Mumbai, Maharashtra, India
Novo Tree Minds Consulting
Mumbai, Maharashtra, India
Techmora hiring Senior Data Engineer - Spark/Hadoop Job in Gurugram, Haryana, India
Techmora
Gurugram, Haryana, India
IT Firm hiring Big Data Engineer - Hadoop/Spark Job in Bengaluru, Karnataka, India
IT Firm
Bengaluru, Karnataka, India
RSquareSoft Technologies hiring Python Developer (PySpark) Job in Pune, Maharashtra, India
RSquareSoft Technologies
Pune, Maharashtra, India
Confidential hiring Pyspark/ Python developer Job in India
Confidential
India
RSquareSoft Technologies hiring Python Developer (Pyspark) Job in India
RSquareSoft Technologies
India
RSquareSoft Technologies hiring Python Developer (PySpark) Job in New Delhi, Delhi, India
RSquareSoft Technologies
New Delhi, Delhi, India

Unlock Your Python Pyspark Potential: Insight & Career Growth Guide


Real-time Python Pyspark Jobs Trends (Graphical Representation)

Explore in-depth insights with Expertini's real-time analysis, showcased through the graph here. Uncover the dynamic job market trends for Python Pyspark in India, highlighting market share and opportunities for professionals in Python Pyspark roles.

65767 Jobs in India
3677 Jobs in India
Download Python Pyspark Jobs Trends in India

Are You Looking for a Python/Pyspark Developer - Hadoop/Spark Job?

Great news! DG Liger Consulting is currently hiring and seeking a Python/Pyspark Developer - Hadoop/Spark to join their team. Feel free to download the job details.

Wait no longer! Are you also interested in exploring similar jobs? Search now.

The Work Culture

An organization's rules and standards set how people are treated in the office and how different situations are handled. The work culture at DG Liger Consulting adheres to the cultural norms as outlined by Expertini.

The fundamental ethical values are:

1. Independence

2. Loyalty

3. Impartiality

4. Integrity

5. Accountability

6. Respect for human rights

7. Obeying Indian laws and regulations

What Is the Average Salary Range for Python/Pyspark Developer Hadoop/Spark Positions?

The average salary range for a Python/Pyspark Developer - Hadoop/Spark varies, but the pay scale is rated "Standard" in India. Salary levels may differ depending on your industry, experience, and skills. It's essential to research and negotiate effectively, so we advise reading the full job specification before applying to understand the salary package.

What Are the Key Qualifications for Python/Pyspark Developer Hadoop/Spark?

Key qualifications for Python/Pyspark Developer Hadoop/Spark typically include Computer Occupations expertise along with the qualifications and skills mentioned in the job specification. Be sure to check the specific job listing for detailed requirements and qualifications.

How Can I Improve My Chances of Getting Hired for Python/Pyspark Developer Hadoop/Spark?

To improve your chances of getting hired for Python/Pyspark Developer Hadoop/Spark, consider enhancing your skills. You can also check your CV/Résumé score with our free tool: its built-in Resume Scoring gives you a matching score for each job once your CV/Résumé is uploaded, helping you align your CV/Résumé with the job requirements and identify skills to strengthen.

Interview Tips for Python/Pyspark Developer Hadoop/Spark Job Success

DG Liger Consulting interview tips for Python/Pyspark Developer - Hadoop/Spark

Here are some tips to help you prepare for and ace your Python/Pyspark Developer Hadoop/Spark job interview:

Before the Interview:

Research: Learn about DG Liger Consulting's mission, values, and products, study the specific job requirements, and review the company's other openings for further context.

Practice: Prepare answers to common interview questions and rehearse using the STAR method (Situation, Task, Action, Result) to showcase your skills and experiences.

Dress Professionally: Choose attire appropriate for the company culture.

Prepare Questions: Show your interest by having thoughtful questions for the interviewer.

Plan Your Commute: Allow ample time to arrive on time and avoid feeling rushed.

During the Interview:

Be Punctual: Arrive on time to demonstrate professionalism and respect.

Make a Great First Impression: Greet the interviewer with a handshake, smile, and eye contact.

Confidence and Enthusiasm: Project a positive attitude and show your genuine interest in the opportunity.

Answer Thoughtfully: Listen carefully, take a moment to formulate clear and concise responses. Highlight relevant skills and experiences using the STAR method.

Ask Prepared Questions: Demonstrate curiosity and engagement with the role and company.

Follow Up: Send a thank-you email to the interviewer within 24 hours.

Additional Tips:

Be Yourself: Let your personality shine through while maintaining professionalism.

Be Honest: Don't exaggerate your skills or experience.

Be Positive: Focus on your strengths and accomplishments.

Body Language: Maintain good posture, avoid fidgeting, and make eye contact.

Turn Off Phone: Avoid distractions during the interview.

Final Thought:

To prepare for your Python/Pyspark Developer Hadoop/Spark interview at DG Liger Consulting, research the company, understand the job requirements, and practice common interview questions.

Highlight your technical skills, achievements, and strategic thinking abilities. Be prepared to discuss your experience with Python, PySpark, and CI/CD, including your approach to meeting targets as a team player. Additionally, review DG Liger Consulting's products or services and be prepared to discuss how you can contribute to their success.

By following these tips, you can increase your chances of making a positive impression and landing the job!

How to Set Up Job Alerts for Python/Pyspark Developer Hadoop/Spark Positions

Setting up job alerts for Python/Pyspark Developer Hadoop/Spark is easy with India Jobs Expertini. Simply visit our job alerts page here, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!