India Jobs Expertini

Lead Consultant-Data Engineer, AWS+Python, Spark, Kafka for ETL Job Opening In Bengaluru – Now Hiring Genpact

Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL


Job description

Ready to shape the future of work?

At Genpact, we don’t just adapt to change—we drive it.

AI and digital innovation are redefining industries, and we’re leading the charge.

Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale.

From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges.

If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant-Data Engineer, AWS+Python, Spark, Kafka for ETL!

Responsibilities

  • Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.

  • Integrate structured and unstructured data from various data sources into data lakes and data warehouses.

  • Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).

  • Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.

  • Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.

  • Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.

  • Develop application programs using big data technologies such as Apache Hadoop and Apache Spark on cloud platforms such as AWS.

  • Build data pipelines through ETL (Extract-Transform-Load) processes.

  • Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.

  • Analyse business and functional requirements, reviewing existing system configurations and operating methodologies while keeping pace with evolving business needs.

  • Analyse requirements/user stories in business meetings, assess their impact across platforms/applications, and convert business requirements into technical requirements.

  • Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.

  • Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security.

  • Perform unit testing on modified software to ensure that new functionality works as expected and existing functionality continues to work as before.

  • Coordinate with release management and other supporting teams to deploy changes to the production environment.
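
The extract-transform-load flow these responsibilities describe can be sketched in plain Python. This is a minimal illustration only: all function names and sample records are hypothetical, and in a production AWS pipeline the extract step would typically read from Kafka or S3, the transform would run on Spark, and the load would target Redshift or DynamoDB.

```python
# Minimal ETL sketch: extract raw records, transform (validate, clean,
# cast types), and load into an in-memory "warehouse" keyed by ID.

def extract(source):
    """Pull raw records from the source system (here, a plain list)."""
    return list(source)

def transform(records):
    """Drop rows that fail basic validation, trim keys, cast amounts."""
    cleaned = []
    for row in records:
        if not row.get("order_id") or row.get("amount") is None:
            continue  # reject rows missing a key or an amount
        cleaned.append({
            "order_id": str(row["order_id"]).strip(),
            "amount": float(row["amount"]),
        })
    return cleaned

def load(warehouse, records):
    """Upsert transformed records into the target keyed by order_id."""
    for row in records:
        warehouse[row["order_id"]] = row
    return warehouse

raw = [
    {"order_id": " A100 ", "amount": "19.99"},
    {"order_id": None, "amount": "5.00"},   # rejected: missing key
    {"order_id": "A101", "amount": 42},
]
warehouse = load({}, transform(extract(raw)))
```

The same three-stage shape carries over when the in-memory structures are replaced by managed services; the validation and upsert logic is what the role's ETL work revolves around.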

Qualifications we seek in you!

    Minimum Qualifications

  • Experience designing and implementing data pipelines, building data applications, and performing data migrations on AWS.

  • Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift.

  • Experience with Databricks is an added advantage.

  • Strong experience in Python and SQL

  • Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.

  • Advanced programming skills in Python for data processing and automation.

  • Hands-on experience with Apache Spark for large-scale data processing.

  •  Experience with Apache Kafka for real-time data streaming and event processing.

  • Proficiency in SQL for data querying and transformation.

  • Strong understanding of security principles and best practices for cloud-based environments.

  • Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.

  • Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.

  • Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills

  • Master’s degree in Computer Science, Electronics, or Electrical Engineering.

  • AWS Data Engineering & Cloud certifications, Databricks certifications

  • Experience with multiple data integration technologies and cloud platforms

  • Knowledge of Change & Incident Management process

Why join Genpact?

  • Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation 

  • Make an impact – Drive change for global enterprises and solve business challenges that matter 

  • Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities 

  • Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day 

  • Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress 

  • Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. 

    Let’s build tomorrow together.

    Required Skill Profession

    Other General



    Unlock Your Lead Consultant Potential: Insight & Career Growth Guide


    Real-time Lead Consultant Jobs Trends

    Expertini's real-time analysis highlights the dynamic job market for Lead Consultant roles in Bengaluru, India: 351,690 jobs tracked in India and 26,742 in Bengaluru at the time of posting.

    Are You Looking for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL Job?

    Great news! Genpact is currently hiring and seeking a Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL to join their team. Feel free to download the job details.

    Wait no longer! Are you also interested in exploring similar jobs? Search now on India Jobs Expertini.

    The Work Culture

    An organization's rules and standards set how people should be treated in the office and how different situations should be handled. The work culture at Genpact adheres to the cultural norms as outlined by Expertini.

    The fundamental ethical values are:

    1. Independence

    2. Loyalty

    3. Impartiality

    4. Integrity

    5. Accountability

    6. Respect for human rights

    7. Obeying Indian laws and regulations

    What Is the Average Salary Range for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL Positions?

    The average salary range for a Lead Consultant Data Engineer varies, but the pay scale is rated "Standard" in Bengaluru. Salary levels may vary depending on your industry, experience, and skills. It's essential to research and negotiate effectively. We advise reading the full job specification before proceeding with the application to understand the salary package.

    What Are the Key Qualifications for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL?

    Key qualifications for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL typically include the qualifications and expertise listed in the job specification. Be sure to check the specific job listing for detailed requirements and qualifications.

    How Can I Improve My Chances of Getting Hired for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL?

    To improve your chances of getting hired for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL, consider enhancing your skills. Check your CV/Résumé Score with our free Tool. We have an in-built Resume Scoring tool that gives you the matching score for each job based on your CV/Résumé once it is uploaded. This can help you align your CV/Résumé according to the job requirements and enhance your skills if needed.

    Interview Tips for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL Job Success

    Genpact interview tips for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL

    Here are some tips to help you prepare for and ace your Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL job interview:

    Before the Interview:

    Research: Learn about Genpact's mission, values, products, and the specific job requirements, and get further information about their other openings.

    Practice: Prepare answers to common interview questions and rehearse using the STAR method (Situation, Task, Action, Result) to showcase your skills and experiences.

    Dress Professionally: Choose attire appropriate for the company culture.

    Prepare Questions: Show your interest by having thoughtful questions for the interviewer.

    Plan Your Commute: Allow ample time to arrive on time and avoid feeling rushed.

    During the Interview:

    Be Punctual: Arrive on time to demonstrate professionalism and respect.

    Make a Great First Impression: Greet the interviewer with a handshake, smile, and eye contact.

    Confidence and Enthusiasm: Project a positive attitude and show your genuine interest in the opportunity.

    Answer Thoughtfully: Listen carefully, take a moment to formulate clear and concise responses. Highlight relevant skills and experiences using the STAR method.

    Ask Prepared Questions: Demonstrate curiosity and engagement with the role and company.

    Follow Up: Send a thank-you email to the interviewer within 24 hours.

    Additional Tips:

    Be Yourself: Let your personality shine through while maintaining professionalism.

    Be Honest: Don't exaggerate your skills or experience.

    Be Positive: Focus on your strengths and accomplishments.

    Body Language: Maintain good posture, avoid fidgeting, and make eye contact.

    Turn Off Phone: Avoid distractions during the interview.

    Final Thought:

    To prepare for your Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL interview at Genpact, research the company, understand the job requirements, and practice common interview questions.

    Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your relevant experience, including your approach to meeting targets as a team player. Additionally, review Genpact's products or services and be prepared to discuss how you can contribute to their success.

    By following these tips, you can increase your chances of making a positive impression and landing the job!

    How to Set Up Job Alerts for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL Positions

    Setting up job alerts for Lead Consultant Data Engineer, AWS+Python, Spark, Kafka for ETL is easy with India Jobs Expertini. Simply visit our job alerts page here, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!