GCP and Python Data Engineer at People Prime Worldwide, Bengaluru, India
Important Note (Please Read Before Applying)
Do NOT apply if:
- You have less than 10 years of experience
- You do not have hands-on GCP 5+ years of experience
- You are on a notice period longer than 15 days
- You are looking for remote only
- You are a fresher or come from an unrelated background (e.g., support-only, testing-only, or non-Java roles)
✅ Apply ONLY if you meet ALL criteria above.
Random / irrelevant applications will not be processed.
Job Title: GCP and Python Data Engineer
Location: Bengaluru
Experience: 10+ years
Employment Type: Permanent
Notice Period: Immediate to 15-day joiners only.
About the Company
Our client is a trusted global innovator of IT and business services, present in 10+ countries.
They specialize in digital & IT modernization, consulting, managed services, and industry-specific solutions.
With a commitment to long-term success, they empower clients and society to move confidently into the digital future.
Job description
Job Overview:
We are looking for a skilled and motivated Lead Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team.
The ideal candidate will be responsible for requirements gathering, solution design and architecture, and developing and maintaining robust, scalable ETL (Extract, Transform, Load) and ELT data pipelines.
The role involves working directly with customers: gathering requirements, running the discovery phase, designing and architecting the solution using various GCP services, implementing data transformations, data ingestion, and data quality and consistency across systems, and providing post-delivery support.
Experience Level:
10 to 12 years of relevant IT experience
Key Responsibilities:
● Design, develop, test, and maintain scalable ETL data pipelines using Python.
● Architect enterprise solutions using technologies such as Kafka, multi-cloud services, auto-scaling with GKE, load balancers, Apigee API proxy management, dbt, LLMs where appropriate in the solution, redaction of sensitive information, and DLP (Data Loss Prevention).
● Work extensively on Google Cloud Platform (GCP) services such as:
○ Dataflow for real-time and batch data processing
○ Cloud Functions for lightweight serverless compute
○ BigQuery for data warehousing and analytics
○ Cloud Composer for orchestration of data workflows (built on Apache Airflow)
○ Google Cloud Storage (GCS) for managing data at scale
○ IAM for access control and security
○ Cloud Run for containerized applications
Should have experience in the following areas:
○ API framework: Python FastAPI
○ Processing engine: Apache Spark
○ Messaging and streaming data processing: Kafka
○ Storage: MongoDB, Redis/Bigtable
○ Orchestration: Airflow
○ Deployment: GKE and Cloud Run
● Perform data ingestion from various sources and apply transformation and
cleansing logic to ensure high-quality data delivery.
● Implement and enforce data quality checks, validation rules, and monitoring.
● Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
● Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
● Write complex SQL queries for data extraction and validation from relational
databases such as SQL Server, Oracle, or PostgreSQL.
● Document pipeline designs, data flow diagrams, and operational support
procedures.
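As a rough illustration of the pipeline work described above, here is a minimal, self-contained sketch of an ETL step with transformation, cleansing, and data quality checks in plain Python. In the role itself this logic would typically run inside a Dataflow transform or a Cloud Composer (Airflow) task; the record fields and validation rules here are hypothetical.

```python
from datetime import datetime

# Hypothetical cleansing + data-quality step, as might run inside a
# Dataflow transform or an Airflow task. Field names are illustrative.
REQUIRED_FIELDS = {"id", "email", "signup_date"}

def is_valid(record: dict) -> bool:
    """Data quality gate: required fields present and email well-formed."""
    return REQUIRED_FIELDS <= record.keys() and "@" in record.get("email", "")

def transform(record: dict) -> dict:
    """Normalize a raw record: trim strings, lowercase email, parse date."""
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in record.items()}
    cleaned["email"] = cleaned["email"].lower()
    cleaned["signup_date"] = datetime.strptime(
        cleaned["signup_date"], "%Y-%m-%d").date()
    return cleaned

def run_pipeline(raw_records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split input into loadable rows and rejects kept for monitoring."""
    good, bad = [], []
    for rec in raw_records:
        if is_valid(rec):
            good.append(transform(rec))
        else:
            bad.append(rec)
    return good, bad
```

Rejected rows returned by `run_pipeline` would feed the validation monitoring and reporting mentioned above, rather than being silently dropped.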
Required Skills:
● 10 to 12 years of hands-on experience in Python for backend or data engineering
projects.
● Strong understanding and working experience with GCP cloud services
(especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
● Solid understanding of data pipeline architecture, data integration, and
transformation techniques.
● Experience in working with version control systems like GitHub and knowledge of
CI/CD practices.
● Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and Cloud Composer DAGs.
● Strong experience in SQL with at least one enterprise database (SQL Server,
Oracle, PostgreSQL, etc.).
● Experience with data migrations from on-premises data sources to cloud platforms.
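To illustrate the SQL extraction-and-validation requirement above, here is a small sketch using Python's standard-library SQLite driver as a stand-in for the enterprise databases named (SQL Server, Oracle, PostgreSQL). The table, column names, and check are hypothetical examples of a common data quality query.

```python
import sqlite3

def duplicate_key_count(conn: sqlite3.Connection, table: str, key: str) -> int:
    """Count key values that appear more than once: a typical validation
    query run after a load to catch duplicated business keys."""
    sql = f"""
        SELECT COUNT(*) FROM (
            SELECT {key} FROM {table}
            GROUP BY {key}
            HAVING COUNT(*) > 1
        )
    """
    return conn.execute(sql).fetchone()[0]

if __name__ == "__main__":
    # Hypothetical orders table with one duplicated order_id.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 9.5), (2, 3.0), (2, 4.0)])
    print(duplicate_key_count(conn, "orders", "order_id"))
```

Against PostgreSQL or SQL Server the same query shape applies; only the driver and connection setup change.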
Good to Have (Optional Skills):
● Experience working with the Snowflake cloud data platform.
● Hands-on knowledge of Databricks for big data processing and analytics.
● Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
Additional Details:
● Excellent problem-solving and analytical skills.
● Strong communication skills and ability to collaborate in a team environment.
Education:
● Bachelor's degree in Computer Science, a related field, or equivalent experience