Job Overview

Category: Computer Occupations
        
Ready to Apply?

Take the Next Step in Your Career
Join Impetus and advance your career in Computer Occupations.

Apply for This Position
        
Job Description

Job Title: GCP Data Engineer
Experience: 4–7 Years
Location: Bangalore / Gurgaon
Employment Type: Full-Time

About the Role
We are looking for an experienced GCP Data Engineer with a strong background in Big Data, PySpark, and Python, and hands-on experience with core Google Cloud Platform (GCP) services. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and analytics solutions that drive business insights.
Key Responsibilities
- Design, develop, and maintain scalable and reliable data pipelines and ETL workflows using PySpark, Python, and GCP-native tools.
- Work with large-scale datasets on BigQuery, Dataproc, Dataflow, Pub/Sub, and Cloud Storage.
- Collaborate with data architects, analysts, and business stakeholders to define data models, transformation logic, and performance optimization strategies.
- Implement best practices for data quality, data governance, and security on GCP.
- Monitor and troubleshoot production data pipelines to ensure reliability and performance.
- Optimize data workflows for cost efficiency and scalability in a cloud environment.
- Integrate data from multiple sources, both batch and streaming, into centralized analytical platforms.
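The posting contains no code, but to illustrate the kind of logic behind the last responsibility (integrating batch and streaming sources into one analytical view), here is a minimal, hypothetical last-write-wins upsert sketched in plain Python. All names here (Event, merge_batch_and_stream) are invented for the sketch; a production pipeline would express the same idea with PySpark DataFrames on Dataproc or with Dataflow, not in-memory lists.

```python
from dataclasses import dataclass

# Hypothetical record shape for the sketch; real pipelines would use
# PySpark rows or BigQuery table schemas instead.
@dataclass(frozen=True)
class Event:
    key: str      # business key (e.g., customer id)
    value: float  # measured value
    ts: int       # event timestamp (e.g., epoch seconds)

def merge_batch_and_stream(batch, stream):
    """Upsert streaming events onto a batch snapshot: for each key,
    keep the record with the latest timestamp (last-write-wins)."""
    latest = {}
    for event in list(batch) + list(stream):
        current = latest.get(event.key)
        if current is None or event.ts > current.ts:
            latest[event.key] = event
    return latest

batch = [Event("a", 1.0, 100), Event("b", 2.0, 100)]
stream = [Event("a", 5.0, 200)]
merged = merge_batch_and_stream(batch, stream)
# "a" is updated by the newer streaming event; "b" keeps its batch value
```

The same dedup-by-latest-timestamp pattern appears in BigQuery MERGE statements and in streaming upserts, which is why event-time handling is a recurring interview topic for this role.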
Required Skills & Experience
- 4–7 years of hands-on experience as a Data Engineer in large-scale data environments.
- Strong expertise in Google Cloud Platform (GCP), including BigQuery, Dataproc, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.
- Proven experience with Big Data technologies and distributed data processing using PySpark and Spark SQL.
- Strong programming skills in Python for data processing and automation.
- Solid understanding of ETL design patterns, data warehousing, and data modeling concepts.
- Experience with Airflow or other workflow orchestration tools.
- Strong debugging, performance tuning, and problem-solving skills.
            
         
  
  
  
        
        
        
        
        