Big Data Pipeline Architect (GCP)

Impetus – Bengaluru



Job Description

Job Title: GCP Data Engineer

Experience: 4–7 Years

Location: Bangalore / Gurgaon

Employment Type: Full-Time

About the Role

We are looking for an experienced GCP Data Engineer with a strong background in Big Data, PySpark, and Python, and hands-on experience with core Google Cloud Platform (GCP) services.

The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and analytics solutions that drive business insights.

Key Responsibilities

  • Design, develop, and maintain scalable and reliable data pipelines and ETL workflows using PySpark, Python, and GCP-native tools (a minimal PySpark sketch follows this list).
  • Work with large-scale datasets on BigQuery, Dataproc, Dataflow, Pub/Sub, and Cloud Storage.
  • Collaborate with data architects, analysts, and business stakeholders to define data models, transformation logic, and performance optimization strategies.
  • Implement best practices for data quality, data governance, and security on GCP.
  • Monitor and troubleshoot production data pipelines to ensure reliability and performance.
  • Optimize data workflows for cost efficiency and scalability in a cloud environment.
  • Integrate data from multiple sources, both batch and streaming, into centralized analytical platforms.
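
To make the pipeline-building responsibilities concrete, here is a minimal sketch, not Impetus's actual code: a batch PySpark job that reads raw JSON events from Cloud Storage, derives a daily aggregate, and writes it to BigQuery. It assumes a Dataproc cluster with the spark-bigquery connector available; the bucket, project, dataset, and table names are all hypothetical placeholders.

```python
# Minimal batch ETL sketch: Cloud Storage (JSON) -> transform -> BigQuery.
# All resource names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-etl").getOrCreate()

# Read raw event data from a (hypothetical) Cloud Storage bucket.
events = spark.read.json("gs://example-raw-bucket/events/2024-01-01/*.json")

# Keep valid rows and compute a per-day, per-type event count.
daily_counts = (
    events
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write to BigQuery; the spark-bigquery connector stages rows through a
# temporary Cloud Storage bucket.
(
    daily_counts.write.format("bigquery")
    .option("table", "example-project.analytics.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)

spark.stop()
```

In practice a job like this would be parameterized by run date and submitted to Dataproc (or rebuilt as a Dataflow/Beam pipeline for the streaming sources mentioned above) rather than hard-coding paths.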

Required Skills & Experience

  • 4–7 years of hands-on experience as a Data Engineer in large-scale data environments.
  • Strong expertise in Google Cloud Platform (GCP), including BigQuery, Dataproc, Dataflow, Pub/Sub, Cloud Storage, and Composer.
  • Proven experience with Big Data technologies and distributed data processing using PySpark and Spark SQL.
  • Strong programming skills in Python for data processing and automation.
  • Solid understanding of ETL design patterns, data warehousing, and data modeling concepts.
  • Experience with Airflow or other workflow orchestration tools (see the Composer/Airflow sketch after this list).
  • Strong debugging, performance tuning, and problem-solving skills.
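
For the orchestration requirement, here is a minimal sketch of a Cloud Composer (managed Airflow) DAG that runs the PySpark job above on Dataproc once a day. The DAG id, project, region, cluster, and script path are hypothetical, and the import assumes the apache-airflow-providers-google package on Airflow 2.4+.

```python
# Minimal orchestration sketch: a daily Airflow DAG submitting a PySpark
# job to Dataproc. All resource names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {
        "main_python_file_uri": "gs://example-code-bucket/etl/gcs_to_bigquery.py"
    },
}

with DAG(
    dag_id="daily_gcs_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule" needs Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    DataprocSubmitJobOperator(
        task_id="run_pyspark_etl",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )
```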


Occupational Category

Computer Occupations


