
Data Engineer - Google Cloud Platform
Haruto Technologies LLP · India



Job description

Role: GCP Data Engineer

Location: Remote

Experience Required: 5+ Years

About the Role:

We are looking for an experienced GCP Data Engineer with strong expertise in building and optimizing data pipelines, big data processing, and data warehousing solutions on Google Cloud Platform. The ideal candidate should be hands-on with BigQuery, DataProc, PySpark, Python, and SQL, with proven experience in designing scalable and efficient data solutions.

Key Responsibilities:

- Design, develop, and maintain data pipelines and ETL workflows on GCP.
- Implement large-scale data processing frameworks using GCP DataProc (Spark/Hadoop ecosystem).
- Work extensively with GCP BigQuery to build data models, optimize queries, and support analytical needs.
- Develop and maintain PySpark jobs for transforming and processing big datasets (a minimal sketch follows the job description).
- Write high-performance SQL queries and Python scripts for data processing and automation.
- Optimize data pipelines for scalability, reliability, and performance.
- Ensure data security, governance, and compliance standards are met across projects.
- Collaborate with cross-functional teams (Data Analysts, Architects, and Business Stakeholders) to deliver actionable insights.
- Troubleshoot complex data engineering issues and provide long-term solutions.

Required Skills & Experience:

- 5-6 years of hands-on Data Engineering experience.
- Strong expertise in:
  1. GCP DataProc (cluster management, Spark/Hadoop jobs).
  2. GCP BigQuery (data warehouse, analytics, performance tuning).
  3. PySpark for big data processing.
  4. Python (scripting, automation, and data workflows).
  5. SQL (advanced query optimization and data manipulation).
- Solid understanding of data lakes, data warehouses, and cloud-native architectures.
- Experience in ETL/ELT design, orchestration, and pipeline automation.
- Ability to work independently in a remote, client-facing environment.

(ref:hirist.tech)
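For illustration, here is a minimal sketch of the kind of PySpark job the responsibilities above describe: reading a BigQuery table on a DataProc cluster, aggregating it, and writing the result back. It assumes the spark-bigquery connector that ships with Dataproc images; every project, dataset, table, and bucket name below is a hypothetical placeholder, not part of the posting.

```python
# Minimal sketch (assumptions noted above): a PySpark job that reads a
# BigQuery table, computes a daily rollup, and writes it back to BigQuery.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

# Read a source table via the spark-bigquery connector
# ("my-project.sales.orders" is a hypothetical table name).
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Example transformation: daily revenue per region.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"))
)

# Write the aggregate back; the connector stages data through a
# temporary GCS bucket ("my-staging-bucket" is hypothetical).
(
    daily_revenue.write.format("bigquery")
    .option("table", "my-project.sales.daily_revenue")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```

A job like this would typically be submitted to a cluster with something along the lines of `gcloud dataproc jobs submit pyspark rollup.py --cluster=my-cluster --region=us-central1` (cluster and region names hypothetical).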


Required Skill Profession: Computer Occupations


