
Urgent! Data Engineer - Google Cloud Platform Job Opening In Panchkula – Now Hiring Mobile Programming LLC

Data Engineer Google Cloud Platform



Job description

Job Title: Data Engineer (GCP)
Location: Pune, Mumbai, Bangalore, Gurgaon, Panchkula, Mohali
Experience: 3+ years
Notice period: Immediate joiner

We are looking for a skilled and passionate Data Engineer with hands-on experience working on Google Cloud Platform (GCP).

The ideal candidate will have expertise in building and maintaining data pipelines, utilizing GCP services, and implementing best practices for data ingestion, storage, and processing.

If you have strong technical proficiency in Python or Java and enjoy solving complex challenges, we would love to hear from you!

Key Responsibilities:

GCP Services:
- Design, develop, and manage data pipelines leveraging GCP services such as Google Cloud Storage (GCS), Pub/Sub, Dataflow or Dataproc, BigQuery, and Airflow/Composer.
- Work on Python (preferred) or Java-based solutions to build robust, scalable, and efficient data pipelines.

ETL on GCP:
- Develop and maintain ETL pipelines using Python/Java, ensuring data is processed efficiently and accurately.
- Write efficient scripts and adhere to best practices for data ingestion, transformation, and storage on GCP.
- Troubleshoot and resolve data pipeline issues, ensuring high performance and reliability.

Data Ingestion (Batch and Streaming):
- Implement batch and streaming data ingestion workflows on GCP (illustrative sketches of both follow this description).
- Design, optimize, and monitor data pipelines for both batch processing and real-time streaming, ensuring smooth and seamless data flow.

Database Expertise:
- Knowledge of relational (SQL) and non-relational (NoSQL) databases, both on-premise and in the cloud.
- Understanding of the differences between SQL and NoSQL databases, with experience working with at least two types of NoSQL databases.
- Expertise in working with databases on GCP, including BigQuery and others.

Data Warehouse Concepts:
- Apply knowledge of data warehouse concepts to help design efficient storage and processing solutions for large datasets.
- Familiarity with the principles of data warehousing, from data modeling to ETL processes, is required at a beginner to intermediate level.

Technical Skills Required:
- GCP Services: GCS, Pub/Sub, Dataflow, Dataproc, BigQuery, Airflow/Composer
- Programming: Python, Java
- ETL Development: Data pipelines, scripting
- Data Ingestion: Batch, streaming
- Databases: SQL, NoSQL
- Data Warehousing: Data warehouse concepts

(ref:hirist.tech)
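
To make the streaming-ingestion responsibility concrete, here is a minimal sketch (not part of the original posting) of a Python Apache Beam pipeline of the kind typically run on Dataflow: it reads JSON messages from a Pub/Sub subscription and appends them to a BigQuery table. The project, subscription, and table names are hypothetical placeholders; a production pipeline would also handle schemas, dead-lettering, and windowing.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    # Hypothetical resource names used purely for illustration.
    SUBSCRIPTION = "projects/example-project/subscriptions/events-sub"
    TABLE = "example-project:analytics.events"

    def run():
        options = PipelineOptions()
        options.view_as(StandardOptions).streaming = True  # real-time ingestion

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    TABLE,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )

    if __name__ == "__main__":
        run()

Such a pipeline can be tested locally with the DirectRunner and submitted to Dataflow by passing the usual --runner, --project, and --region pipeline options.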
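
For the batch side, a minimal Cloud Composer / Airflow 2.x DAG could orchestrate a daily load from GCS into BigQuery using the Google provider's GCSToBigQueryOperator, as sketched below. The bucket, object path, and destination table are assumptions for illustration, not details from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    # Hypothetical bucket, object path, and table names for illustration only.
    with DAG(
        dag_id="daily_gcs_to_bigquery_load",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_orders = GCSToBigQueryOperator(
            task_id="load_orders_to_bq",
            bucket="example-raw-data",
            source_objects=["orders/{{ ds }}/*.csv"],
            destination_project_dataset_table="example-project.analytics.orders",
            source_format="CSV",
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_APPEND",
        )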


Required Skill Profession

Computer Occupations





