
GCP Data Engineer



Job description

We are a consulting company full of technology-minded, happy people!

We love technology, we love design and we love quality.

Our diversity makes us unique and creates an inclusive and welcoming workplace where each individual is highly valued.

With us, each individual can be themselves and respects others for who they are. We believe that when a fantastic mix of people gather and share their knowledge, experiences, and ideas, we can help our customers on a completely different level.

We are looking for people who want to grow with us!

With us, you have real opportunities to take the next steps in your career and to take on significant responsibility.

We are seeking a skilled and forward-thinking Data Engineer to join our Emerging Tech team.

Company: Aqilea India (formerly Soltia)

Employment Type: Full Time

Location: Bangalore (Hybrid)

Experience: 4.5 to 7 years

About the Role:

We are seeking a skilled Data Engineer with strong expertise in building and optimizing data pipelines, managing large-scale datasets, and enabling data-driven decision-making.

The ideal candidate should have solid experience in SQL, Python, and cloud-based data platforms, with a focus on Google Cloud Platform (GCP).

Key Responsibilities:

  • Design, build, and maintain scalable and efficient data pipelines.
  • Develop and optimize data models in GCP BigQuery.
  • Work with Apache Spark / PySpark for large-scale data processing (a minimal sketch follows this list).
  • Implement and manage data workflows using Airflow and dbt.
  • Collaborate with analysts, data scientists, and business stakeholders to deliver high-quality data solutions.
  • Ensure data quality, reliability, and compliance across platforms.
  • Monitor and optimize performance of data pipelines and queries.
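
To make the responsibilities above concrete, here is a minimal sketch of the kind of pipeline they describe: a PySpark job that reads a BigQuery table, aggregates it, and writes the result back. It assumes a cluster with Google's spark-bigquery connector available; the project, dataset, table, bucket, and column names are placeholders, not details from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical names: replace the project, dataset, tables, and bucket with real ones.
    SOURCE_TABLE = "my-project.raw.events"
    TARGET_TABLE = "my-project.analytics.daily_event_counts"
    TEMP_BUCKET = "my-temp-bucket"

    spark = SparkSession.builder.appName("daily_event_counts").getOrCreate()

    # Read raw events from BigQuery via the spark-bigquery connector.
    events = spark.read.format("bigquery").option("table", SOURCE_TABLE).load()

    # Transform: aggregate daily event counts per user.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("event_date", "user_id")
        .agg(F.count("*").alias("event_count"))
    )

    # Write the result back to BigQuery; the connector stages data through a GCS bucket.
    (
        daily_counts.write.format("bigquery")
        .option("table", TARGET_TABLE)
        .option("temporaryGcsBucket", TEMP_BUCKET)
        .mode("overwrite")
        .save()
    )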

Required Skills & Experience:

  • Strong proficiency in SQL and Python.
  • Hands-on experience with GCP BigQuery for data warehousing.
  • Expertise in Apache Spark / PySpark for data transformations.
  • Experience with dbt for data modeling and transformation.
  • Knowledge of Apache Airflow for orchestration and scheduling (see the sketch after this list).
  • Solid understanding of data engineering best practices, performance optimization, and data governance.
  • Experience working in agile, collaborative environments.
  • Candidates from Tier 1 institutes (IITs, NITs, IIITs, BITS Pilani).
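
As a rough illustration of the Airflow and dbt skills listed above, the sketch below defines a daily Airflow DAG that runs dbt models and then their tests. It assumes Airflow 2.4+ (for the `schedule` argument) with the BashOperator, and a dbt project at a hypothetical path; none of these names come from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical dbt project location and target.
    DBT_DIR = "/opt/dbt/analytics"

    with DAG(
        dag_id="daily_dbt_models",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command=f"dbt run --project-dir {DBT_DIR} --target prod",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command=f"dbt test --project-dir {DBT_DIR} --target prod",
        )

        # Build the models first, then validate them with dbt tests.
        dbt_run >> dbt_test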

Notice Period: Immediate to 15 Days Only

Work Location: Bangalore (Hybrid)

Form of employment: Full-time until further notice; we apply a 6-month probationary period.

We interview candidates on an ongoing basis, so do not wait to submit your application.




