Hiring Company: Sky Systems, Inc. (SkySys)
Listed Location: New Delhi, India

Big Data Engineer (Spark | AWS | Scala/Java)



Job description

Role: Big Data Engineer
Position Type: Full-Time Contract (40 hrs/week)
Contract Duration: Long Term
Work Schedule: 8 hours/day (Mon-Fri)
Location: Hybrid - Hyderabad, India (3 days onsite/week)

We are looking for a skilled Big Data Engineer (5+ years' experience) to join Experian's dynamic data engineering team. You will design and optimize scalable data pipelines, APIs, and big data solutions on AWS to support our global data ecosystem.

Key Responsibilities:
  • Build and maintain scalable data pipelines with Spark/PySpark, Hadoop, and AWS (EMR, EC2, ECS).
  • Develop and integrate RESTful APIs and AWS Gateway services using Scala/Java.
  • Ensure data quality, performance, and reliability across platforms.
  • Collaborate with cross-functional teams and support CI/CD deployment processes.

Must-Haves:
  • 5+ years of experience in Big Data Engineering.
  • Strong skills in Spark/PySpark, Hadoop, and AWS (EMR, EC2, ECS).
  • Proficiency in Scala/Java for backend/API development.
  • Solid understanding of RESTful APIs and Git/CI/CD pipelines.

Nice-to-Haves:
  • Experience in data security, real-time processing, and data lake architectures.
  • Familiarity with Docker/Kubernetes.




