
Apache Spark / Scala – Bengaluru (Employer: Confidential)




Job description

Responsibilities:

  • Develop and maintain data processing workflows using Apache Spark and Scala
  • Implement batch and streaming data pipelines
  • Optimize Spark jobs for better performance and scalability
  • Collaborate with data engineers and analysts to deliver data solutions
  • Debug and resolve issues in production big data environments
  • Integrate with data storage systems like HDFS, Kafka, and NoSQL databases
  • Write clean, maintainable code with best practices in mind
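The batch-pipeline work described above can be prototyped without a cluster: Spark's RDD and Dataset APIs deliberately mirror Scala's collection operations (`filter`, `groupBy`, `map`), so the per-record logic of a job can be sketched and unit-tested on plain collections before being ported to a `SparkSession`. The record shape and field names below are invented for illustration and are not from this posting.

```scala
// Hedged sketch: the core transformation of a batch pipeline, expressed on
// plain Scala collections. In Spark the same filter/groupBy/map chain would
// run on an RDD or Dataset with an identical shape.
final case class Event(userId: String, action: String, bytes: Long)

object BatchSketch {
  // Aggregate total bytes per user, keeping only "download" events.
  def bytesPerUser(events: Seq[Event]): Map[String, Long] =
    events
      .filter(_.action == "download")           // analogue of a pushed-down predicate
      .groupBy(_.userId)                        // analogue of the shuffle stage
      .map { case (user, evs) => user -> evs.map(_.bytes).sum }

  def main(args: Array[String]): Unit = {
    val sample = Seq(
      Event("a", "download", 100L),
      Event("a", "download", 50L),
      Event("b", "view", 10L),
      Event("b", "download", 7L)
    )
    println(bytesPerUser(sample)) // a -> 150, b -> 7
  }
}
```

Keeping the transformation as a pure function over a `Seq` also makes it easy to test in CI before the same logic is wired into a Spark job.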

Required Skills:

  • Strong programming skills in Scala and hands-on experience with Apache Spark
  • Knowledge of Spark Core, Spark SQL, Spark Streaming, and MLlib
  • Experience with Hadoop ecosystem components (HDFS, Hive, Kafka)
  • Familiarity with functional programming concepts
  • Experience with data serialization formats (Parquet, Avro, ORC)
  • Version control (Git) and CI/CD understanding
  • Good problem-solving and communication skills
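To make the "functional programming concepts" requirement concrete, here is a small, hedged illustration of the idioms an interviewer typically means by it: immutable data, higher-order functions, pattern matching, and `Option` instead of null. All names and the `key=value` record format are invented for the example.

```scala
// Hedged illustration of common functional idioms in Scala:
// higher-order functions, pattern matching, and Option for safe parsing.
object FpConcepts {
  // Higher-order function: the parser is passed in as an argument.
  // flatMap over Option silently drops records that fail to parse.
  def parseAll[A](raw: Seq[String])(parse: String => Option[A]): Seq[A] =
    raw.flatMap(parse)

  // Pattern matching over a simple "key=value" line format.
  def parsePair(line: String): Option[(String, Int)] =
    line.split("=", 2) match {
      case Array(k, v) => v.toIntOption.map(n => (k.trim, n))
      case _           => None
    }

  def main(args: Array[String]): Unit = {
    val input = Seq("a=1", "b=2", "broken", "c=notanumber")
    println(parseAll(input)(parsePair)) // keeps only (a,1) and (b,2)
  }
}
```

The same shape (a total function returning `Option`, lifted over a collection with `flatMap`) carries over directly to cleaning malformed records in a Spark `Dataset`.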


Skills Required
Spark Core, Spark SQL, Spark Streaming, HDFS, Hive, Kafka


Required Skill Profession

Computer Occupations


