
Spark & Scala Developer – Essex Company, India



Job description

Job Description:

- Formal training or certification in software engineering concepts with at least 3 years of applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Proficiency in coding in Java & Spark
- Knowledge of GKP & AWS Databricks
- Experience developing, debugging, and maintaining code in a corporate environment with modern programming languages
- Overall understanding of the Software Development Life Cycle
- Solid grasp of agile methodologies and practices such as CI/CD, Application Resiliency, and Security
- Demonstrated knowledge of software applications and technical processes within a technical discipline
- At least 3 years of experience and strong knowledge of the Scala programming language
- Work independently on assigned tasks and take complete ownership from design to deployment
- Collaborate with cross-functional teams on requirements gathering, design discussions, and delivery planning
- Guide and mentor junior developers in coding best practices and problem solving
- Work with RDBMS and NoSQL databases to design and optimize storage solutions
- Integrate applications with messaging services such as Kafka and MQ for event-driven architectures
- Ensure smooth deployments through CI/CD pipelines using Jenkins and Docker
- Write clean, maintainable, and efficient Scala code following best practices
- Good knowledge of fundamental data structures and their usage
- At least 2 years of experience designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies
- Expertise in Spark Core, Spark SQL, and Spark Streaming
- Experience with Hadoop, HDFS, Hive, and other Big Data technologies
- Familiarity with data warehousing and ETL concepts and techniques
- Expertise in database concepts and SQL/NoSQL operations
- UNIX shell scripting for scheduling/running application jobs is an added advantage
- At least 3 years of experience in project development life cycle activities and maintenance/support projects
- Work in an Agile environment, participating in daily scrum standups, sprint planning, reviews, and retrospectives
- Understand project requirements and translate them into technical solutions that meet project quality standards
- Ability to work in a team with diverse/multiple stakeholders and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues
- Strong problem-solving and analytical skills
- Excellent verbal and written communication skills
- Experience with, and a desire to work in, a global delivery environment
- Stay up to date with new technologies and industry trends in development

(ref:hirist.tech)
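As a flavour of the Spark SQL and Scala skills the requirements above describe, here is a minimal sketch of a batch aggregation job. It is illustrative only, not code from the employer: the application name, input path, column names, and output path are all hypothetical.

```scala
// Minimal Spark SQL sketch in Scala: read a CSV, aggregate, write Parquet.
// All paths and column names (userId, eventType) are hypothetical examples.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("EventCounts")
      .master("local[*]") // local mode for illustration; a real job would run on a cluster
      .getOrCreate()

    import spark.implicits._

    // Hypothetical input: a CSV with header columns userId, eventType, ts
    val events = spark.read
      .option("header", "true")
      .csv("data/events.csv")

    // Count events per user and event type, most frequent first
    val counts = events
      .groupBy($"userId", $"eventType")
      .agg(count("*").as("n"))
      .orderBy(desc("n"))

    // Write the aggregated result as Parquet
    counts.write.mode("overwrite").parquet("out/event_counts")

    spark.stop()
  }
}
```

The same `groupBy`/`agg` pattern carries over to Spark Streaming work (with a streaming source such as Kafka in place of the CSV read), which is the event-driven integration the posting mentions.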


Required Skill Profession

Computer Occupations




