
Senior Data Engineer/Architect Job Opening in Bengaluru – Now Hiring at Infosys

Senior Data Engineer/Architect



Job description

As a Software Developer, you will work in a constantly evolving environment, shaped by technological advances and the strategic direction of the organization you work for.

You will create, maintain, audit, and improve systems to meet particular needs, often as advised by a systems analyst or architect, testing both hardware and software systems to diagnose and resolve system faults.

The role also covers writing diagnostic programs and designing and writing code for operating systems and software to ensure efficiency.

When required, you will make recommendations for future developments.

Benefits of Joining Us


  • Challenging Projects: Work on cutting-edge projects and solve complex technical problems.

  • Career Growth: Advance your career quickly and take on leadership roles.

  • Mentorship: Learn from experienced mentors and industry experts.

  • Global Opportunities: Work with clients from around the world and gain international experience.

  • Competitive Compensation: Receive attractive compensation packages and benefits.

If you're passionate about technology and want to work on challenging projects with a talented team, becoming an Infosys Power Programmer could be a great career choice.



Mandatory Skills

  • AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, Step Functions, and Lambda
  • Experience with the Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) data integration patterns

  • Experience designing and building data pipelines

  • Development experience in one or more object-oriented programming languages, preferably Python
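The ETL and ELT patterns named above differ only in where the transformation happens relative to the load. A minimal sketch in plain Python, using a hypothetical in-memory "warehouse" (a real pipeline on this stack would target AWS Glue, S3, and Redshift instead):

```python
# Minimal ETL vs. ELT sketch. The in-memory "warehouse" (a list) is a
# stand-in for a real target store such as Redshift; all names here are
# illustrative, not part of any AWS API.

def extract(source):
    """Pull raw records from a source system."""
    return list(source)

def transform(records):
    """Normalize records, before (ETL) or after (ELT) loading."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in records
    ]

def load(records, warehouse):
    """Append records to the target store."""
    warehouse.extend(records)
    return warehouse

raw = [{"name": "  alice ", "amount": "10.5"}, {"name": "BOB", "amount": "3"}]

# ETL: transform in flight, load only clean data.
etl_warehouse = load(transform(extract(raw)), [])

# ELT: load raw data first, then transform inside the warehouse,
# where the engine's compute (e.g. Redshift SQL) would normally do the work.
elt_staging = load(extract(raw), [])
elt_warehouse = transform(elt_staging)

assert etl_warehouse == elt_warehouse
```

Either way the result is identical; the trade-off is where compute and raw-data storage live, which is why ELT is common on warehouses like Redshift that can transform data cheaply after landing it.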


Job Specs

  • 5+ years of in-depth, hands-on experience developing, testing, deploying, and debugging Spark jobs written in Scala on the Hadoop platform
  • In-depth knowledge of Spark Core, working with RDDs, and Spark SQL
  • In-depth knowledge of Spark optimization techniques and best practices
  • Good knowledge of Scala functional programming: Try, Option, Future, and collections
  • Good knowledge of Scala OOP: classes, traits, objects (singleton and companion), and case classes
  • Good understanding of Scala language features: the type system, implicits/givens
  • Hands-on experience working in a Hadoop environment (HDFS/Hive), AWS S3, and EMR
  • Python programming skills
  • Working experience with workflow orchestration tools such as Airflow and Oozie
  • Experience making API calls from Scala
  • Understanding of and exposure to file formats such as Apache Avro, Parquet, and JSON
  • Good to have: knowledge of Protocol Buffers and geospatial data analytics
  • Experience writing test cases using frameworks such as ScalaTest
  • In-depth knowledge of build tools such as Gradle and sbt
  • Experience using Git: resolving conflicts and working with branches
  • Strong programming skills grounded in data structures and algorithms
  • Excellent analytical skills
  • Good communication skills
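The specs above ask for test cases written with frameworks such as ScalaTest. As a language-neutral illustration of the same habit, here is a unit test for a small, hypothetical pipeline helper (`dedupe_keep_latest` is invented for this sketch), written with Python's standard `unittest` since the role also requires Python skills:

```python
import unittest

def dedupe_keep_latest(records):
    """Keep one record per "id", preferring the highest timestamp "ts".
    Hypothetical helper; it stands in for the kind of deduplication
    logic a Spark pipeline might apply before writing to a warehouse."""
    latest = {}
    for rec in records:
        key = rec["id"]
        if key not in latest or rec["ts"] > latest[key]["ts"]:
            latest[key] = rec
    # Sort for a deterministic output order.
    return sorted(latest.values(), key=lambda r: r["id"])

class DedupeTest(unittest.TestCase):
    def test_latest_wins(self):
        rows = [{"id": 1, "ts": 5}, {"id": 1, "ts": 9}, {"id": 2, "ts": 1}]
        out = dedupe_keep_latest(rows)
        self.assertEqual(out, [{"id": 1, "ts": 9}, {"id": 2, "ts": 1}])

    def test_empty_input(self):
        self.assertEqual(dedupe_keep_latest([]), [])

# Run with: python -m unittest <module_name>
```

A ScalaTest suite for the equivalent Scala function would follow the same shape: one small, deterministic assertion per behavior, including the empty-input edge case.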

Qualification

7–10 years in the industry

BE/B.Tech in Computer Science or equivalent


Required Skill Profession

Computer Occupations


