
Data Engineer Job Opening – AXA Group, Gurugram



Job description

Data Engineer – Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities.

This data should not only be high quality, but also actionable – enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained competitive advantage.

Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model – disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer.

The role will support the team’s efforts towards creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. This role requires a team player who can work well with team members from other disciplines to deliver data in an efficient and strategic manner.

DISCOVER your opportunity

What will your essential responsibilities include?

· Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.

· Understand current and future data consumption patterns and architecture at a granular level, and partner with Architects to ensure optimal design of the data layers.



· Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, the choice of storage and querying technology, and performance tuning.

· Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and its implications for data consumers.

· Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in the data inventory and utilization of the Data Science Platform.



· Design prototypes and work in a fast-paced, iterative solution delivery model.

· Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables (a minimal sketch follows this list).

· Use Harness for the deployment pipeline.

· Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.

· Diagnose system performance issues related to data processing and implement solutions to address them.

· Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.

· Maintain integrity and quality across all pipelines and environments.

· Understand and follow secure coding practices to ensure code is not vulnerable.
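
To make the pipeline and monitoring responsibilities above concrete, here is a minimal PySpark sketch of the kind of Delta-table ETL job this list describes. It is an illustration only, not AXA XL’s actual implementation: the storage paths, the table name enterprise_lake.policies, and the column names are hypothetical placeholders.

# Minimal sketch of a PySpark ETL job on Azure Databricks writing to a
# partitioned Delta table. All paths, table names, and columns below are
# hypothetical placeholders.
import logging
import time

from pyspark.sql import SparkSession, functions as F

log = logging.getLogger("etl.policies")
spark = SparkSession.builder.appName("policy-etl").getOrCreate()

# Extract: read semi-structured raw data landed in ADLS (hypothetical path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/policies/")

# Transform: basic cleansing plus a derived column to partition on.
cleaned = (
    raw.filter(F.col("policy_id").isNotNull())
       .dropDuplicates(["policy_id"])
       .withColumn("ingest_date", F.current_date())
)

# Load: append to a Delta table partitioned by ingest_date, so downstream
# reads that filter on date scan only the partitions they need.
start = time.monotonic()
(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("ingest_date")
        .saveAsTable("enterprise_lake.policies"))

# Monitor: log a row count and duration so performance regressions are
# visible in the job logs.
log.info("loaded %d rows in %.1fs", cleaned.count(), time.monotonic() - start)

On Databricks a SparkSession already exists, so getOrCreate() simply reuses it; partitioning by ingest date is one instance of the partitioning strategies mentioned earlier in this list.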

You will report to the Application Manager.

SHARE your talent

We’re looking for someone who has these abilities and skills:

Required Skills and Abilities:

· Effective communication skills.

· Bachelor’s degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience.

· Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying.

· Relevant years of programming experience using Databricks.

· Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).

· Solid knowledge of network and firewall concepts.

· Solid experience writing, optimizing, and analyzing SQL (see the query-plan sketch after this list).

· Relevant years of experience with Python.

· Ability to break down complex data requirements and architect solutions into achievable targets.

· Robust familiarity with Software Development Life Cycle (SDLC) processes and workflows, especially Agile.

· Experience using Harness.

· Experience as a technical lead responsible for both individual and team deliveries.
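
As a small illustration of the SQL analysis skill above: Spark exposes query plans through EXPLAIN, a typical starting point for optimization work. This sketch reuses the spark session and the hypothetical enterprise_lake.policies table from the pipeline sketch earlier.

# Hypothetical sketch: inspecting a Spark SQL query plan before tuning.
plan = spark.sql("""
    EXPLAIN FORMATTED
    SELECT ingest_date, COUNT(*) AS policy_count
    FROM enterprise_lake.policies
    GROUP BY ingest_date
""")
plan.show(truncate=False)  # inspect scans, shuffles, and aggregates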

Desired Skills and Abilities:

· Experience working on big data migration projects.

· Experience with performance tuning at both the database and big data platform levels.

· Ability to interpret complex data requirements and architect solutions.

· Distinctive problem-solving and analytical skills combined with robust business acumen.

· Strong fundamentals in Parquet and Delta file formats (see the brief sketch after this list).

· Effective knowledge of the Azure cloud computing platform.

· Familiarity with reporting software – Power BI is a plus.

· Familiarity with DBT is a plus.

· Passion for data and experience working within a data-driven organization.

· You care about what you do, and what we do.
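
On the Parquet/Delta point above, a brief sketch may help: a Delta table is essentially Parquet data files plus a _delta_log transaction log, which is what enables transactional reads and time travel. Paths are again hypothetical, and spark is the session from the first sketch.

# Plain Parquet: columnar data files at a path, no transaction log.
claims_parquet = spark.read.parquet(
    "abfss://raw@examplelake.dfs.core.windows.net/claims_parquet/")

# Delta: Parquet files plus a _delta_log, giving ACID reads and writes.
claims_delta = spark.read.format("delta").load(
    "abfss://curated@examplelake.dfs.core.windows.net/claims_delta/")

# Time travel: query the table as of an earlier version.
claims_v0 = (spark.read.format("delta")
                  .option("versionAsOf", 0)
                  .load("abfss://curated@examplelake.dfs.core.windows.net/claims_delta/"))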

FIND your future

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks.

For mid-sized companies, multinationals, and even some inspirational individuals, we don’t just provide re/insurance, we reinvent it.

How?

By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty.

With an innovative and flexible approach to risk solutions, we partner with those who move the world forward.

Inclusion & Diversity




