
Senior Data Engineer (Mumbai)



Job description

We are hiring a Senior Data Engineer to join our team.

At Kroll, we are building a strong, forward-looking data practice that integrates artificial intelligence, machine learning, and advanced analytics.

You will design, build, and integrate data pipelines from diverse sources, and collaborate with teams that serve the world's largest financial institutions, law enforcement bodies, and government agencies.

This role will partner with and have a dotted-line relationship with the Alternative Asset Advisory service line, supporting their strategic data initiatives and client delivery goals.

Day-to-day responsibilities include, but are not limited to:

  • Design and build robust, scalable organizational data infrastructure and architecture.
  • Identify and implement process improvements (e.g., infrastructure redesign, automation of data workflows, performance optimizations).
  • Select appropriate tools, services, and technologies to build resilient pipelines for data ingestion, transformation, and distribution.
  • Develop and manage ELT/ETL pipelines and related applications (a minimal sketch follows this list).
  • Collaborate with global teams to deliver fault-tolerant, high-quality data engineering solutions.
  • Perform monthly code quality audits and peer reviews to ensure consistency, readability, and maintainability across the engineering codebase.
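
For illustration, here is a minimal sketch of the kind of ELT/ETL step described above, assuming PySpark; the source path, column names, and output location (trades.csv, trade_ts, amount, /curated/trades) are hypothetical placeholders, not an actual Kroll schema.

```python
# Minimal extract-transform-load sketch with PySpark.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV landed by an upstream source.
raw = spark.read.option("header", True).csv("/landing/trades.csv")

# Transform: enforce types, drop malformed rows, derive a partition column.
clean = (
    raw.withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["trade_ts", "amount"])
       .withColumn("trade_date", F.to_date("trade_ts"))
)

# Load: write partitioned Parquet for downstream consumers.
clean.write.mode("overwrite").partitionBy("trade_date").parquet("/curated/trades")
```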

Requirements:

  • Proven experience building and managing ETL/ELT pipelines.
  • Advanced proficiency with Azure, AWS, and Databricks (with a focus on data services).
  • Deep knowledge of the Python Spark ecosystem (PySpark, Spark SQL) and relational databases.
  • Experience building REST APIs, Python SDKs, libraries, and Spark-based data services.
  • Hands-on expertise with modern frameworks and tools such as FastAPI, Pydantic, Polars, Pandas, Delta Lake, Docker, and Kubernetes (a minimal sketch follows this list).
  • Understanding of Lakehouse architecture, Medallion architecture, and data governance.
  • Experience with pipeline orchestration tools (e.g., Airflow, Azure Data Factory).
  • Strong communication skills and the ability to work cross-functionally with international teams.
  • Skilled in data profiling, cataloging, and mapping for technical data flows.
  • Understanding of API product management principles, including lifecycle strategy, documentation standards, and versioning.
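
As a loose illustration of the REST API and FastAPI/Pydantic items above, here is a minimal sketch of a data-service endpoint; the Trade model, route, and in-memory lookup are illustrative assumptions, not a real service contract.

```python
# Minimal data-service endpoint sketch using FastAPI and Pydantic.
# The model, route, and in-memory store are illustrative assumptions.
from datetime import date

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-data-service")

class Trade(BaseModel):
    trade_id: int
    trade_date: date
    amount: float

# Stand-in for a real lookup against a curated table.
_TRADES = {1: Trade(trade_id=1, trade_date=date(2024, 1, 2), amount=100.0)}

@app.get("/trades/{trade_id}", response_model=Trade)
def get_trade(trade_id: int) -> Trade:
    trade = _TRADES.get(trade_id)
    if trade is None:
        raise HTTPException(status_code=404, detail="trade not found")
    return trade
```

Such a service would typically be run under an ASGI server, e.g. uvicorn module_name:app.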

Desired Skills:

  • Deep understanding of cloud architecture (compute, storage, networking, security, cost optimization).
  • Experience tuning complex SQL/Spark queries and pipelines for performance.
  • Hands-on experience building Lakehouse solutions using Azure Databricks, ADLS, PySpark, etc. (a minimal sketch follows this list).
  • Familiarity with OOP, asynchronous programming, and batch processing paradigms.
  • Experience with CI/CD, Git, and DevOps best practices.
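
To make the Lakehouse and Medallion items concrete, here is a minimal sketch of a bronze-to-silver refinement step with Delta Lake on PySpark; the session configs are the standard open-source Delta Lake setup, while the lake paths and the event_id/event_ts columns are hypothetical.

```python
# Medallion-style bronze-to-silver refinement sketch with Delta Lake.
# Requires the delta-spark package; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("medallion-sketch")
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze holds raw, append-only events for auditability;
# silver holds deduplicated, typed records ready for analytics.
bronze = spark.read.format("delta").load("/lake/bronze/events")
silver = (
    bronze.dropDuplicates(["event_id"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```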


Skills Required
Azure, AWS, Databricks, Python


Required Skill Profession

Computer Occupations


