We are seeking a hands-on Python AWS Data Engineer to design, develop, and maintain robust ETL pipelines and cloud-based data solutions.
The ideal candidate will have strong expertise in Python, Spark, SQL, and AWS services, with experience in building scalable data pipelines, implementing data transformations, and ensuring high-quality, maintainable code.

You will collaborate closely with data architects, analysts, and business stakeholders to enable data-driven decision-making while working in an Agile environment.
 
Design, develop, and maintain ETL pipelines using Python, Spark, and AWS services.

Build, test, debug, and document Python and Spark applications.

Work with AWS services including S3, ECS, Lambda, Glue, DynamoDB, SQS, SNS, IAM, and Athena.

Write and optimize SQL queries, joins, and stored procedures for relational databases.

Implement unit tests and validation using the pytest or unittest frameworks.

Collaborate with DevOps teams to maintain CI/CD pipelines using Git and Jenkins/Bamboo.
 
Ensure scalability, performance, and reliability of data pipelines.
 
Participate in Agile ceremonies and contribute to sprint planning, code reviews, and retrospectives.
 
Document solutions, data flows, and ETL transformations for knowledge sharing and compliance.
 
3–8 years of hands-on experience in Python, Spark, SQL, and ETL development.

Strong experience with AWS cloud services and cloud-native data solutions.

Proficiency in ETL architecture and pipeline design.

Experience with unit testing frameworks such as pytest and unittest.

Solid understanding of RDBMS concepts, including SQL queries, joins, and stored procedures.

Exposure to CI/CD and build automation tools (Git, Jenkins/Bamboo).

Familiarity with Bitbucket, GitHub, Jira, Splunk, and PostgreSQL is a plus.

Excellent verbal and written communication skills.

Ability to work in Agile/Scrum environments.

Investment Management domain knowledge is an added advantage.
 
Empowering innovators to build the future of digital enterprise.
 
Work on high-impact data engineering projects using cutting-edge cloud technologies.

Collaborative, growth-focused environment that values innovation and ownership.

Competitive pay, hybrid work flexibility, and exposure to enterprise-scale data platforms.