Job Description:
We are looking for a skilled Data Engineer with 5-8 years of experience and strong expertise in data integration, ETL pipelines, and cloud infrastructure.
The ideal candidate will be proficient in SQL, Python, and MongoDB, with hands-on experience in building scalable data pipelines and working across multiple databases.
The role requires a platform-agnostic mindset with exposure to AWS services, messaging systems, and monitoring tools.
The selected candidate will work at our client site in Delhi; this is a Work From Office (WFO) opportunity.
Experience: 5-8 years
Location: Delhi
Key Responsibilities:
- Design, develop, and maintain ETL pipelines and database schemas to support business and analytics needs.
- Work with multi-database architectures (SQL and NoSQL, including MongoDB), ensuring scalability and efficiency.
- Deploy and manage AWS resources such as Lambda functions and EC2 instances.
- Integrate and optimize streaming/messaging frameworks such as Kafka and caching systems like Redis.
- Collaborate with cross-functional teams to ensure seamless data flow across platforms.
- Monitor infrastructure and system performance using tools such as Grafana, CloudWatch, or equivalent monitoring solutions.
- Ensure data quality, security, and compliance standards are consistently maintained.
Required Skills & Experience:
- Strong hands-on experience with SQL, Python, and MongoDB.
- Proven experience in building and managing ETL pipelines.
- Ability to work in a platform-agnostic environment.
- Hands-on experience with AWS services (Lambda, EC2).
- Exposure to Kafka and/or Redis.
- Experience with monitoring tools (Grafana, CloudWatch, etc.).
- Strong problem-solving skills and ability to work in a fast-paced environment.