Big Data Developer - Spark/Hadoop
We are seeking an experienced Big Data Developer with expertise in Hadoop, Spark, and Kafka to take complete ownership of the software development lifecycle, from requirements gathering to deployment.
This role is ideal for a proactive individual who can translate business use cases into robust, scalable, and efficient big data solutions.
You'll work across multiple projects, coordinate with QA/support teams, and ensure high-quality deliverables.
Key Responsibilities
- Stakeholder Engagement: Engage with stakeholders to understand use cases and translate them into functional and technical specifications (FSD & TSD).
- Solution Implementation: Implement scalable, efficient big data solutions.
- Cross-Project Coordination: Work across multiple projects, coordinating with QA/support engineers for test case preparation.
- Quality Assurance: Ensure deliverables meet high-quality standards.
- SQL & Code Optimization: Write and validate SQL queries and develop optimized code for data processing workflows.
- Testing & Documentation: Write unit tests and maintain documentation to ensure code quality and maintainability.
- Job Scheduling & Automation: Schedule and automate jobs via shell scripts.
- Performance Tuning: Optimize performance and resource usage in distributed systems, writing production-grade code.
- Communication & Collaboration: Effectively coordinate with business users, developers, and testers, and manage dependencies across teams.
Key Skills Required
Must Have:
- Hadoop
- Spark (Core & Streaming)
- Hive
- Kafka
- Shell Scripting
- SQL
- TSD/FSD documentation
Good to Have:
- Airflow
- Scala
- Cloud platforms (AWS/Azure/GCP)
- Agile methodology
General Requirements
- Employment Type: Permanent
- Notice Period: Immediate, or a maximum of 15 days
- Strong analytical and problem-solving skills.
- Ability to work in a dynamic, agile environment.
This role is both technically challenging and rewarding, offering the opportunity to work on large-scale, real-time data processing systems.
Skills Required
Hadoop, Spark, Hive, Kafka, Shell Scripting, SQL