
Kafka Developer Job Opening – Now Hiring at R Systems



Job description

Job Title: Kafka Developer

Location: Noida / Chennai / Pune

Education: B.E. / B.Tech.

Years of Experience: 8-10

About us

At R Systems we are shaping the future of technology by designing cutting-edge software products, platforms, and digital experiences that drive business growth for our clients.

Our product mindset and advanced engineering capabilities in Cloud, Data, AI, and Customer Experience empower us to deliver innovative solutions to key players across the high-tech industry.

This includes ISVs, SaaS, and Internet companies, as well as leading organizations in telecom, media, healthcare, finance, and manufacturing.

We are Great Place To Work Certified in 10 countries where we have a full-time workforce: India, the USA, Canada, Poland, Romania, Moldova, Indonesia, Singapore, Malaysia, and Thailand.

Mandatory Requirements:

- A degree in Computer Science, IT, or a related field.
- Kafka Expertise – Strong knowledge of Kafka architecture, brokers, producers, consumers, and stream processing (a minimal producer sketch follows this list).
- Programming Skills – Proficiency in Java, Scala, or Python for developing Kafka-based applications.
- Big Data & Streaming Technologies – Experience with Spark, Flink, or Apache Storm is a plus.
- Database Knowledge – Familiarity with SQL and NoSQL databases like Cassandra, MongoDB, or PostgreSQL.
- Cloud & DevOps – Experience with cloud platforms (AWS, Azure, GCP) and Kubernetes/Docker.
- Event-Driven Architecture – Understanding of event-driven and microservices architectures.
- Monitoring & Debugging – Experience with Kafka monitoring tools like Confluent Control Center, Kafka Manager, or ELK stack.
- Security & Scalability – Knowledge of Kafka security, access control, and scaling strategies.
- Problem-Solving & Communication – Strong analytical skills and the ability to work in cross-functional teams.
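
To make the producer/consumer model referenced above concrete, here is a minimal Java producer sketch using the standard kafka-clients API. It is illustrative only and not part of the role description; the broker address (localhost:9092), topic name (orders), and record payload are placeholder assumptions.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; a real deployment lists several bootstrap servers.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all waits for the full in-sync replica set: better durability, higher latency.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Records sharing a key go to the same partition, preserving per-key ordering.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-123", "{\"amount\": 42}");
            // send() is asynchronous; the callback reports the partition/offset or the failure.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Written to %s-%d at offset %d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}

A matching consumer would subscribe to the same topic with a KafkaConsumer, poll records in a loop, and commit offsets once processing succeeds; the deserializer and bootstrap settings mirror the producer configuration.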

Roles & Responsibilities:

- Kafka Application Development – Design, develop, and maintain real-time data streaming applications using Apache Kafka (a minimal Kafka Streams sketch follows this list).
- Topic Management – Create and manage Kafka topics, partitions, and replication factors.
- Data Pipeline Development – Build and optimize data pipelines for real-time data processing.
- Producer & Consumer Implementation – Develop Kafka producers and consumers for seamless data exchange between systems.
- Integration & Connectivity – Integrate Kafka with databases, cloud services, and other messaging platforms.
- Performance Optimization – Monitor and fine-tune Kafka clusters for low latency and high throughput.
- Security & Compliance – Implement security best practices, including SSL, SASL, and authentication mechanisms.
- Cluster Administration – Manage Kafka clusters, brokers, and ZooKeeper, and ensure high availability.
- Monitoring & Logging – Use tools like Prometheus, Grafana, or Kafka Manager for monitoring and troubleshooting.
- Documentation & Best Practices – Maintain documentation for Kafka configurations, data flow, and architectural decisions.
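
As an illustration of the streaming-application and pipeline work described above, here is a minimal Kafka Streams topology in Java. It is a sketch of the API shape rather than a prescribed design: the application id, broker address, and topic names (input-events, output-events) are assumptions, and the transformation is deliberately trivial.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class UppercasePipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        // The application id doubles as the consumer group and the prefix for internal topics.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-pipeline-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read from one topic, transform each value, and write the result to another topic.
        KStream<String, String> source = builder.stream("input-events");
        source.mapValues(value -> value.toUpperCase())
              .to("output-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on shutdown so state stores and offsets are flushed.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

In a real pipeline the mapValues step would typically give way to joins, windowed aggregations, or stateful processors, but the configure-build-start lifecycle stays the same.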

Mandatory Skills

Java, Apache Kafka, Kafka Streams, Apache ZooKeeper, Kafka cluster administration, Microservices

