Job Description:
We are seeking a motivated and skilled Kafka DevOps Engineer to join our team. The ideal candidate will be responsible for designing and implementing data pipelines using Apache Kafka. This role requires expertise in Kafka platform management, integration with various data sources and sinks, and ensuring data integrity and security.

Roles & Responsibilities:
- Design and implement data pipelines using Apache Kafka.
- Collaborate with other development teams to integrate Kafka with various data sources and sinks.
- Develop streaming applications using Kafka Streams and Kafka Connect.
- Ensure data integrity and security by implementing proper data access and encryption methods.
- Troubleshoot and optimize Kafka clusters for performance and reliability.
- Create Kafka-based applications using KTables, KStreams, KSQL, and SMTs.
- Create and update Kafka connectors.
- Design, develop, and maintain Kafka-based data streaming solutions, including producers, consumers, connectors, and Kafka Streams applications.

Skills & Requirements:
- Strong experience with Confluent Kafka and Confluent Cloud.
- Proficiency in Kafka Streams, Kafka Connect, and Kafka ecosystem components.
- Experience with Terraform and Kubernetes.
- Certification in Kafka, such as Confluent Certified Developer or similar, is preferred.
- Ability to design, develop, and maintain Kafka-based data streaming solutions.
- Prior experience in DevOps roles with a focus on Kafka platform management.
- Excellent problem-solving skills and the ability to troubleshoot and optimize Kafka clusters.
- Strong communication skills and the ability to collaborate effectively across teams.

Educational Qualifications: Bachelor's/Master's

(ref:hirist.tech)