Job Description
<p><p><b>Description :</b><br/><br/><b>About the Role :</b><br/><br/>We are seeking a highly skilled Senior Backend Engineer with specialized expertise in data migration and complex systems integration.
This role is pivotal in ensuring a seamless and reliable transition of high-value automation data from external platforms into our core system.
The ideal candidate will possess deep technical knowledge of large-scale backend architecture, robust data pipelines, and practical experience with reliable error handling and asynchronous processing in a high-throughput environment.<br/><br/><b>Responsibilities :</b><br/><br/>- Architect and build an importer capable of migrating complex, high-value automation systems reliably and at extreme scale, focusing on performance, atomicity, and resilience during bulk operations.<br/><br/>- Design and implement services to parse complex external APIs, including handling rate limits, authentication (e.g., OAuth 2.0), and versioning, and transforming the ingested raw data into clear, usable, and normalized domain structures within our system.<br/><br/>- Ensure all data import processes are robust, idempotent, and resilient to partial failure, and handle large-volume bulk operations smoothly using optimized batching and transaction strategies.<br/><br/>- Develop monitoring, logging, and alerting systems specifically for the data migration pipeline to provide real-time visibility into migration health and success rates, and to surface data integrity issues immediately.<br/><br/>- Collaborate closely with product and engineering teams to define migration strategies, establish data mapping rules, and deliver a frictionless and reliable migration experience for end-users and customers.<br/><br/>- Drive engineering excellence by performing rigorous code reviews, defining best practices for integration testing, and advocating for scalable, maintainable system design across the backend services.<br/><br/><b>Requirements :</b><br/><br/>- Strong fundamental engineering skills encompassing system architecture design, implementation of clean, testable, and maintainable code (e.g., using principles like SOLID), and comprehensive unit/integration/end-to-end testing strategies.<br/><br/>- Demonstrated ability to design and deliver reliable, high-throughput data pipelines and integrations, with a solid understanding of the challenges of eventual consistency and distributed 
transactions.<br/><br/>- Deep database knowledge in both SQL and NoSQL environments, specifically MongoDB and/or Firestore, including schema design, performance tuning, indexing, and complex query optimization for large datasets.<br/><br/>- Proven expertise handling complex external API integrations and performing non-trivial data transformation (ETL/ELT) processes efficiently and correctly.<br/><br/>- Strong knowledge of message queues such as Pub/Sub, experience with event-driven architectures and distributed processing, and effective management of asynchronous workloads with systems like Redis.<br/><br/>- Practical experience in designing and implementing reliable error handling at scale, including retry mechanisms, dead-letter queues, and graceful degradation strategies to maintain system stability under load.<br/><br/>- Excellent communication skills and a proven ability to collaborate effectively across cross-functional teams (Product, Design, DevOps) and to clearly advocate for and document technical solutions and engineering best practices.<br/><br/><b>Preferred Skills :</b><br/><br/>- Expertise in a relevant backend language such as Go (Golang) or Node.js/TypeScript in a high-performance setting.<br/><br/>- Prior experience with data governance, security, and compliance requirements related to sensitive customer data.<br/><br/>- Experience in a regulated industry or with systems requiring high levels of data integrity and transactional guarantees.</p></p> (ref:hirist.tech)