
Data Pipeline Engineer - ElasticSearch at Dexian, Bengaluru




Job description

Overview:

We are seeking a skilled and motivated Data Pipeline Engineer to join our team. In this role, you will manage and maintain critical data pipeline platforms that collect, transform, and transmit cyber event data to downstream platforms such as Elasticsearch and Splunk. You will be responsible for ensuring the reliability, scalability, and performance of the pipeline infrastructure while building complex integrations with cloud and on-premises cyber systems. Our key stakeholders are cyber teams, including security response, investigations, and insider threat.

Role Profile:

A successful applicant will contribute to several important initiatives, including:

- Collaborating with cyber teams to identify, onboard, and integrate new data sources into the platform.
- Designing and implementing data mapping, transformation, and routing processes to meet analytics and monitoring requirements.
- Developing automation tools that integrate with in-house-developed configuration management frameworks and APIs.
- Monitoring the health and performance of the data pipeline infrastructure.
- Acting as a top-level escalation point for complex troubleshooting, working with other infrastructure teams to resolve issues.
- Creating and maintaining detailed documentation for pipeline architecture, processes, and integrations.

Required Skills:

- Hands-on experience deploying and managing large-scale dataflow products such as Cribl, Logstash, or Apache NiFi.
- Hands-on experience integrating data pipelines with cloud platforms (e.g., AWS, Azure, Google Cloud) and on-premises systems.
- Hands-on experience developing and validating field extraction using regular expressions (see the first sketch after this description).
- A solid understanding of operating systems and networking concepts: Linux/Unix system administration, HTTP, and encryption.
- A good understanding of software version control, deployment, and build tools used in DevOps SDLC practices (Git, Jenkins, Jira).
- Strong analytical and troubleshooting skills.
- Excellent verbal and written communication skills.
- An appreciation of Agile methodologies, specifically Kanban.

Desired Skills:

- Enterprise experience with a distributed event streaming platform such as Apache Kafka, AWS Kinesis, Google Pub/Sub, or MQ.
- Infrastructure automation and integration experience, ideally using Python and Ansible.
- Familiarity with cybersecurity concepts, event types, and monitoring requirements.
- Experience parsing and normalizing data in Elasticsearch using the Elastic Common Schema (ECS) (see the second sketch below).

(ref:hirist.tech)
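To make the field-extraction requirement concrete, here is a minimal Python sketch that extracts and validates fields from a cyber event line using a named-group regular expression. The log format, field names, and values are hypothetical examples for illustration only, not the actual data sources handled in this role.

```python
import re

# Hypothetical firewall-style cyber event line; real sources and formats will differ.
SAMPLE_EVENT = "2024-05-01T12:34:56Z action=deny src=10.0.0.5 dst=203.0.113.9 dport=443"

# Named capture groups keep the extraction explicit and easy to validate.
EVENT_PATTERN = re.compile(
    r"^(?P<timestamp>\S+)\s+"
    r"action=(?P<action>\w+)\s+"
    r"src=(?P<src_ip>\d{1,3}(?:\.\d{1,3}){3})\s+"
    r"dst=(?P<dst_ip>\d{1,3}(?:\.\d{1,3}){3})\s+"
    r"dport=(?P<dst_port>\d+)$"
)

def extract_fields(line: str) -> dict | None:
    """Return the extracted fields as a dict, or None if the line does not match."""
    match = EVENT_PATTERN.match(line)
    return match.groupdict() if match else None

if __name__ == "__main__":
    print(extract_fields(SAMPLE_EVENT))
    # {'timestamp': '2024-05-01T12:34:56Z', 'action': 'deny',
    #  'src_ip': '10.0.0.5', 'dst_ip': '203.0.113.9', 'dst_port': '443'}
```

In practice this kind of extraction and validation would more likely live inside the dataflow product itself (for example Logstash or Cribl); the sketch only illustrates the named-group pattern.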
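Similarly, the second sketch below shows one way the extracted fields might be normalized to Elastic Common Schema field names (such as source.ip and event.action) before indexing. The raw-to-ECS mapping, the index name, and the Elasticsearch endpoint in the comments are assumptions made for illustration.

```python
from datetime import datetime, timezone

# Assumed mapping from the raw field names used in the extraction sketch above
# to Elastic Common Schema (ECS) field names.
ECS_FIELD_MAP = {
    "src_ip": "source.ip",
    "dst_ip": "destination.ip",
    "dst_port": "destination.port",
    "action": "event.action",
    "timestamp": "@timestamp",
}

def to_ecs(raw: dict) -> dict:
    """Rename raw keys to ECS field names and coerce obvious types."""
    doc = {ECS_FIELD_MAP.get(key, key): value for key, value in raw.items()}
    doc.setdefault("@timestamp", datetime.now(timezone.utc).isoformat())
    if "destination.port" in doc:
        doc["destination.port"] = int(doc["destination.port"])
    return doc

if __name__ == "__main__":
    raw_event = {
        "timestamp": "2024-05-01T12:34:56Z",
        "action": "deny",
        "src_ip": "10.0.0.5",
        "dst_ip": "203.0.113.9",
        "dst_port": "443",
    }
    print(to_ecs(raw_event))
    # Indexing could then use the official Elasticsearch Python client, e.g.
    # (placeholder endpoint and index name):
    #   from elasticsearch import Elasticsearch
    #   Elasticsearch("https://localhost:9200").index(
    #       index="cyber-events", document=to_ecs(raw_event))
```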


Required Skill Profession: Computer Occupations


