
Data Integration Engineer – Grizmo Labs Private Limited, Bengaluru




Job Description

Description:

Position: Data Integration Engineer
Experience: 2-5 years
Location: Bangalore
Industry Type: Logistics / Supply Chain Technology (assumed)
Education: Bachelor's degree in Computer Science or a related field

Job Summary:

We are seeking a proactive Data Integration Engineer with 2-5 years of hands-on experience in building and optimizing robust data pipelines.

The role requires strong proficiency in Python (including Pandas/NumPy), an excellent grasp of data structures and algorithms, and practical experience with ETL/data integration workflows.

The Engineer will be responsible for designing and managing data exchange across various formats (JSON/XML, CSV, EDI) and sources (REST APIs, relational/NoSQL databases), with mandatory experience in workflow orchestration using Apache Airflow.

Responsibilities:

Data Pipeline Design and Development (ETL):

- Design, build, and maintain robust ETL/data pipelines and data integration workflows to ensure the seamless, high-volume flow of data between internal and external systems (a minimal Pandas sketch appears after this listing).
- Demonstrate mandatory proficiency in Python, using related libraries such as Pandas and NumPy for complex data manipulation, cleansing, and transformation tasks.
- Apply a strong understanding of object-oriented programming (OOP) principles, data structures, and algorithms to develop scalable and efficient code.
- Ensure the quality and reliability of integrated data by implementing rigorous testing and error handling within all ETL processes.

Data Management and Integration Technologies:

- Possess comprehensive knowledge of both relational and non-relational databases, including expertise in data modeling and normalization techniques to optimize storage and retrieval efficiency.
- Work hands-on with various data formats, including structured files such as JSON and XML and flat-file formats such as CSV and EDI (Electronic Data Interchange), which are critical in the logistics domain.
- Develop and manage data retrieval mechanisms by working extensively with REST APIs to integrate external partner and client data sources (see the REST client sketch after this listing).
- Create, schedule, and manage complex workflows using Apache Airflow to orchestrate pipelines, manage task dependencies, and monitor job execution (see the DAG sketch after this listing).

Collaboration and System Optimization:

- Analyze data flow requirements and perform detailed impact analysis for changes to upstream and downstream systems.
- Collaborate effectively within teams, with the discipline to work independently when necessary to meet project deadlines.
- Contribute to system optimization efforts, focusing on improving the performance, resilience, and cost-effectiveness of the data integration infrastructure.

Required Skills & Qualifications:

- Experience: 2-5 years of experience in data engineering or integration (mandatory).
- Core Programming: Proficiency in Python and related libraries (Pandas and NumPy).
- Fundamentals: Strong understanding of object-oriented programming, data structures, and algorithms.
- Databases: Knowledge of relational and non-relational databases, data modeling, and normalization.
- Pipelines: Hands-on experience designing and building ETL/data pipelines or data integration workflows.
- Orchestration: Experience creating workflows using Apache Airflow.
- Formats/APIs: Experience with structured files (JSON/XML) and flat files (CSV, EDI), and with REST APIs.
- Education: Bachelor's degree in Computer Science or a related field.

Preferred Skills:

- Cloud: Exposure to AWS or other cloud platforms (e.g., Azure, GCP) and cloud-native data services.
- Client Engagement: Prior client-facing experience in requirement gathering and technical discussions.
- Domain: Familiarity with data standards and integration challenges in the logistics or supply-chain domain.

(ref:hirist.tech)
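To make the ETL duties concrete, here is a minimal sketch of the kind of Pandas/NumPy pipeline the listing describes. The file names, column names, and cleansing rules are illustrative assumptions, not the employer's actual workflow:

    # Minimal ETL sketch in Python with Pandas/NumPy. All file and column
    # names (shipments.csv, shipment_id, weight_kg, carrier) are
    # hypothetical examples, not taken from the posting.
    import numpy as np
    import pandas as pd

    def extract(path):
        # Extract: load a raw flat file (CSV) from a partner system.
        return pd.read_csv(path, parse_dates=["shipped_at"])

    def transform(df):
        # Transform: cleanse and standardize before loading.
        df = df.dropna(subset=["shipment_id"])            # drop unusable rows
        df["weight_kg"] = np.where(df["weight_kg"] < 0,   # mask bad readings
                                   np.nan, df["weight_kg"])
        df["carrier"] = df["carrier"].str.strip().str.upper()
        return df

    def load(df, path):
        # Load: emit JSON records for a downstream consumer.
        df.to_json(path, orient="records", date_format="iso")

    load(transform(extract("shipments.csv")), "shipments_clean.json")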
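The REST API work described above typically amounts to building a resilient HTTP client. A sketch using the requests library, with retries and error handling in the spirit of the listing's "rigorous error handling" point; the endpoint URL and query parameters are assumptions for illustration:

    # Hypothetical REST client sketch: retries with backoff cover
    # transient partner-side failures (429/5xx responses).
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    retry = Retry(total=3, backoff_factor=1,
                  status_forcelist=[429, 500, 502, 503, 504])
    session.mount("https://", HTTPAdapter(max_retries=retry))

    resp = session.get(
        "https://api.partner.example/v1/shipments",   # hypothetical endpoint
        params={"updated_since": "2024-01-01"},
        timeout=30,
    )
    resp.raise_for_status()   # surface non-2xx responses as errors
    records = resp.json()     # JSON payload ready for transformation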
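Finally, the Airflow requirement usually means defining DAGs with explicit schedules, task dependencies, and retry policies. A minimal sketch assuming Airflow 2.4+; the DAG id, schedule, and task bodies are hypothetical placeholders:

    # Minimal Airflow DAG sketch (an assumption, not the employer's code).
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Would pull data from the partner REST API here.
        pass

    def transform():
        # Would cleanse/reshape the extracted payload here.
        pass

    def load():
        # Would write the transformed records to the target store here.
        pass

    with DAG(
        dag_id="partner_shipments_etl",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 2,       # retry transient failures
                      "retry_delay": timedelta(minutes=5)},
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform",
                                     python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load   # task dependencies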


Required Skill Profession

Computer Occupations




