Job Description
Key Responsibilities:

- Design, develop, and implement ETL workflows and data pipelines using Informatica IICS/IDMC.
- Perform data extraction, transformation, and loading (ETL) from multiple source systems into data warehouses (e.g., Snowflake, Teradata, Oracle).
- Lead data migration and integration initiatives, ensuring scalability, performance, and data integrity.
- Collaborate with data architects, analysts, and business users to translate requirements into technical specifications.
- Develop and maintain complex SQL queries, stored procedures, and data validation scripts.
- Apply data warehousing concepts such as star schema, dimensional modeling, and ETL best practices.
- Conduct performance tuning, error handling, and troubleshooting across Informatica and database layers.
- Work closely with DevOps and Cloud teams to support deployment and automation processes (CI/CD).
- Participate in code reviews and documentation, and ensure compliance with enterprise data governance standards.
- Stay current with emerging technologies and best practices in cloud data integration and modern ETL frameworks.

Qualifications & Skills

Required:

- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- 5+ years of hands-on ETL/Informatica IICS/IDMC development experience.
- Strong SQL proficiency and experience with relational databases (Oracle, Teradata, Snowflake).
- Demonstrated experience in data migration to warehouses (e.g., Snowflake).
- Solid understanding of data warehousing, star schema, and modern ETL architecture patterns.
- Excellent problem-solving, performance-tuning, and troubleshooting skills.
- Strong analytical, communication, and stakeholder collaboration abilities.

Preferred / Nice-to-Have:

- Familiarity with cloud platforms: AWS, Azure, or GCP.
- Experience with scheduling tools (e.g., Control-M), CI/CD, scripting (Python, Unix shell), REST APIs, and JSON/XML.
- Familiarity with data governance, metadata management (e.g., EDC, Axon), or data quality tools.
- Prior background in the insurance, banking, or financial services domain is a plus.

(ref:hirist.tech)