Job Overview
Category: Computer Occupations
Job Description
Key Responsibilities:

- Lead the design, development, and optimization of end-to-end data pipelines and ETL workflows.
- Architect data integration solutions leveraging Snowflake, Informatica PowerCenter, and Teradata.
- Collaborate with business and technical teams to gather requirements and translate them into technical designs.
- Drive performance tuning, error handling, and automation across ETL and data warehouse layers.
- Provide technical leadership to developers, mentor junior engineers, and enforce best practices.
- Ensure data quality, consistency, and security across data environments.
- Support migration and modernization initiatives from legacy systems to cloud-based data platforms.
- Participate in Agile/Scrum ceremonies, sprint planning, and code reviews.

Required Skills & Experience:

Core Expertise (Must-Have):

- Snowflake (3+ years): Hands-on experience with SnowSQL, Time Travel, cloning, query profiling and optimization, and secure Data Sharing. Able to design schemas and implement performance-tuned solutions.
- Informatica PowerCenter (3+ years): Proficient with Designer, Workflow Manager/Monitor, mappings and transformations, session/workflow tuning, error handling, and deployments.
- Teradata (3+ years): Strong SQL skills, including BTEQ scripting, stored procedures, performance-tuned joins, indexing/collect stats, and utilities such as FastLoad, MultiLoad, and TPT.
- SQL & Data Modeling: Demonstrated ability to design normalized and dimensional models (3NF, Star, Snowflake) and write complex, performance-oriented SQL for analytics.

Required Supporting Skills:

- Shell scripting (Bash/Ksh): Practical experience automating ETL jobs, log handling, and job orchestration (typically 2+ years).
- Python for data engineering (typically 2+ years): Proficient in writing production-quality Python scripts for ETL and data processing. Familiar with commonly used libraries (e.g., pandas, SQLAlchemy), exception handling, basic unit testing, and performance considerations (see the illustrative sketch after this list).

Preferred / Nice-to-Have:

- Experience with CI/CD and version control: Git (branching strategies) and building/maintaining pipelines (Jenkins, GitLab CI, Azure DevOps).
- Familiarity with cloud data platforms and migrations (AWS/GCP/Azure) and cloud-native storage/compute services.
- Experience with orchestration tools (Apache Airflow, Control-M) and monitoring/alerting solutions.
- Prior experience working in Agile/Scrum teams, performing code reviews, and mentoring team members.
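For illustration only, here is a minimal sketch of the kind of production-style Python ETL work this role describes, using the pandas and SQLAlchemy libraries named above. The file name, table name, and connection URL are hypothetical placeholders, not details from this posting.

    # Illustrative ETL sketch: extract from a CSV, apply a simple transform,
    # load into a staging table. All names below are hypothetical examples.
    import pandas as pd
    from sqlalchemy import create_engine

    SOURCE_CSV = "daily_orders.csv"               # hypothetical source file
    TARGET_DB_URL = "sqlite:///warehouse.db"      # stand-in for a real warehouse URL

    def run_etl() -> int:
        """Run one extract-transform-load pass and return the row count loaded."""
        # Extract: read the source file, parsing the date column
        df = pd.read_csv(SOURCE_CSV, parse_dates=["order_date"])

        # Transform: basic cleansing plus a derived month column
        df = df.dropna(subset=["order_id"])
        df["order_month"] = df["order_date"].dt.to_period("M").astype(str)

        # Load: write to a staging table inside a transaction
        engine = create_engine(TARGET_DB_URL)
        with engine.begin() as conn:
            df.to_sql("orders_stg", conn, if_exists="replace", index=False)
        return len(df)

    if __name__ == "__main__":
        try:
            rows = run_etl()
            print(f"Loaded {rows} rows into orders_stg")
        except FileNotFoundError as exc:
            # Basic error handling, of the sort the posting calls for
            print(f"ETL failed: {exc}")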
PibyThree is actively hiring for this Technical Lead - Python/ETL position.