Responsibilities include:
- Integrate machine learning workflows with data pipelines and analytics tools.
- Define data governance frameworks and manage data lineage.
- Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.
- Optimize cloud infrastructure for scalability, performance, and reliability.
- Mentor junior team members and ensure adherence to architectural standards.
- Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager).
- Ensure high availability and disaster recovery are built into data systems.
- Conduct technical reviews, audits, and performance tuning for data solutions.
- Design solutions for multi-region and multi-cloud data architectures.
- Stay updated on emerging technologies and trends in data engineering and GCP.
- Drive innovation in data architecture, recommending new tools and services on GCP.
 
Primary Skills:
- 7+ years of experience in data architecture, with at least 3 years in GCP environments.
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
- Strong experience with data warehousing, data lakes, and real-time data pipelines.
- Proficiency in SQL, Python, or other data processing languages.
- Experience with cloud security, data governance, and compliance frameworks.
- Strong problem-solving skills and the ability to architect solutions for complex data environments.
- Google Cloud certification (Professional Data Engineer or Professional Cloud Architect) preferred.
- Leadership experience and the ability to mentor technical teams.
- Excellent communication and collaboration skills.
 
Skills Required:
Data Modeling, SQL, Python, Performance Tuning, Machine Learning, Data Architecture, Analytics